WorldWideScience

Sample records for quantifying aggregated uncertainty

  1. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    Science.gov (United States)

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
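
    The need for joint simulation, as opposed to per-pixel marginal prediction, can be illustrated with a small sketch. The toy example below (not the authors' approximating algorithm; the one-dimensional grid, the exponential covariance and the logit-prevalence link are all assumptions) draws correlated realisations of a prevalence field and shows how much the uncertainty of the regional mean is understated when pixels are treated as independent.

    ```python
    # Illustrative sketch: joint vs. independent simulation of a prevalence field
    # and the resulting uncertainty of a spatially aggregated (regional mean) value.
    import numpy as np

    rng = np.random.default_rng(0)

    x = np.linspace(0.0, 100.0, 200)            # pixel centres (km), assumed grid
    range_km, sill = 25.0, 1.0                  # assumed covariance parameters

    # Exponential covariance between all pixel pairs defines the joint model
    d = np.abs(x[:, None] - x[None, :])
    cov = sill * np.exp(-d / range_km)

    # Joint simulation: correlated Gaussian fields mapped to prevalence via a logit link
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))
    n_sim = 2000
    z = rng.standard_normal((n_sim, len(x))) @ L.T          # joint realisations
    prevalence = 1.0 / (1.0 + np.exp(-(-1.0 + z)))          # assumed mean logit of -1

    regional_mean = prevalence.mean(axis=1)                 # aggregated quantity
    print("mean prevalence:", regional_mean.mean().round(3))
    print("95% interval   :", np.percentile(regional_mean, [2.5, 97.5]).round(3))

    # Treating pixels as independent (marginal uncertainty only) understates
    # the uncertainty of the aggregate:
    z_ind = rng.standard_normal((n_sim, len(x)))
    indep_mean = (1.0 / (1.0 + np.exp(-(-1.0 + z_ind)))).mean(axis=1)
    print("interval if pixels were independent:",
          np.percentile(indep_mean, [2.5, 97.5]).round(3))
    ```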

  2. Hump-shape Uncertainty, Agency Costs and Aggregate Fluctuations

    OpenAIRE

    Lee, Gabriel; Kevin, Salyer; Strobel, Johannes

    2016-01-01

    Previously measured uncertainty shocks using U.S. data show a hump-shaped time path: uncertainty rises for two years before its decline. The current literature on the effects of uncertainty on macroeconomics, including housing, has not accounted for this observation. Consequently, the literature on uncertainty and macroeconomics is divided on the effects and the propagation mechanism of uncertainty on aggregate fluctuations. This paper shows that when uncertainty rises and falls over time, th...

  3. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
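
    A minimal numerical sketch of the contrast drawn in the abstract, under an assumed kinship matrix and simulated phenotypes (this is not the authors' efficient implementation): the frequentist interval comes from the curvature of the profile log-likelihood, while the Bayesian interval comes from a grid posterior over heritability with the scale parameter integrated out under a 1/sigma^2 prior.

    ```python
    # Sketch: frequentist vs. Bayesian uncertainty for heritability h2 in the model
    # y ~ N(0, sigma^2 * (h2*K + (1-h2)*I)), with an assumed kinship matrix K.
    import numpy as np

    rng = np.random.default_rng(1)
    n, h2_true = 500, 0.4

    # Assumed kinship: PSD matrix with unit average diagonal (illustrative only)
    G = rng.standard_normal((n, 100)) / np.sqrt(100)
    K = G @ G.T
    K /= np.diag(K).mean()

    s, U = np.linalg.eigh(K)                       # rotate to diagonalise K
    V_true = h2_true * K + (1 - h2_true) * np.eye(n)
    y = np.linalg.cholesky(V_true) @ rng.standard_normal(n)
    yr = U.T @ y

    h2_grid = np.linspace(1e-3, 0.999, 400)
    dh = h2_grid[1] - h2_grid[0]

    def profile_loglik(h2):
        v = h2 * s + (1 - h2)                      # eigenvalues of h2*K + (1-h2)*I
        sigma2 = np.mean(yr**2 / v)                # profiled-out scale
        return -0.5 * (np.sum(np.log(v)) + n * np.log(sigma2) + n)

    ll = np.array([profile_loglik(h) for h in h2_grid])

    # Frequentist: MLE plus normal approximation from the profile curvature
    i = ll.argmax()
    curv = (ll[i - 1] - 2 * ll[i] + ll[i + 1]) / dh**2
    se = 1.0 / np.sqrt(-curv)
    print(f"MLE h2 = {h2_grid[i]:.3f} +/- {1.96 * se:.3f} (asymptotic 95% CI)")

    # Bayesian: flat prior on h2, sigma^2 integrated out analytically
    log_post = np.array([
        -0.5 * np.sum(np.log(h * s + (1 - h)))
        - 0.5 * n * np.log(np.sum(yr**2 / (h * s + (1 - h))))
        for h in h2_grid])
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * dh
    cdf = np.cumsum(post) * dh
    lo, hi = h2_grid[np.searchsorted(cdf, 0.025)], h2_grid[np.searchsorted(cdf, 0.975)]
    print(f"Posterior 95% credible interval: [{lo:.3f}, {hi:.3f}]")
    ```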

  4. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
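
    The following sketch illustrates the window-selection idea on synthetic I-V points near short circuit. It is a simplified stand-in: the paper uses objective Bayesian evidence, whereas here a BIC comparison between a straight-line and a quadratic fit flags model discrepancy, and the data, noise level and candidate window sizes are assumptions.

    ```python
    # Sketch: pick the widest window of I-V points near V = 0 in which a straight
    # line shows no detectable curvature, then report Isc with its uncertainty.
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic I-V points near short circuit: gentle curvature plus noise (assumed)
    v = np.linspace(0.0, 0.3, 31)
    i_meas = 5.0 - 0.4 * v - 2.0 * v**2 + rng.normal(0.0, 0.002, v.size)

    def fit(v, i, degree):
        """Least-squares polynomial fit; returns coefficients, design matrix, RSS, BIC."""
        X = np.vander(v, degree + 1, increasing=True)
        coef = np.linalg.lstsq(X, i, rcond=None)[0]
        rss = float(np.sum((i - X @ coef) ** 2))
        n, k = len(i), degree + 1
        bic = n * np.log(rss / n) + k * np.log(n)
        return coef, X, rss, bic

    best = 6                                      # fallback: smallest candidate window
    for n_pts in range(6, v.size + 1):            # candidate windows starting at V = 0
        _, _, _, bic_line = fit(v[:n_pts], i_meas[:n_pts], 1)
        _, _, _, bic_quad = fit(v[:n_pts], i_meas[:n_pts], 2)
        if bic_line <= bic_quad:                  # no detectable model discrepancy
            best = n_pts

    coef, X, rss, _ = fit(v[:best], i_meas[:best], 1)
    sigma2 = rss / (best - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    print(f"window: {best} points, Isc = {coef[0]:.4f} A "
          f"+/- {np.sqrt(cov[0, 0]):.4f} A (1-sigma)")
    ```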

  5. Macro Expectations, Aggregate Uncertainty, and Expected Term Premia

    DEFF Research Database (Denmark)

    Dick, Christian D.; Schmeling, Maik; Schrimpf, Andreas

    2013-01-01

    as well as aggregate macroeconomic uncertainty at the level of individual forecasters. We find that expected term premia are (i) time-varying and reasonably persistent, (ii) strongly related to expectations about future output growth, and (iii) positively affected by uncertainty about future output growth...... and inflation rates. Expectations about real macroeconomic variables seem to matter more than expectations about nominal factors. Additional findings on term structure factors suggest that the level and slope factor capture information related to uncertainty about real and nominal macroeconomic prospects...

  6. Aggregate Uncertainty, Money and Banking

    OpenAIRE

    Hongfei Sun

    2006-01-01

    This paper studies the problem of monitoring the monitor in a model of money and banking with aggregate uncertainty. It shows that when inside money is required as a means of bank loan repayment, a market of inside money is entailed at the repayment stage and generates information-revealing prices that perfectly discipline the bank. The incentive problem of a bank is costlessly overcome simply by involving inside money in repayment. Inside money distinguishes itself from outside money by its ...

  7. A Novel Method to Quantify Soil Aggregate Stability by Measuring Aggregate Bond Energies

    Science.gov (United States)

    Efrat, Rachel; Rawlins, Barry G.; Quinton, John N.; Watts, Chris W.; Whitmore, Andy P.

    2016-04-01

    Soil aggregate stability is a key indicator of soil quality because it controls physical, biological and chemical functions important in cultivated soils. Micro-aggregates are responsible for the long-term sequestration of carbon in soil and therefore determine soil's role in the carbon cycle. It is thus vital that techniques to measure aggregate stability are accurate, consistent and reliable, in order to appropriately manage and monitor soil quality, and to develop our understanding and estimates of soil as a carbon store for appropriate incorporation in carbon cycle models. Practices used to assess the stability of aggregates vary in sample preparation, operational technique and unit of results. They use proxies and lack quantification. Conflicting results are therefore drawn between projects that do not provide methodological or resultant comparability. Typical modern stability tests suspend aggregates in water and monitor fragmentation upon exposure to an un-quantified amount of ultrasonic energy, utilising a laser granulometer to measure the change in mean weight diameter. In this project a novel approach has been developed, based on that of Zhu et al. (2009), to accurately quantify the stability of aggregates by specifically measuring their bond energies. The bond energies are measured using a combination of calorimetry and a high-powered ultrasonic probe with a computable output function. Temperature change during sonication is monitored by an array of probes, which enables calculation of the energy spent heating the system (Ph). Our novel technique suspends aggregates in heavy liquid lithium heteropolytungstate, as opposed to water, to avoid exposing aggregates to an immeasurable disruptive energy source, due to cavitation, collisions and clay swelling. Mean weight diameter is measured by a laser granulometer to monitor aggregate breakdown after successive periods of calculated ultrasonic energy input (Pi), until complete dispersion is achieved and bond

  8. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  9. Quantifying uncertainty in nuclear analytical measurements

    International Nuclear Information System (INIS)

    2004-07-01

    The lack of international consensus on the expression of uncertainty in measurements was recognised by the late 1970s and led, after the issuance of a series of rather generic recommendations, to the publication of a general guide, known as the GUM, the Guide to the Expression of Uncertainty in Measurement. This publication, issued in 1993, was based on co-operation over several years by the Bureau International des Poids et Mesures, the International Electrotechnical Commission, the International Federation of Clinical Chemistry, the International Organization for Standardization (ISO), the International Union of Pure and Applied Chemistry, the International Union of Pure and Applied Physics and the Organisation internationale de metrologie legale. The purpose was to promote full information on how uncertainty statements are arrived at and to provide a basis for harmonized reporting and the international comparison of measurement results. The need to provide more specific guidance to different measurement disciplines was soon recognized and the field of analytical chemistry was addressed by EURACHEM in 1995 in the first edition of a guidance report on Quantifying Uncertainty in Analytical Measurements, produced by a group of experts from the field. That publication translated the general concepts of the GUM into specific applications for analytical laboratories and illustrated the principles with a series of selected examples as a didactic tool. Based on feedback from actual practice, the EURACHEM publication was extensively reviewed in 1997-1999 under the auspices of the Co-operation on International Traceability in Analytical Chemistry (CITAC), and a second edition was published in 2000. Still, except for a single example on the measurement of radioactivity in GUM, the field of nuclear and radiochemical measurements was not covered. The explicit requirement of ISO standard 17025:1999, General Requirements for the Competence of Testing and Calibration

  10. Three-dimensional laser scanning technique to quantify aggregate and ballast shape properties

    CSIR Research Space (South Africa)

    Anochie-Boateng, Joseph

    2013-06-01

    Full Text Available methods towards more accurate and automated techniques to quantify aggregate shape properties. This paper validates a new flakiness index equation using three-dimensional (3-D) laser scanning data of aggregate and ballast materials obtained from...

  11. GRAPH THEORY APPROACH TO QUANTIFY UNCERTAINTY OF PERFORMANCE MEASURES

    Directory of Open Access Journals (Sweden)

    Sérgio D. Sousa

    2015-03-01

    Full Text Available In this work, the performance measurement process is studied to quantify the uncertainty induced in the resulting performance measure (PM). To that end, the causes of uncertainty are identified, analysing the activities undertaken in the three following stages of the performance measurement process: design and implementation, data collection and recording, and determination and analysis. A quantitative methodology based on graph theory and on the sources of uncertainty of the performance measurement process is used to calculate an uncertainty index to evaluate the level of uncertainty of a given PM (or key performance indicator). An application example is presented. The quantification of PM uncertainty could help to better represent the risk associated with a given decision and also to improve the PM, increasing its precision and reliability.
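
    A hedged sketch of a graph-theoretic index of this kind (the matrix-permanent construction is a common choice in graph-theoretic assessment methods and is an assumption here, as are all the scores): nodes carry per-stage uncertainty contributions, edges carry their interdependencies, and the permanent condenses the digraph into a single index.

    ```python
    # Sketch: digraph of uncertainty sources -> matrix -> permanent-based index.
    from itertools import permutations
    import numpy as np

    def permanent(M):
        """Matrix permanent by direct expansion (fine for small matrices)."""
        n = M.shape[0]
        return sum(np.prod([M[i, p[i]] for i in range(n)])
                   for p in permutations(range(n)))

    # Three stages of the performance-measurement process as uncertainty sources:
    # design/implementation, data collection/recording, determination/analysis.
    own = np.diag([3.0, 4.0, 2.0])                 # assumed per-stage uncertainty scores
    dep = np.array([[0.0, 0.6, 0.3],               # assumed stage interdependencies
                    [0.2, 0.0, 0.7],
                    [0.1, 0.4, 0.0]])

    index = permanent(own + dep)
    worst = permanent(np.diag([5.0, 5.0, 5.0]) + dep)   # same structure, maximum scores
    print(f"uncertainty index = {index:.2f} (normalised: {index / worst:.2f})")
    ```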

  12. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis is proposed to evaluate epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study reliabilities of engineering systems. Those new approaches apply expert judgments to overcome the limitation of the conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments might come with epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis to overcome the limitation of fuzzy fault tree analysis. To demonstrate the applicability of the proposed approach, a case study is performed and its results are then compared to the results analyzed by a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible to propagate and quantify epistemic uncertainties in fault tree analysis.
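
    A minimal sketch of the gate rules named in the highlights, using alpha-cut interval arithmetic on triangular fuzzy probabilities. The small example tree and all numbers are illustrative assumptions, not the Combustion Engineering RPS case study.

    ```python
    # Sketch: fuzzy multiplication (AND) and fuzzy complement (OR) rules applied to
    # triangular fuzzy basic-event probabilities via alpha-cut intervals.
    import numpy as np

    def alpha_cut(tri, alpha):
        """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
        a, m, b = tri
        return np.array([a + alpha * (m - a), b - alpha * (b - m)])

    def and_gate(cuts):
        """Fuzzy multiplication rule: product of (nonnegative) probability intervals."""
        return np.array([np.prod([c[0] for c in cuts]),
                         np.prod([c[1] for c in cuts])])

    def or_gate(cuts):
        """Fuzzy complement rule: 1 - prod(1 - p), monotone in every argument."""
        return np.array([1.0 - np.prod([1.0 - c[0] for c in cuts]),
                         1.0 - np.prod([1.0 - c[1] for c in cuts])])

    # Assumed basic events (triangular fuzzy failure probabilities)
    be1 = (1e-3, 2e-3, 4e-3)
    be2 = (5e-4, 1e-3, 2e-3)
    be3 = (1e-4, 3e-4, 9e-4)

    # Top event = (BE1 AND BE2) OR BE3, evaluated over a sweep of alpha levels
    for alpha in (0.0, 0.5, 1.0):
        cut12 = and_gate([alpha_cut(be1, alpha), alpha_cut(be2, alpha)])
        top = or_gate([cut12, alpha_cut(be3, alpha)])
        print(f"alpha={alpha:.1f}: top-event probability in "
              f"[{top[0]:.2e}, {top[1]:.2e}]")
    ```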

  13. Quantifying uncertainties in the structural response of SSME blades

    Science.gov (United States)

    Nagpal, Vinod K.

    1987-01-01

    To quantify the uncertainties associated with the geometry and material properties of a Space Shuttle Main Engine (SSME) turbopump blade, a computer code known as STAEBL was used. A finite element model of the blade used 80 triangular shell elements with 55 nodes and five degrees of freedom per node. The whole study was simulated on the computer and no real experiments were conducted. The structural response has been evaluated in terms of three variables which are natural frequencies, root (maximum) stress, and blade tip displacements. The results of the study indicate that only the geometric uncertainties have significant effects on the response. Uncertainties in material properties have insignificant effects.

  14. Quantifying the interplay between environmental and social effects on aggregated-fish dynamics.

    Directory of Open Access Journals (Sweden)

    Manuela Capello

    Full Text Available Demonstrating and quantifying the respective roles of social interactions and external stimuli governing fish dynamics is key to understanding fish spatial distribution. While seminal studies have contributed to our understanding of fish spatial organization in schools, little experimental information is available on fish in their natural environment, where aggregations often occur in the presence of spatial heterogeneities. Here, we applied novel modeling approaches coupled to accurate acoustic tracking for studying the dynamics of a group of gregarious fish in a heterogeneous environment. To this purpose, we acoustically tracked with submeter resolution the positions of twelve small pelagic fish (Selar crumenophthalmus) in the presence of an anchored floating object, constituting a point of attraction for several fish species. We constructed a field-based model for aggregated-fish dynamics, deriving effective interactions for both social and external stimuli from experiments. We tuned the model parameters that best fit the experimental data and quantified the importance of social interactions in the aggregation, providing an explanation for the spatial structure of fish aggregations found around floating objects. Our results can be generalized to other gregarious species and contexts as long as it is possible to observe the fine-scale movements of a subset of individuals.

  15. Quantifying the measurement uncertainty of results from environmental analytical methods.

    Science.gov (United States)

    Moser, J; Wegscheider, W; Sperka-Gottlieb, C

    2001-07-01

    The Eurachem-CITAC Guide Quantifying Uncertainty in Analytical Measurement was put into practice in a public laboratory devoted to environmental analytical measurements. In doing so due regard was given to the provisions of ISO 17025 and an attempt was made to base the entire estimation of measurement uncertainty on available data from the literature or from previously performed validation studies. Most environmental analytical procedures laid down in national or international standards are the result of cooperative efforts and put into effect as part of a compromise between all parties involved, public and private, that also encompasses environmental standards and statutory limits. Central to many procedures is the focus on the measurement of environmental effects rather than on individual chemical species. In this situation it is particularly important to understand the measurement process well enough to produce a realistic uncertainty statement. Environmental analytical methods will be examined as far as necessary, but reference will also be made to analytical methods in general and to physical measurement methods where appropriate. This paper describes ways and means of quantifying uncertainty for frequently practised methods of environmental analysis. It will be shown that operationally defined measurands are no obstacle to the estimation process as described in the Eurachem/CITAC Guide if it is accepted that the dominating component of uncertainty comes from the actual practice of the method as a reproducibility standard deviation.

  16. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)

  17. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  18. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    Science.gov (United States)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  19. Quantifying Uncertainty in Satellite-Retrieved Land Surface Temperature from Cloud Detection Errors

    Directory of Open Access Journals (Sweden)

    Claire E. Bulgin

    2018-04-01

    Full Text Available Clouds remain one of the largest sources of uncertainty in remote sensing of surface temperature in the infrared, but this uncertainty has not generally been quantified. We present a new approach to do so, applied here to the Advanced Along-Track Scanning Radiometer (AATSR). We use an ensemble of cloud masks based on independent methodologies to investigate the magnitude of cloud detection uncertainties in area-average Land Surface Temperature (LST) retrieval. We find that at a grid resolution of 625 km² (commensurate with a 0.25° grid size at the tropics), cloud detection uncertainties are positively correlated with cloud-cover fraction in the cell and are larger during the day than at night. Daytime cloud detection uncertainties range between 2.5 K for clear-sky fractions of 10–20% and 1.03 K for clear-sky fractions of 90–100%. Corresponding night-time uncertainties are 1.6 K and 0.38 K, respectively. Cloud detection uncertainty shows a weaker positive correlation with the number of biomes present within a grid cell, used as a measure of heterogeneity in the background against which the cloud detection must operate (e.g., surface temperature, emissivity and reflectance). Uncertainty due to cloud detection errors is strongly dependent on the dominant land cover classification. We find cloud detection uncertainties of a magnitude of 1.95 K over permanent snow and ice, 1.2 K over open forest, 0.9–1 K over bare soils and 0.09 K over mosaic cropland, for a standardised clear-sky fraction of 74.2%. As the uncertainties arising from cloud detection errors are of a significant magnitude for many surface types and spatially heterogeneous where land classification varies rapidly, LST data producers are encouraged to quantify cloud-related uncertainties in gridded products.
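
    The ensemble-of-masks idea can be sketched as follows: the area-average LST of a grid cell is recomputed under each independent cloud mask, and the spread of those averages is taken as the cloud-detection component of uncertainty. The synthetic LST field and masks below are assumptions, not AATSR data.

    ```python
    # Sketch: cloud-detection uncertainty of a cell-mean LST from an ensemble of masks.
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic 25 km x 25 km cell of 1 km pixels: warm clear surface, cooler clouds
    lst = 300.0 + rng.normal(0.0, 1.5, (25, 25))
    true_cloud = rng.random((25, 25)) < 0.4          # 40% cloud cover
    lst[true_cloud] -= rng.uniform(5.0, 15.0, true_cloud.sum())

    # Ensemble of imperfect, independent cloud masks (each flips ~5% of pixels)
    n_masks = 8
    masks = [true_cloud ^ (rng.random((25, 25)) < 0.05) for _ in range(n_masks)]

    cell_means = np.array([lst[~m].mean() for m in masks])   # clear-sky mean per mask
    print(f"clear-sky fraction  : {1 - true_cloud.mean():.2f}")
    print(f"cell-mean LST spread: {cell_means.std(ddof=1):.2f} K "
          f"(cloud-detection uncertainty estimate)")
    ```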

  20. A Bayesian statistical method for quantifying model form uncertainty and two model combination methods

    International Nuclear Information System (INIS)

    Park, Inseok; Grandhi, Ramana V.

    2014-01-01

    Apart from parametric uncertainty, model form uncertainty as well as prediction error may be involved in the analysis of engineering systems. Model form uncertainty, inherently existing in selecting the best approximation from a model set, cannot be ignored, especially when the predictions by competing models show significant differences. In this research, a methodology based on maximum likelihood estimation is presented to quantify model form uncertainty using the measured differences of experimental and model outcomes, and is compared with a fully Bayesian estimation to demonstrate its effectiveness. While a method called the adjustment factor approach is utilized to propagate model form uncertainty alone into the prediction of a system response, a method called model averaging is utilized to incorporate both model form uncertainty and prediction error into it. A numerical problem of concrete creep is used to demonstrate the processes for quantifying model form uncertainty and implementing the adjustment factor approach and model averaging. Finally, the presented methodology is applied to characterize the engineering benefits of a laser peening process.
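
    The two combination methods named in the abstract can be summarised numerically as follows, with assumed model predictions, prediction-error standard deviations and model probabilities: the additive adjustment factor approach propagates model form uncertainty about the best model, while model averaging also folds in each model's prediction error.

    ```python
    # Sketch: combining predictions from competing models under model form uncertainty.
    import numpy as np

    mu = np.array([12.0, 14.5, 13.2])      # assumed predictions of competing models
    sig = np.array([0.8, 1.1, 0.9])        # assumed prediction-error std devs
    w = np.array([0.5, 0.2, 0.3])          # assumed model probabilities (sum to 1)

    # Adjustment factor approach: best model plus an additive model-form adjustment
    best = np.argmax(w)
    mean_af = mu[best] + np.sum(w * (mu - mu[best]))
    var_af = np.sum(w * (mu - np.sum(w * mu)) ** 2)          # model form only
    print(f"adjustment factor : {mean_af:.2f} +/- {np.sqrt(var_af):.2f}")

    # Model averaging: model form uncertainty plus each model's prediction error
    mean_ma = np.sum(w * mu)
    var_ma = np.sum(w * (sig**2 + (mu - mean_ma) ** 2))
    print(f"model averaging   : {mean_ma:.2f} +/- {np.sqrt(var_ma):.2f}")
    ```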

  1. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Science.gov (United States)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  2. Quantifying uncertainty and resilience on coral reefs using a Bayesian approach

    International Nuclear Information System (INIS)

    Van Woesik, R

    2013-01-01

    Coral reefs are rapidly deteriorating globally. The contemporary management option favors managing for resilience to provide reefs with the capacity to tolerate human-induced disturbances. Yet resilience is most commonly defined as the capacity of a system to absorb disturbances without changing fundamental processes or functionality. Quantifying no change, or the uncertainty of a null hypothesis, is nonsensical using frequentist statistics, but is achievable using a Bayesian approach. This study outlines a practical Bayesian framework that quantifies the resilience of coral reefs using two inter-related models. The first model examines the functionality of coral reefs in the context of their reef-building capacity, whereas the second model examines the recovery rates of coral cover after disturbances. Quantifying intrinsic rates of increase in coral cover and habitat-specific, steady-state equilibria are useful proxies of resilience. A reduction in the intrinsic rate of increase following a disturbance, or the slowing of recovery over time, can be useful indicators of stress; a change in the steady-state equilibrium suggests a phase shift. Quantifying the uncertainty of key reef-building processes and recovery parameters, and comparing these parameters against benchmarks, facilitates the detection of loss of resilience and provides signals of imminent change. (letter)
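
    A grid-based sketch of the kind of Bayesian recovery model described (not the paper's actual model; the observations, priors and noise level are assumptions): a logistic recovery curve is fitted to post-disturbance coral-cover data, giving posterior uncertainty for the intrinsic rate of increase r and the steady-state cover K used as resilience proxies.

    ```python
    # Sketch: grid posterior over (r, K) for logistic recovery of coral cover.
    import numpy as np

    rng = np.random.default_rng(4)

    t = np.arange(0, 11)                                   # years since disturbance
    cover_true = 40.0 / (1.0 + ((40.0 - 5.0) / 5.0) * np.exp(-0.45 * t))
    cover_obs = cover_true + rng.normal(0.0, 2.0, t.size)  # % coral cover (assumed data)
    c0, sigma = 5.0, 2.0                                   # assumed initial cover, noise

    r_grid = np.linspace(0.05, 1.0, 200)
    K_grid = np.linspace(10.0, 80.0, 200)
    R, K = np.meshgrid(r_grid, K_grid, indexing="ij")

    # Log-likelihood of the logistic solution on the (r, K) grid, flat priors
    pred = K[..., None] / (1.0 + ((K[..., None] - c0) / c0)
                           * np.exp(-R[..., None] * t))
    loglik = -0.5 * np.sum((cover_obs - pred) ** 2, axis=-1) / sigma**2
    post = np.exp(loglik - loglik.max())
    post /= post.sum()

    # Marginal posterior summaries for the two resilience proxies
    r_marg = post.sum(axis=1)
    K_marg = post.sum(axis=0)
    print(f"r: posterior mean {np.sum(r_grid * r_marg):.2f} per year, "
          f"K: posterior mean {np.sum(K_grid * K_marg):.1f} % cover")
    ```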

  3. Quantifying uncertainty and resilience on coral reefs using a Bayesian approach

    Science.gov (United States)

    van Woesik, R.

    2013-12-01

    Coral reefs are rapidly deteriorating globally. The contemporary management option favors managing for resilience to provide reefs with the capacity to tolerate human-induced disturbances. Yet resilience is most commonly defined as the capacity of a system to absorb disturbances without changing fundamental processes or functionality. Quantifying no change, or the uncertainty of a null hypothesis, is nonsensical using frequentist statistics, but is achievable using a Bayesian approach. This study outlines a practical Bayesian framework that quantifies the resilience of coral reefs using two inter-related models. The first model examines the functionality of coral reefs in the context of their reef-building capacity, whereas the second model examines the recovery rates of coral cover after disturbances. Quantifying intrinsic rates of increase in coral cover and habitat-specific, steady-state equilibria are useful proxies of resilience. A reduction in the intrinsic rate of increase following a disturbance, or the slowing of recovery over time, can be useful indicators of stress; a change in the steady-state equilibrium suggests a phase shift. Quantifying the uncertainty of key reef-building processes and recovery parameters, and comparing these parameters against benchmarks, facilitates the detection of loss of resilience and provides signals of imminent change.

  4. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space.

    Science.gov (United States)

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria

    2012-01-01

    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to-date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation (i.e. brain conductivity distributions) and stimulation (e.g. coil positioning and orientations) setup have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte-Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to induced field and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed "ground truth". This framework can be leveraged in order to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance in all applications of TMS, both investigative and therapeutic.

  5. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    Science.gov (United States)

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  6. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    Science.gov (United States)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  7. Quantifying uncertainty of geological 3D layer models, constructed with a-priori geological expertise

    NARCIS (Netherlands)

    Gunnink, J.J.; Maljers, D.; Hummelman, J.

    2010-01-01

    Uncertainty quantification of geological models that are constructed with additional geological expert-knowledge is not straightforward. To construct sound geological 3D layer models we use a lot of additional knowledge, with an uncertainty that is hard to quantify. Examples of geological expert

  8. Quantifying uncertainty in Gulf of Mexico forecasts stemming from uncertain initial conditions

    KAUST Repository

    Iskandarani, Mohamed; Le Hénaff, Matthieu; Srinivasan, Ashwanth; Knio, Omar

    2016-01-01

    Polynomial Chaos (PC) methods are used to quantify the impacts of initial conditions uncertainties on oceanic forecasts of the Gulf of Mexico circulation. Empirical Orthogonal Functions are used as initial conditions perturbations with their modal

  9. Quantifying the uncertainty of wave energy conversion device cost for policy appraisal: An Irish case study

    International Nuclear Information System (INIS)

    Farrell, Niall; Donoghue, Cathal O’; Morrissey, Karyn

    2015-01-01

    Wave Energy Conversion (WEC) devices are at a pre-commercial stage of development with feasibility studies sensitive to uncertainties surrounding assumed input costs. This may affect decision making. This paper analyses the impact these uncertainties may have on investor, developer and policymaker decisions using an Irish case study. Calibrated to data present in the literature, a probabilistic methodology is shown to be an effective means to carry this out. Value at Risk (VaR) and Conditional Value at Risk (CVaR) metrics are used to quantify the certainty of achieving a given cost or return on investment. We analyse the certainty of financial return provided by the proposed Irish Feed-in Tariff (FiT) policy. The influence of cost reduction through bulk discount is also discussed, with cost reduction targets for developers identified. Uncertainty is found to have a greater impact on the profitability of smaller installations and those subject to lower rates of cost reduction. This paper emphasises that a premium is required to account for cost uncertainty when setting FiT rates. By quantifying uncertainty, a means to specify an efficient premium is presented. - Highlights: • Probabilistic model quantifies uncertainty for wave energy feasibility analyses. • Methodology presented and applied to an Irish case study. • A feed-in tariff premium of 3–4 c/kWh required to account for cost uncertainty. • Sensitivity of uncertainty and cost to rates of technological change analysed. • Use of probabilistic model for investors and developers also demonstrated
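
    A sketch of how VaR and CVaR summarise cost uncertainty for a wave-energy project. All input distributions and parameter values below are illustrative assumptions rather than the Irish case-study inputs; the point is the mechanics of turning sampled costs into a levelised cost distribution and its tail metrics.

    ```python
    # Sketch: Monte Carlo cost uncertainty -> LCOE distribution -> VaR / CVaR.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 50_000

    # Assumed uncertain inputs for a small WEC installation
    capex = rng.lognormal(mean=np.log(4.0e6), sigma=0.25, size=n)   # EUR
    opex = rng.lognormal(mean=np.log(1.5e5), sigma=0.20, size=n)    # EUR/year
    capacity_factor = rng.uniform(0.20, 0.35, size=n)
    rated_mw, lifetime, rate = 1.0, 20, 0.08

    # Simple LCOE per draw: annualised capex plus opex over annual energy
    crf = rate * (1 + rate) ** lifetime / ((1 + rate) ** lifetime - 1)
    energy_mwh = rated_mw * capacity_factor * 8760.0
    lcoe = (capex * crf + opex) / energy_mwh                        # EUR/MWh

    # 95% VaR: cost not exceeded with 95% certainty; CVaR: mean cost beyond it
    var95 = np.percentile(lcoe, 95)
    cvar95 = lcoe[lcoe >= var95].mean()
    print(f"median LCOE {np.median(lcoe):.0f} EUR/MWh, "
          f"VaR95 {var95:.0f}, CVaR95 {cvar95:.0f}")
    ```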

  10. Quantifying uncertainties of seismic Bayesian inversion of Northern Great Plains

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Elastic waves excited by earthquakes are the fundamental observations of seismological studies. Seismologists measure information such as travel time, amplitude, and polarization to infer the properties of the earthquake source, seismic wave propagation, and subsurface structure. Across numerous applications, seismic imaging has been able to take advantage of complementary seismic observables to constrain profiles and lateral variations of Earth's elastic properties. Moreover, seismic imaging plays a unique role in multidisciplinary studies of geoscience by providing direct constraints on the unreachable interior of the Earth. Accurate quantification of uncertainties of inferences made from seismic observations is of paramount importance for interpreting seismic images and testing geological hypotheses. However, such quantification remains challenging and subjective due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we apply a reversible jump Markov chain Monte Carlo (rjMcMC) algorithm for a transdimensional Bayesian inversion of continental lithosphere structure. Such inversion allows us to quantify the uncertainties of inversion results by inverting for an ensemble solution. It also yields an adaptive parameterization that enables simultaneous inversion of different elastic properties without imposing strong prior information on the relationship between them. We present retrieved profiles of shear velocity (Vs) and radial anisotropy in the Northern Great Plains using measurements from USArray stations. We use both seismic surface wave dispersion and receiver function data due to their complementary constraints of lithosphere structure. Furthermore, we analyze the uncertainties of both individual and joint inversion of those two data types to quantify the benefit of doing joint inversion. As an application, we infer the variation of Moho depths and crustal layering across the northern Great Plains.

  11. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate the performance and accuracy of the approach and its limitations.
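
    The method-of-characteristics idea is easy to sketch for a one-dimensional ODE: the density transported along each trajectory satisfies d(rho)/dt = -rho * df/dx, so a single extra state variable yields the density at any time without Monte Carlo sampling. The logistic ODE and the Gaussian initial-condition uncertainty below are assumed examples, not from the paper.

    ```python
    # Sketch: propagate an initial-condition density along ODE characteristics by
    # augmenting the state with the density value itself.
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.stats import norm

    k, K = 0.8, 10.0                                # assumed logistic parameters

    def rhs(t, z):
        x, rho = z
        f = k * x * (1.0 - x / K)                   # dx/dt
        dfdx = k * (1.0 - 2.0 * x / K)              # divergence of the flow
        return [f, -rho * dfdx]                     # transport of the density

    # Uncertain initial condition x0 ~ N(1.0, 0.2^2): follow a fan of characteristics
    x0_grid = np.linspace(0.3, 1.7, 41)
    rho0 = norm.pdf(x0_grid, loc=1.0, scale=0.2)

    t_end = 4.0
    xt, rho_t = [], []
    for x0, r0 in zip(x0_grid, rho0):
        sol = solve_ivp(rhs, (0.0, t_end), [x0, r0], rtol=1e-8)
        xt.append(sol.y[0, -1])
        rho_t.append(sol.y[1, -1])

    xt, rho_t = np.array(xt), np.array(rho_t)
    mass = np.sum(0.5 * (rho_t[1:] + rho_t[:-1]) * np.diff(xt))
    print(f"density at t={t_end} integrates to {mass:.3f} (should stay close to 1)")
    print(f"mode of x(t={t_end}) near {xt[np.argmax(rho_t)]:.2f}")
    ```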

  12. Quantifying uncertainty and trade-offs in resilience assessments

    Directory of Open Access Journals (Sweden)

    Craig R. Allen

    2018-03-01

    Full Text Available Several frameworks have been developed to assess the resilience of social-ecological systems, but most require substantial data inputs, time, and technical expertise. Stakeholders and practitioners often lack the resources for such intensive efforts. Furthermore, most end with problem framing and fail to explicitly address trade-offs and uncertainty. To remedy this gap, we developed a rapid survey assessment that compares the relative resilience of social-ecological systems with respect to a number of resilience properties. This approach generates large amounts of information relative to stakeholder inputs. We targeted four stakeholder categories: government (policy, regulation, management), end users (farmers, ranchers, landowners, industry), agency/public science (research, university, extension), and NGOs (environmental, citizen, social justice) in four North American watersheds, to assess social-ecological resilience through surveys. Conceptually, social-ecological systems are comprised of components ranging from strictly human to strictly ecological, but that relate directly or indirectly to one another. They have soft boundaries and several important dimensions or axes that together describe the nature of social-ecological interactions, e.g., variability, diversity, modularity, slow variables, feedbacks, capital, innovation, redundancy, and ecosystem services. There is no absolute measure of resilience, so our design takes advantage of cross-watershed comparisons and therefore focuses on relative resilience. Our approach quantifies and compares the relative resilience across watershed systems and potential trade-offs among different aspects of the social-ecological system, e.g., between social, economic, and ecological contributions. This approach permits explicit assessment of several types of uncertainty (e.g., self-assigned uncertainty for stakeholders; uncertainty across respondents, watersheds, and subsystems, and subjectivity in

  13. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  14. A review on the CIRCE methodology to quantify the uncertainty of the physical models of a code

    International Nuclear Information System (INIS)

    Jeon, Seong Su; Hong, Soon Joon; Bang, Young Seok

    2012-01-01

    In the field of nuclear engineering, recent regulatory audit calculations of large break loss of coolant accidents (LBLOCA) have been performed with best estimate codes such as MARS, RELAP5 and CATHARE. Since a credible regulatory audit calculation is very important in the evaluation of the safety of a nuclear power plant (NPP), there has been much research to develop rules and methodologies for the use of best estimate codes. One of the major points is to develop the best estimate plus uncertainty (BEPU) method for uncertainty analysis. As a representative BEPU method, the NRC proposes the CSAU (Code Scaling, Applicability and Uncertainty) methodology, which clearly identifies the different steps necessary for an uncertainty analysis. The general idea is 1) to determine all the sources of uncertainty in the code, also called basic uncertainties, 2) quantify them and 3) combine them in order to obtain the final uncertainty for the studied application. Using an uncertainty analysis such as the CSAU methodology, an uncertainty band for the code response (calculation result) that is important from the safety point of view is calculated and the safety margin of the NPP is quantified. An example of such a response is the peak cladding temperature (PCT) for a LBLOCA. However, there is a problem in the uncertainty analysis with best estimate codes. Generally, it is very difficult to determine the uncertainties due to the empiricism of closure laws (also called correlations or constitutive relationships). So far the only proposed approach is based on expert judgment. In this case, the uncertainty range of important parameters can be wide and inaccurate, so that the confidence level of the BEPU calculation results can be decreased. In order to solve this problem, CEA (France) recently proposed a statistical method of data analysis, called CIRCE. The CIRCE method is intended to quantify the uncertainties of the correlations of a code. It may replace the expert judgment

  15. Quantifying Uncertainty in Flood Inundation Mapping Using Streamflow Ensembles and Multiple Hydraulic Modeling Techniques

    Science.gov (United States)

    Hosseiny, S. M. H.; Zarzar, C.; Gomez, M.; Siddique, R.; Smith, V.; Mejia, A.; Demir, I.

    2016-12-01

    The National Water Model (NWM) provides a platform for operationalizing nationwide flood inundation forecasting and mapping. The ability to model flood inundation on a national scale will provide invaluable information to decision makers and local emergency officials. Often, forecast products use deterministic model output to provide a visual representation of a single inundation scenario, which is subject to uncertainty from various sources. While this provides a straightforward representation of the potential inundation, the inherent uncertainty associated with the model output should be considered to optimize this tool for decision making support. The goal of this study is to produce ensembles of future flood inundation conditions (i.e. extent, depth, and velocity) to spatially quantify and visually assess uncertainties associated with the predicted flood inundation maps. The setting for this study is a highly urbanized watershed along Darby Creek in Pennsylvania. A forecasting framework coupling the NWM with multiple hydraulic models was developed to produce a suite of ensembles of future flood inundation predictions. Time lagged ensembles from the NWM short range forecasts were used to account for uncertainty associated with the hydrologic forecasts. The forecasts from the NWM were input to the iRIC and HEC-RAS two-dimensional software packages, from which water extent, depth, and flow velocity were output. Quantifying the agreement between output ensembles for each forecast grid provided the uncertainty metrics for predicted flood water inundation extent, depth, and flow velocity. For visualization, a series of flood maps that display flood extent, water depth, and flow velocity along with the underlying uncertainty associated with each of the forecasted variables was produced. The results from this study demonstrate the potential to incorporate and visualize model uncertainties in flood inundation maps in order to identify the high flood risk zones.
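
    A sketch of how agreement across the ensemble can be turned into gridded uncertainty products: per-cell inundation probability and per-cell spread of water depth across members. The depth grids below are synthetic stand-ins for iRIC/HEC-RAS output, and all dimensions and numbers are assumptions.

    ```python
    # Sketch: per-cell agreement metrics across an ensemble of flood-depth grids.
    import numpy as np

    rng = np.random.default_rng(6)
    ny, nx, n_members = 50, 80, 12

    # Synthetic ensemble: a flood whose extent and depth vary between members
    channel = np.maximum(0.0, 2.0 - 0.05 * np.abs(np.arange(nx) - 40))   # along-channel profile
    depth = np.stack([
        np.maximum(0.0, channel + rng.normal(0.0, 0.3) - 0.02 * np.arange(ny)[:, None])
        for _ in range(n_members)])                                      # (member, y, x)

    inundated = depth > 0.0
    prob_inundation = inundated.mean(axis=0)          # fraction of members flooding each cell
    depth_spread = depth.std(axis=0, ddof=1)          # per-cell depth uncertainty (m)

    print(f"cells flooded in every member : {(prob_inundation == 1.0).mean():.2%}")
    print(f"cells with disputed extent    : "
          f"{((prob_inundation > 0) & (prob_inundation < 1)).mean():.2%}")
    print(f"max depth spread              : {depth_spread.max():.2f} m")
    ```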

  16. Quantifying and predicting interpretational uncertainty in cross-sections

    Science.gov (United States)

    Randle, Charles; Bond, Clare; Monaghan, Alison; Lark, Murray

    2015-04-01

    Cross-sections are often constructed from data to create a visual impression of the geologist's interpretation of the sub-surface geology. However, as with all interpretations, this vision of the sub-surface geology is uncertain. We have designed and carried out an experiment with the aim of quantifying the uncertainty in geological cross-sections created by experts interpreting borehole data. By analysing different attributes of the data and interpretations we reflect on the main controls on uncertainty. A group of ten expert modellers at the British Geological Survey were asked to interpret an 11.4 km long cross-section from south-east Glasgow, UK. The data provided consisted of map and borehole data of the superficial deposits and shallow bedrock. Each modeller had a unique set of 11 boreholes removed from their dataset, to which their interpretations of the top of the bedrock were compared. This methodology allowed quantification of how far from the 'correct answer' each interpretation is at 11 points along each interpreted cross-section line, through comparison of the interpreted and actual bedrock elevations in the boreholes. This resulted in the collection of 110 measurements of the error to use in further analysis. To determine the potential controls on uncertainty, various attributes relating to the modeller, the interpretation and the data were recorded. Modellers were asked to fill out a questionnaire asking for information, such as how much 3D modelling experience they had, and how long it took them to complete the interpretation. They were also asked to record their confidence in their interpretations graphically, in the form of a confidence level drawn onto the cross-section. Initial analysis showed the majority of the experts' interpreted bedrock elevations to be within 5 metres of those recorded in the withheld boreholes. Their distribution is peaked and symmetrical about a mean of zero, indicating that there was no tendency for the experts to either under

  17. New strategies for quantifying and propagating nuclear data uncertainty in CUSA

    International Nuclear Information System (INIS)

    Zhao, Qiang; Zhang, Chunyan; Hao, Chen; Li, Fu; Wang, Dongyong; Yu, Yan

    2016-01-01

    Highlights: • An efficient sampling method based on LHS combined with Cholesky decomposition conversion is proposed. • A code for generating multi-group covariance matrices has been developed. • The uncertainty and sensitivity results of CUSA agree well with TSUNAMI-1D. - Abstract: The uncertainties of nuclear cross sections are propagated to the key parameters of the nuclear reactor core through transport calculations. The statistical sampling method can be used to quantify and propagate nuclear data uncertainty in nuclear reactor physics calculations. In order to use the statistical sampling method, two key technical problems, the method of generating multi-group covariance matrices and the sampling method, should be considered reasonably and efficiently. In this paper, a method of transforming a nuclear cross section covariance matrix in multi-group form into users' group structures based on the flat-flux approximation has been studied in depth. And most notably, an efficient sampling method has been proposed, which is based on Latin Hypercube Sampling (LHS) combined with Cholesky decomposition conversion. Based on those methods, two modules named T-COCCO and GUIDE have been developed and have been successfully added into the code for uncertainty and sensitivity analysis (CUSA). The new modules have been verified respectively. Numerical results for the TMI-1 pin-cell case are presented and compared to TSUNAMI-1D. The comparison of the results further supports that the methods and the computational tool developed in this work can be used to conduct sensitivity and uncertainty analysis for nuclear cross sections.
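
    The sampling scheme described in the abstract can be sketched as follows: Latin Hypercube samples are converted to standard normals and then correlated through the Cholesky factor of the covariance matrix. The 4-group covariance matrix and nominal cross sections below are illustrative assumptions, not evaluated nuclear data, and this is not the CUSA implementation.

    ```python
    # Sketch: LHS combined with a Cholesky-decomposition conversion to draw
    # correlated perturbations of a multi-group cross section.
    import numpy as np
    from scipy.stats import norm, qmc

    # Assumed relative covariance matrix for a 4-group cross section (unitless)
    cov = np.array([[4.0, 1.2, 0.6, 0.2],
                    [1.2, 3.0, 0.9, 0.3],
                    [0.6, 0.9, 2.5, 0.8],
                    [0.2, 0.3, 0.8, 2.0]]) * 1e-4
    xs_nominal = np.array([1.80, 1.35, 0.95, 0.60])     # assumed group cross sections

    n_samples = 200
    sampler = qmc.LatinHypercube(d=cov.shape[0], seed=7)
    u = sampler.random(n_samples)                        # stratified uniforms in (0, 1)
    z = norm.ppf(u)                                      # independent standard normals

    L = np.linalg.cholesky(cov)                          # conversion to correlated space
    perturb = z @ L.T                                    # relative perturbations
    xs_samples = xs_nominal * (1.0 + perturb)            # perturbed cross-section sets

    print("sample vs. target variance ratio per group:")
    print(np.cov(perturb, rowvar=False).diagonal() / cov.diagonal())
    ```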

  18. New strategies for quantifying and propagating nuclear data uncertainty in CUSA

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Qiang; Zhang, Chunyan [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Hao, Chen, E-mail: haochen.heu@163.com [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); Li, Fu [Institute of Nuclear and New Energy Technology(INET), Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Tsinghua University, Beijing (China); Wang, Dongyong [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China); School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an (China); Yu, Yan [Fundamental Science on Nuclear Safety and Simulation Technology Laboratory, College of Nuclear Science and Technology, Harbin Engineering University, Harbin (China)

    2016-10-15

    Highlights: • An efficient sampling method based on LHS combined with Cholesky decomposition conversion is proposed. • A code for generating multi-group covariance matrices has been developed. • The uncertainty and sensitivity results of CUSA agree well with TSUNAMI-1D. - Abstract: The uncertainties of nuclear cross sections are propagated to the key parameters of the nuclear reactor core through transport calculations. The statistical sampling method can be used to quantify and propagate nuclear data uncertainty in nuclear reactor physics calculations. In order to use the statistical sampling method, two key technical problems, the method of generating multi-group covariance matrices and the sampling method, should be considered reasonably and efficiently. In this paper, a method of transforming a nuclear cross section covariance matrix in multi-group form into users' group structures based on the flat-flux approximation has been studied in depth. And most notably, an efficient sampling method has been proposed, which is based on Latin Hypercube Sampling (LHS) combined with Cholesky decomposition conversion. Based on those methods, two modules named T-COCCO and GUIDE have been developed and have been successfully added into the code for uncertainty and sensitivity analysis (CUSA). The new modules have been verified respectively. Numerical results for the TMI-1 pin-cell case are presented and compared to TSUNAMI-1D. The comparison of the results further supports that the methods and the computational tool developed in this work can be used to conduct sensitivity and uncertainty analysis for nuclear cross sections.

  19. The accountability imperative for quantifying the uncertainty of emission forecasts: evidence from Mexico

    DEFF Research Database (Denmark)

    Puig, Daniel; Morales-Nápoles, Oswaldo; Bakhtiari, Fatemeh

    2017-01-01

    forecasting approaches can reflect prevailing uncertainties. We apply a transparent and replicable method to quantify the uncertainty associated with projections of gross domestic product growth rates for Mexico, a key driver of GHG emissions in the country. We use those projections to produce probabilistic...... forecasts of GHG emissions for Mexico. We contrast our probabilistic forecasts with Mexico’s governmental deterministic forecasts. We show that, because they fail to reflect such key uncertainty, deterministic forecasts are ill-suited for use in target-setting processes. We argue that (i) guidelines should...... be agreed upon, to ensure that governmental forecasts meet certain minimum transparency and quality standards, and (ii) governments should be held accountable for the appropriateness of the forecasting approach applied to prepare governmental forecasts, especially when those forecasts are used to derive...

  20. Quantifying the contribution of the root system of alpine vegetation in the soil aggregate stability of moraine

    Directory of Open Access Journals (Sweden)

    Csilla Hudek

    2017-03-01

    Full Text Available One fifth of the world's population lives in mountains or their surrounding areas. This anthropogenic pressure continues to grow with the increasing number of settlements, especially in areas connected to touristic activities, such as the Italian Alps. The process of soil formation on high mountains is particularly slow, and these soils are particularly vulnerable to degradation. In alpine regions, extreme meteorological events are increasingly frequent due to climate change, speeding up soil degradation and increasing the number of severe erosion processes, shallow landslides and debris flows. Vegetation cover plays a crucial role in the stabilization of mountain soils, thereby reducing the risk of natural hazards affecting downslope areas. Soil aggregate stability is one of the main soil properties that can be linked to soil loss processes. Soils developed on moraines in recently deglaciated areas typically have low levels of soil aggregation and a limited or discontinuous vegetation cover, making them more susceptible to degradation. However, soil structure can be influenced by the root system of the vegetation. Roots are actively involved in the formation of water-stable soil aggregates, increasing the stability of the soil and its nutrient content. In the present study, we aim to quantify the effect of the root system of alpine vegetation on the soil aggregate stability of the forefield of the Lys glacier in the Aosta Valley (NW Italy). This proglacial area provides the opportunity to study how the root systems of ten pioneer alpine species from different successional stages can contribute to soil development and soil stabilization. To quantify the aggregate stability of root-permeated soils, a modified wet sieving method was employed. The root length per soil volume of the different species was also determined and later correlated with the aggregate stability results. The results showed that soil aggregate

  1. SU-E-J-159: Intra-Patient Deformable Image Registration Uncertainties Quantified Using the Distance Discordance Metric

    International Nuclear Information System (INIS)

    Saleh, Z; Thor, M; Apte, A; Deasy, J; Sharp, G; Muren, L

    2014-01-01

    Purpose: The quantitative evaluation of deformable image registration (DIR) is currently challenging due to the lack of a ground truth. In this study we test a new method proposed for quantifying multiple-image based DIR-related uncertainties, for DIR of pelvic images. Methods: 19 patients who had previously received radiotherapy for prostate cancer were analyzed, each with 6 CT scans. Structures for the rectum and bladder, which served as ground truth, were manually delineated on the planning CT and each subsequent scan. For each patient, voxel-by-voxel DIR-related uncertainties were evaluated, following B-spline based DIR, by applying a previously developed metric, the distance discordance metric (DDM; Saleh et al., PMB (2014) 59:733). The DDM map was superimposed on the first acquired CT scan and DDM statistics were assessed, also relative to two metrics estimating the agreement between the propagated and the manually delineated structures. Results: The highest DDM values, which correspond to the greatest spatial uncertainties, were observed near the body surface and in the bowel due to the presence of gas. The mean rectal and bladder DDM values ranged from 1.1–11.1 mm and 1.5–12.7 mm, respectively. There was a strong correlation in the DDMs between the rectum and bladder (Pearson R = 0.68 for the max DDM). For both structures, DDM was correlated with the ratio between the DIR-propagated and manually delineated volumes (R = 0.74 for the max rectal DDM). The maximum rectal DDM was negatively correlated with the Dice Similarity Coefficient between the propagated and the manually delineated volumes (R = −0.52). Conclusion: The multiple-image based DDM map quantified considerable DIR variability across different structures and among patients. Besides using the DDM for quantifying DIR-related uncertainties, it could potentially be used to adjust for uncertainties in DIR-based accumulated dose distributions.

  2. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Quantifying geological uncertainty for flow and transport modeling in multi-modal heterogeneous formations

    Science.gov (United States)

    Feyen, Luc; Caers, Jef

    2006-06-01

    In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Hereby, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport

  4. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    Science.gov (United States)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to the location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods exist that address specific sources of fault network uncertainty and complexities of fault modeling, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov Chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough & Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, where faults are only partially visible and many faults are missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone tectonics similar to those of the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed

  5. Testing methodologies for quantifying physical models uncertainties. A comparative exercise using CIRCE and IPREM (FFTBM)

    Energy Technology Data Exchange (ETDEWEB)

    Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu

    2016-08-15

    Highlights: • Uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal hydraulics system codes. • Comparison of CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences in their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.

  6. Quantifying uncertainties of permafrost carbon–climate feedbacks

    Directory of Open Access Journals (Sweden)

    E. J. Burke

    2017-06-01

    Full Text Available The land surface models JULES (Joint UK Land Environment Simulator, two versions) and ORCHIDEE-MICT (Organizing Carbon and Hydrology in Dynamic Ecosystems), each with a revised representation of permafrost carbon, were coupled to the Integrated Model Of Global Effects of climatic aNomalies (IMOGEN) intermediate-complexity climate and ocean carbon uptake model. IMOGEN calculates atmospheric carbon dioxide (CO2) and local monthly surface climate for a given emission scenario with the land–atmosphere CO2 flux exchange from either JULES or ORCHIDEE-MICT. These simulations include feedbacks associated with permafrost carbon changes in a warming world. Both IMOGEN–JULES and IMOGEN–ORCHIDEE-MICT were forced by historical and three alternative future-CO2-emission scenarios. Those simulations were performed for different climate sensitivities and regional climate change patterns based on 22 different Earth system models (ESMs) used for CMIP3 (phase 3 of the Coupled Model Intercomparison Project), allowing us to explore climate uncertainties in the context of permafrost carbon–climate feedbacks. Three future emission scenarios consistent with three representative concentration pathways were used: RCP2.6, RCP4.5 and RCP8.5. Paired simulations with and without frozen carbon processes were required to quantify the impact of the permafrost carbon feedback on climate change. The additional warming from the permafrost carbon feedback is between 0.2 and 12 % of the change in the global mean temperature (ΔT) by the year 2100 and 0.5 and 17 % of ΔT by 2300, with these ranges reflecting differences in land surface models, climate models and emissions pathway. As a percentage of ΔT, the permafrost carbon feedback has a greater impact on the low-emissions scenario (RCP2.6) than on the higher-emissions scenarios, suggesting that permafrost carbon should be taken into account when evaluating scenarios of heavy mitigation and stabilization

  7. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1995-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the authors have developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites

  8. Data Fusion: A decision analysis tool that quantifies geological and parametric uncertainty

    International Nuclear Information System (INIS)

    Porter, D.W.

    1996-01-01

    Engineering projects such as siting waste facilities and performing remediation are often driven by geological and hydrogeological uncertainties. Geological understanding and hydrogeological parameters such as hydraulic conductivity are needed to achieve reliable engineering design. Information from non-invasive and minimally invasive data sets offers potential for reduction in uncertainty, but a single data type does not usually meet all needs. Data Fusion uses Bayesian statistics to update prior knowledge with information from diverse data sets as the data is acquired. Prior knowledge takes the form of first principles models (e.g., groundwater flow) and spatial continuity models for heterogeneous properties. The variability of heterogeneous properties is modeled in a form motivated by statistical physics as a Markov random field. A computer reconstruction of targets of interest is produced within a quantified statistical uncertainty. The computed uncertainty provides a rational basis for identifying data gaps for assessing data worth to optimize data acquisition. Further, the computed uncertainty provides a way to determine the confidence of achieving adequate safety margins in engineering design. Beyond design, Data Fusion provides the basis for real time computer monitoring of remediation. Working with the DOE Office of Technology (OTD), the author has developed and patented a Data Fusion Workstation system that has been used on jobs at the Hanford, Savannah River, Pantex and Fernald DOE sites. Further applications include an army depot at Letterkenney, PA and commercial industrial sites
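
    To make the Bayesian updating idea concrete (a generic conjugate-Gaussian sketch, not the patented Data Fusion Workstation), the example below sequentially updates a prior on log hydraulic conductivity with noisy measurements from two data types of different quality; all numbers are hypothetical.

        import numpy as np

        def gaussian_update(prior_mean, prior_var, obs, obs_var):
            """Conjugate Gaussian update of a scalar parameter with one noisy observation."""
            k = prior_var / (prior_var + obs_var)       # weight given to the data
            return prior_mean + k * (obs - prior_mean), (1.0 - k) * prior_var

        # Prior knowledge on log10 hydraulic conductivity (hypothetical values)
        mean, var = -4.0, 1.0
        # Two data sets of different quality, assimilated as they are acquired
        for obs, obs_var in [(-4.6, 0.5), (-4.3, 0.1)]:
            mean, var = gaussian_update(mean, var, obs, obs_var)
            print(f"posterior mean {mean:.2f}, posterior std {var**0.5:.2f}")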

  9. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  10. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  11. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    Energy Technology Data Exchange (ETDEWEB)

    Olea, Ricardo A.; Luppens, James A.; Tewalt, Susan J. [U.S. Geological Survey, Reston, VA (United States)

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. (author)

  12. Methodology for quantifying uncertainty in coal assessments with an application to a Texas lignite deposit

    Science.gov (United States)

    Olea, R.A.; Luppens, J.A.; Tewalt, S.J.

    2011-01-01

    A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.

  13. Uncertainty Management of Dynamic Tariff Method for Congestion Management in Distribution Networks

    DEFF Research Database (Denmark)

    Huang, Shaojun; Wu, Qiuwei; Cheng, Lin

    2016-01-01

    The dynamic tariff (DT) method is designed for the distribution system operator (DSO) to alleviate congestions that might occur in a distribution network with high penetration of distributed energy resources (DERs). Uncertainty management is required for the decentralized DT method because the DT...... is determined based on optimal day-ahead energy planning with forecasted parameters such as day-ahead energy prices and energy needs which might be different from the parameters used by aggregators. The uncertainty management is to quantify and mitigate the risk of the congestion when employing...

  14. The accountability imperative for quantifying the uncertainty of emission forecasts: evidence from Mexico

    International Nuclear Information System (INIS)

    Puig, Daniel; Bakhtiari, Fatemeh; Morales-Napoles, Oswaldo; Landa Rivera, Gissela

    2017-01-01

    Governmental climate change mitigation targets are typically developed with the aid of forecasts of greenhouse-gas emissions. The robustness and credibility of such forecasts depends, among other issues, on the extent to which forecasting approaches can reflect prevailing uncertainties. We apply a transparent and replicable method to quantify the uncertainty associated with projections of gross domestic product growth rates for Mexico, a key driver of greenhouse-gas emissions in the country. We use those projections to produce probabilistic forecasts of greenhouse-gas emissions for Mexico. We contrast our probabilistic forecasts with Mexico's governmental deterministic forecasts. We show that, because they fail to reflect such key uncertainty, deterministic forecasts are ill-suited for use in target-setting processes. We argue that (i) guidelines should be agreed upon, to ensure that governmental forecasts meet certain minimum transparency and quality standards, and (ii) governments should be held accountable for the appropriateness of the forecasting approach applied to prepare governmental forecasts, especially when those forecasts are used to derive climate change mitigation targets. (authors)
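
    A minimal sketch of the kind of probabilistic forecast described here, assuming a simple emissions-intensity relation and a hypothetical distribution for GDP growth (the study itself derives the growth uncertainty with a formal, expert-based method):

        import numpy as np

        rng = np.random.default_rng(0)
        n_draws, horizon = 10_000, 15        # Monte Carlo sample size, forecast horizon in years
        base_emissions = 700.0               # MtCO2e in the base year (hypothetical)
        elasticity = 0.8                     # emissions growth per unit of GDP growth (hypothetical)

        # Uncertain annual GDP growth: mean 2.5 %, std 1.5 % (hypothetical distribution)
        gdp_growth = rng.normal(0.025, 0.015, size=(n_draws, horizon))

        # Propagate GDP-growth uncertainty to an emissions trajectory for each draw
        emissions = base_emissions * np.cumprod(1.0 + elasticity * gdp_growth, axis=1)

        # Probabilistic forecast: median and 90 % interval in the final year
        q05, q50, q95 = np.percentile(emissions[:, -1], [5, 50, 95])
        print(f"median {q50:.0f} MtCO2e, 90% interval [{q05:.0f}, {q95:.0f}]")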

  15. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, forecasts are always accompanied by uncertainties, which may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide reliable streamflow forecasts. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Therefore, streamflow forecasting entails modeling seasonality, periodicity, and the correlation structure, and assessing uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model and the forecasted results are measured separately using entropy. With information theory, how these uncertainties are transported and aggregated during these processes will be described.
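
    As a small generic illustration of measuring forecast uncertainty with entropy (not the spectral-entropy procedure of the abstract itself), the snippet below computes the Shannon entropy of a discretized forecast ensemble; a sharper forecast distribution yields a lower entropy.

        import numpy as np
        from scipy.stats import entropy

        def forecast_entropy(ensemble, bin_edges):
            """Shannon entropy (nats) of a histogram estimate of the forecast distribution."""
            counts, _ = np.histogram(ensemble, bins=bin_edges)
            return entropy(counts / counts.sum())

        rng = np.random.default_rng(1)
        edges = np.linspace(0.0, 200.0, 41)                    # common discretization of flow, m^3/s
        sharp_forecast = rng.normal(100.0, 5.0, size=5000)     # hypothetical flow forecasts
        vague_forecast = rng.normal(100.0, 25.0, size=5000)
        print(forecast_entropy(sharp_forecast, edges), forecast_entropy(vague_forecast, edges))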

  16. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has

  17. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
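
    A bare-bones sketch of a single stochastic ensemble Kalman analysis step, the kind of update the framework above iterates over the uncertain Reynolds stress parameters (the generic form below is an assumption; the paper's iterative variant differs in its details):

        import numpy as np

        def enkf_analysis(X, Y, d, R, rng=None):
            """One stochastic ensemble Kalman analysis step.

            X : (n_state, n_ens) ensemble of uncertain parameters or states
            Y : (n_obs, n_ens)   predicted observations for each member
            d : (n_obs,)         observed data
            R : (n_obs, n_obs)   observation-error covariance
            """
            rng = np.random.default_rng(rng)
            n_ens = X.shape[1]
            Xa = X - X.mean(axis=1, keepdims=True)
            Ya = Y - Y.mean(axis=1, keepdims=True)
            Cxy = Xa @ Ya.T / (n_ens - 1)          # state-observation covariance
            Cyy = Ya @ Ya.T / (n_ens - 1)          # predicted-observation covariance
            K = Cxy @ np.linalg.inv(Cyy + R)       # Kalman gain
            D = d[:, None] + rng.multivariate_normal(np.zeros(len(d)), R, n_ens).T
            return X + K @ (D - Y)                 # updated (posterior) ensemble

        # Tiny demo: infer a scalar parameter observed directly with noise (hypothetical numbers)
        rng = np.random.default_rng(5)
        X = rng.normal(0.0, 1.0, size=(1, 50))     # prior ensemble of the parameter
        Y = X.copy()                               # identity "forward model"
        X_post = enkf_analysis(X, Y, d=np.array([0.8]), R=np.array([[0.05]]), rng=5)
        print(X.mean(), X_post.mean())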

  18. Information Aggregation in Organizations

    OpenAIRE

    Schulte, Elisabeth

    2006-01-01

    This dissertation contributes to the analysis of information aggregation procedures within organizations. Facing uncertainty about the consequences of a collective decision, information has to be aggregated before making a choice. Two main questions are addressed. Firstly, how well is an organization suited for the aggregation of decision-relevant information? Secondly, how should an organization be designed in order to aggregate information efficiently? The main part deals with information a...

  19. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.

    2016-03-02

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.

  20. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar

    2016-01-01

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.
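
    A toy version of the nonintrusive polynomial chaos idea, assuming a single standard-normal uncertain input and a cheap stand-in model (the study builds a multi-dimensional surrogate of the DeepC Oil Model from a 1331-member ensemble):

        import numpy as np
        from numpy.polynomial.hermite_e import hermevander
        from math import factorial

        rng = np.random.default_rng(2)

        def model(xi):
            """Cheap stand-in for an expensive simulator with one uncertain input."""
            return np.exp(0.3 * xi) + 0.1 * xi**2

        # Nonintrusive PCE: sample the input, run the model, fit chaos coefficients
        deg = 4
        xi = rng.standard_normal(200)              # samples of the uncertain input
        Psi = hermevander(xi, deg)                 # probabilists' Hermite basis He_0..He_4
        coeffs, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

        # Output statistics follow directly from the coefficients (E[He_k^2] = k!)
        mean = coeffs[0]
        var = sum(coeffs[k]**2 * factorial(k) for k in range(1, deg + 1))
        print(f"surrogate mean {mean:.3f}, std {var**0.5:.3f}")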

  1. A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.

  2. A review of different perspectives on uncertainty and risk and an alternative modeling paradigm

    International Nuclear Information System (INIS)

    Samson, Sundeep; Reneke, James A.; Wiecek, Margaret M.

    2009-01-01

    The literature in economics, finance, operations research, engineering and in general mathematics is first reviewed on the subject of defining uncertainty and risk. The review goes back to 1901. Different perspectives on uncertainty and risk are examined and a new paradigm to model uncertainty and risk is proposed using relevant ideas from this study. This new paradigm is used to represent, aggregate and propagate uncertainty and interpret the resulting variability in a challenge problem developed by Oberkampf et al. [2004, Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Safety 2004; 85(1): 11-9]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response quantified by a random function of the uncertainty

  3. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
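
    For a concrete instance of the kind of entropic relation discussed (the standard Maassen–Uffink bound, quoted here only as background and not as this paper's new measure), two observables with eigenbases {|x⟩} and {|z⟩} satisfy

        H(X) + H(Z) \ge \log_2 \frac{1}{c}, \qquad c = \max_{x,z} \bigl| \langle x | z \rangle \bigr|^2 ,

    where H is the Shannon entropy of the measurement outcome distribution; for mutually unbiased bases in dimension d the bound equals \log_2 d.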

  4. Enhanced Named Entity Extraction via Error-Driven Aggregation

    Energy Technology Data Exchange (ETDEWEB)

    Lemmond, T D; Perry, N C; Guensche, J W; Nitao, J J; Glaser, R E; Kidwell, P; Hanley, W G

    2010-02-22

    Despite recent advances in named entity extraction technologies, state-of-the-art extraction tools achieve insufficient accuracy rates for practical use in many operational settings. However, they are not generally prone to the same types of error, suggesting that substantial improvements may be achieved via appropriate combinations of existing tools, provided their behavior can be accurately characterized and quantified. In this paper, we present an inference methodology for the aggregation of named entity extraction technologies that is founded upon a black-box analysis of their respective error processes. This method has been shown to produce statistically significant improvements in extraction relative to standard performance metrics and to mitigate the weak performance of entity extractors operating under suboptimal conditions. Moreover, this approach provides a framework for quantifying uncertainty and has demonstrated the ability to reconstruct the truth when majority voting fails.

  5. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  6. WASH-1400: quantifying the uncertainties

    International Nuclear Information System (INIS)

    Erdmann, R.C.; Leverenz, F.L. Jr.; Lellouche, G.S.

    1981-01-01

    The purpose of this paper is to focus on the limitations of the WASH-1400 analysis in estimating the risk from light water reactors (LWRs). This assessment attempts to modify the quantification of the uncertainty in and estimate of risk as presented by the RSS (reactor safety study). 8 refs

  7. Extending Ripley's K-Function to Quantify Aggregation in 2-D Grayscale Images.

    Directory of Open Access Journals (Sweden)

    Mohamed Amgad

    Full Text Available In this work, we describe the extension of Ripley's K-function to allow for overlapping events at very high event densities. We show that problematic edge effects introduce significant bias to the function at very high densities and small radii, and propose a simple correction method that successfully restores the function's centralization. Using simulations of homogeneous Poisson distributions of events, as well as simulations of event clustering under different conditions, we investigate various aspects of the function, including its shape-dependence and correspondence between true cluster radius and radius at which the K-function is maximized. Furthermore, we validate the utility of the function in quantifying clustering in 2-D grayscale images using three modalities: (i) Simulations of particle clustering; (ii) Experimental co-expression of soluble and diffuse protein at varying ratios; (iii) Quantifying chromatin clustering in the nuclei of wt and crwn1 crwn2 mutant Arabidopsis plant cells, using a previously-published image dataset. Overall, our work shows that Ripley's K-function is a valid abstract statistical measure whose utility extends beyond the quantification of clustering of non-overlapping events. Potential benefits of this work include the quantification of protein and chromatin aggregation in fluorescent microscopic images. Furthermore, this function has the potential to become one of various abstract texture descriptors that are utilized in computer-assisted diagnostics in anatomic pathology and diagnostic radiology.
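
    A plain (uncorrected) estimator of Ripley's K for 2-D point data, shown only to anchor the statistic that the paper extends to overlapping events and grayscale images; edge correction and the overlap handling described above are omitted:

        import numpy as np
        from scipy.spatial.distance import pdist

        def ripley_k(points, radii, area):
            """Naive Ripley's K estimator for n points observed in a window of given area."""
            n = len(points)
            d = pdist(points)                      # all pairwise distances (each pair once)
            lam = n / area                         # point intensity
            # K(r): mean number of further events within r of an event, divided by intensity
            return np.array([2.0 * np.sum(d <= r) / (n * lam) for r in radii])

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 1, size=(500, 2))     # homogeneous Poisson-like pattern in a unit square
        radii = np.linspace(0.01, 0.2, 20)
        k_hat = ripley_k(pts, radii, area=1.0)
        # Under complete spatial randomness K(r) ~ pi r^2 (biased low here near the edges)
        print(k_hat[-1], np.pi * radii[-1]**2)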

  8. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  9. Treatment of uncertainties in the IPCC: a philosophical analysis

    Science.gov (United States)

    Jebeile, J.; Drouet, I.

    2014-12-01

    The IPCC produces scientific reports out of findings on climate and climate change. Because the findings are uncertain in many respects, the production of reports requires aggregating assessments of uncertainties of different kinds. This difficult task is currently regulated by the Guidance note for lead authors of the IPCC fifth assessment report on consistent treatment of uncertainties. The note recommends that two metrics, confidence and likelihood, be used for communicating the degree of certainty in findings. Confidence is expressed qualitatively "based on the type, amount, quality, and consistency of evidence […] and the degree of agreement", while likelihood is expressed probabilistically "based on statistical analysis of observations or model results, or expert judgment". Therefore, depending on the evidence evaluated, authors have the choice to present either an assigned level of confidence or a quantified measure of likelihood. But aggregating assessments of uncertainties of these two different kinds mixes distinct and conflicting methodologies. So the question arises whether the treatment of uncertainties in the IPCC is rationally justified. In order to answer the question, it is worth comparing the IPCC procedures with the formal normative theories of epistemic rationality which have been developed by philosophers. These theories, which include contributions to the philosophy of probability and to Bayesian probabilistic confirmation theory, are relevant for our purpose because they are commonly used to assess the rationality of collective judgement formation based on uncertain knowledge. In this paper we make the comparison and pursue the following objectives: (i) we determine whether the IPCC confidence and likelihood can be compared with the notions of uncertainty targeted by or underlying the formal normative theories of epistemic rationality; (ii) we investigate whether the formal normative theories of epistemic rationality justify

  10. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  11. Quantifying natural delta variability using a multiple-point geostatistics prior uncertainty model

    Science.gov (United States)

    Scheidt, Céline; Fernandes, Anjali M.; Paola, Chris; Caers, Jef

    2016-10-01

    We address the question of quantifying uncertainty associated with autogenic pattern variability in a channelized transport system by means of a modern geostatistical method. This question has considerable relevance for practical subsurface applications as well, particularly those related to uncertainty quantification relying on Bayesian approaches. Specifically, we show how the autogenic variability in a laboratory experiment can be represented and reproduced by a multiple-point geostatistical prior uncertainty model. The latter geostatistical method requires selection of a limited set of training images from which a possibly infinite set of geostatistical model realizations, mimicking the training image patterns, can be generated. To that end, we investigate two methods to determine how many training images and what training images should be provided to reproduce natural autogenic variability. The first method relies on distance-based clustering of overhead snapshots of the experiment; the second method relies on a rate of change quantification by means of a computer vision algorithm termed the demon algorithm. We show quantitatively that with either training image selection method, we can statistically reproduce the natural variability of the delta formed in the experiment. In addition, we study the nature of the patterns represented in the set of training images as a representation of the "eigenpatterns" of the natural system. The eigenpatterns in the training image sets display patterns consistent with previous physical interpretations of the fundamental modes of this type of delta system: a highly channelized, incisional mode; a poorly channelized, depositional mode; and an intermediate mode between the two.

  12. Quantifying uncertainties in precipitation: a case study from Greece

    Directory of Open Access Journals (Sweden)

    C. Anagnostopoulou

    2008-04-01

    Full Text Available The main objective of the present study was the examination and the quantification of the uncertainties in the precipitation time series over the Greek area, for a 42-year time period. The uncertainty index applied to the rainfall data is a combination (total) of the departures of the rainfall season length, of the median date of the accumulated percentages and of the total amounts of rainfall. Results of the study indicated that all the stations are characterized, on an average basis, by medium to high uncertainty. The stations that presented an increasing rainfall uncertainty were the ones located mainly in the continental parts of the study region. From the temporal analysis of the uncertainty index, it was demonstrated that the greatest percentage of the years, for all the stations' time series, was characterized by low to high uncertainty (intermediate categories of the index). Most of the results of the uncertainty index for the Greek region are similar to the corresponding results of various stations all over the European region.

  13. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
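
    The first-order "sandwich rule" that underlies this kind of sensitivity-based uncertainty estimate, written out with a hypothetical sensitivity vector and nuclear-data covariance matrix (not values from the LIFE blanket study):

        import numpy as np

        # Relative sensitivities of a figure of merit (e.g., k-eff) to three nuclear data
        # parameters, and their relative covariance matrix -- hypothetical numbers
        S = np.array([0.45, -0.12, 0.08])
        C = np.array([[2.5e-4, 4.0e-5, 0.0],
                      [4.0e-5, 1.0e-4, 2.0e-5],
                      [0.0,    2.0e-5, 6.0e-5]])

        # First-order propagation ("sandwich rule"): relative variance = S^T C S
        rel_var = S @ C @ S
        print(f"relative uncertainty: {100 * np.sqrt(rel_var):.3f} %")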

  14. Quantifying the sources of uncertainty in an ensemble of hydrological climate-impact projections

    Science.gov (United States)

    Aryal, Anil; Shrestha, Sangam; Babel, Mukand S.

    2018-01-01

    The objective of this paper is to quantify the various sources of uncertainty in the assessment of climate change impact on hydrology in the Tamakoshi River Basin, located in the north-eastern part of Nepal. Multiple climate and hydrological models were used to simulate future climate conditions and discharge in the basin. The simulated results of future climate and river discharge were analysed to quantify the sources of uncertainty using two-way and three-way ANOVA. The results showed that temperature and precipitation in the study area are projected to change in the near- (2010-2039), mid- (2040-2069) and far-future (2070-2099) periods. Maximum temperature is likely to rise by 1.75 °C under Representative Concentration Pathway (RCP) 4.5 and by 3.52 °C under RCP 8.5. Similarly, the minimum temperature is expected to rise by 2.10 °C under RCP 4.5 and by 3.73 °C under RCP 8.5 by the end of the twenty-first century. Similarly, the precipitation in the study area is expected to change by -2.15% under the RCP 4.5 and -2.44% under the RCP 8.5 scenarios. The future discharge in the study area was projected using two hydrological models, viz. the Soil and Water Assessment Tool (SWAT) and the Hydrologic Engineering Center's Hydrologic Modelling System (HEC-HMS). The discharge projected by the SWAT model is expected to change by only a small amount, whereas the HEC-HMS model projected considerably lower discharge in the future compared to the baseline period. The results also show that future climate variables and river hydrology contain uncertainty due to the choice of climate models, RCP scenarios, bias correction methods and hydrological models. During wet days, more uncertainty is observed due to the use of different climate models, whereas during dry days, the use of different hydrological models has a greater effect on uncertainty. Inter-comparison of the impacts of different climate models reveals that the REMO climate model shows higher uncertainty in the prediction of precipitation and
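
    A minimal two-way ANOVA-style variance partition of the kind used to attribute projection spread to climate models and scenarios; the change-signal matrix below is hypothetical and interaction terms are lumped into the residual:

        import numpy as np

        # Hypothetical change signals (%) for 4 climate models (rows) x 2 RCPs (columns)
        change = np.array([[ 2.1,  3.4],
                           [-1.0,  0.5],
                           [ 4.2,  6.1],
                           [ 0.8,  2.0]])

        grand = change.mean()
        ss_total = ((change - grand) ** 2).sum()
        ss_gcm = change.shape[1] * ((change.mean(axis=1) - grand) ** 2).sum()  # climate-model effect
        ss_rcp = change.shape[0] * ((change.mean(axis=0) - grand) ** 2).sum()  # scenario effect
        ss_resid = ss_total - ss_gcm - ss_rcp                                  # interaction / residual

        for name, ss in [("GCM", ss_gcm), ("RCP", ss_rcp), ("residual", ss_resid)]:
            print(f"{name}: {100 * ss / ss_total:.1f}% of total variance")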

  15. Applying an animal model to quantify the uncertainties of an image-based 4D-CT algorithm

    International Nuclear Information System (INIS)

    Pierce, Greg; Battista, Jerry; Wang, Kevin; Lee, Ting-Yim

    2012-01-01

    The purpose of this paper is to use an animal model to quantify the spatial displacement uncertainties and test the fundamental assumptions of an image-based 4D-CT algorithm in vivo. Six female Landrace cross pigs were ventilated and imaged using a 64-slice CT scanner (GE Healthcare) operating in axial cine mode. The breathing amplitude pattern of the pigs was varied by periodically crimping the ventilator gas return tube during the image acquisition. The image data were used to determine the displacement uncertainties that result from matching CT images at the same respiratory phase using normalized cross correlation (NCC) as the matching criterion. Additionally, the ability to match the respiratory phase of a 4.0 cm subvolume of the thorax to a reference subvolume using only a single overlapping 2D slice from the two subvolumes was tested by varying the location of the overlapping matching image within the subvolume and examining the effect this had on the displacement relative to the reference volume. The displacement uncertainty resulting from matching two respiratory images using NCC ranged from 0.54 ± 0.10 mm per match to 0.32 ± 0.16 mm per match in the lung of the animal. The uncertainty was found to propagate in quadrature, increasing with the number of NCC matches performed. In comparison, the minimum displacement achievable if two respiratory images were matched perfectly in phase ranged from 0.77 ± 0.06 to 0.93 ± 0.06 mm in the lung. The assumption that subvolumes from separate cine scans could be matched by matching a single overlapping 2D image between the two subvolumes was validated. An in vivo animal model was developed to test an image-based 4D-CT algorithm. The uncertainties associated with using NCC to match the respiratory phase of two images were quantified and the assumption that a 4.0 cm 3D subvolume can be matched in respiratory phase by matching a single 2D image from the 3D subvolume was validated. The work in this paper shows the image-based 4D
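
    A small sketch, under simplifying assumptions, of the two ingredients described above: selecting the best-matching respiratory phase with normalized cross correlation, and propagating a per-match displacement uncertainty in quadrature over chained matches. The images are random arrays and the 0.4 mm per-match value is illustrative, not the paper's measurement.

```python
import numpy as np

rng = np.random.default_rng(0)

def ncc(a, b):
    """Normalized cross correlation between two equal-shape images."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Toy "cine" stack: find the phase whose slice best matches a reference slice.
reference = rng.normal(size=(64, 64))
stack = [reference + rng.normal(scale=s, size=(64, 64)) for s in (0.2, 0.8, 1.5)]
best_phase = int(np.argmax([ncc(reference, img) for img in stack]))
print("best-matching phase index:", best_phase)

# Quadrature propagation: if each NCC match carries an independent displacement
# uncertainty sigma, chaining n matches gives sigma * sqrt(n).
sigma_per_match_mm = 0.4          # illustrative value, not from the paper
for n in range(1, 5):
    print(f"{n} chained matches -> {sigma_per_match_mm * np.sqrt(n):.2f} mm")
```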

  16. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large as or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4 km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, mean that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  17. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data
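
    A toy Monte Carlo sketch of how uncertain inputs propagate into a critical load (CL) and exceedance (EX) distribution; the two input distributions, their parameters and the fixed deposition value are hypothetical placeholders, not PROFILE inputs or data from the Swedish/Czech study areas.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical uncertain inputs to a simplified critical-load expression
# (illustrative only): base-cation weathering and acceptable acid leaching.
weathering = rng.lognormal(mean=np.log(0.8), sigma=0.3, size=n)    # keq/ha/yr
acceptable_leaching = rng.normal(loc=0.3, scale=0.1, size=n)       # keq/ha/yr

critical_load = weathering + acceptable_leaching
deposition = 1.2                                                    # keq/ha/yr, fixed here

exceedance = deposition - critical_load
print(f"best-estimate CL          : {np.median(critical_load):.2f} keq/ha/yr")
print(f"5th-95th percentile CL    : {np.percentile(critical_load, [5, 95])}")
print(f"probability of exceedance : {np.mean(exceedance > 0):.2f}")
```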

  18. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled "Emergency Core Cooling System; Revisions to Acceptance Criteria." The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  19. Quantifying Uncertainties in Mass-Dimensional Relationships Through a Comparison Between CloudSat and SPartICus Reflectivity Factors

    Science.gov (United States)

    Mascio, J.; Mace, G. G.

    2015-12-01

    CloudSat and CALIPSO, two of the satellites in the A-Train constellation, use algorithms, such as the T-matrix method, to calculate the scattering properties of small cloud particles. Ice clouds (i.e. cirrus) cause problems with these cloud property retrieval algorithms because of their variability in ice mass as a function of particle size. Assumptions regarding the microphysical properties, such as mass-dimensional (m-D) relationships, are often necessary in retrieval algorithms for simplification, but these assumptions create uncertainties of their own. Therefore, ice cloud property retrieval uncertainties can be substantial and are often not well known. To investigate these uncertainties, reflectivity factors measured by CloudSat are compared to those calculated from particle size distributions (PSDs) to which different m-D relationships are applied. These PSDs are from data collected in situ during three flights of the Small Particles in Cirrus (SPartICus) campaign. We find that no specific habit emerges as preferred and instead we conclude that the microphysical characteristics of ice crystal populations tend to be distributed over a continuum and, therefore, cannot be categorized easily. To quantify the uncertainties in the mass-dimensional relationships, an optimal estimation inversion was run to retrieve the m-D relationship per SPartICus flight, as well as to calculate uncertainties of the m-D power law.

  20. Quantifying uncertainty in Gulf of Mexico forecasts stemming from uncertain initial conditions

    KAUST Repository

    Iskandarani, Mohamed

    2016-06-09

    Polynomial Chaos (PC) methods are used to quantify the impacts of initial conditions uncertainties on oceanic forecasts of the Gulf of Mexico circulation. Empirical Orthogonal Functions are used as initial conditions perturbations with their modal amplitudes considered as uniformly distributed uncertain random variables. These perturbations impact primarily the Loop Current system and several frontal eddies located in its vicinity. A small ensemble is used to sample the space of the modal amplitudes and to construct a surrogate for the evolution of the model predictions via a nonintrusive Galerkin projection. The analysis of the surrogate yields verification measures for the surrogate's reliability and statistical information for the model output. A variance analysis indicates that the sea surface height predictability in the vicinity of the Loop Current is limited to about 20 days. © 2016. American Geophysical Union. All Rights Reserved.
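
    A one-dimensional sketch of a non-intrusive (pseudo-spectral) Polynomial Chaos projection: a cheap stand-in function replaces the ocean model, a single uniformly distributed amplitude replaces the EOF modal amplitudes, and the Legendre coefficients give the surrogate's mean and variance. Everything here is illustrative and is not the Gulf of Mexico configuration.

```python
import numpy as np
from numpy.polynomial import legendre

def model(xi):
    """Cheap stand-in for an expensive model output as a function of a single
    uniformly distributed perturbation amplitude xi in [-1, 1] (illustrative)."""
    return np.sin(1.5 * xi) + 0.3 * xi ** 2

order = 5
nodes, weights = legendre.leggauss(order + 1)

# Non-intrusive (pseudo-spectral) projection onto Legendre polynomials P_k.
coeffs = []
for k in range(order + 1):
    Pk = legendre.Legendre.basis(k)
    norm = 2.0 / (2 * k + 1)                      # integral of P_k^2 over [-1, 1]
    coeffs.append(np.sum(weights * model(nodes) * Pk(nodes)) / norm)

# For a uniform input, mean and variance follow directly from the coefficients:
# E[P_k^2] = 1 / (2k + 1) under the uniform density on [-1, 1].
mean = coeffs[0]
var = sum(c ** 2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)
print(f"surrogate mean {mean:.4f}, variance {var:.4f}")
```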

  1. Quantifying and managing uncertainty in operational modal analysis

    Science.gov (United States)

    Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.

    2018-03-01

    Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state of the art, which is mostly concerned with identification algorithms, this work obtains closed-form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explain the effect of test configuration. Collectively referred to as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data; and are applicable under non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.

  2. Towards quantifying uncertainty in predictions of Amazon 'dieback'.

    Science.gov (United States)

    Huntingford, Chris; Fisher, Rosie A; Mercado, Lina; Booth, Ben B B; Sitch, Stephen; Harris, Phil P; Cox, Peter M; Jones, Chris D; Betts, Richard A; Malhi, Yadvinder; Harris, Glen R; Collins, Mat; Moorcroft, Paul

    2008-05-27

    Simulations with the Hadley Centre general circulation model (HadCM3), including a carbon cycle model and forced by a 'business-as-usual' emissions scenario, predict a rapid loss of Amazonian rainforest from the middle of this century onwards. The robustness of this projection to both uncertainty in physical climate drivers and the formulation of the land surface scheme is investigated. We analyse how the modelled vegetation cover in Amazonia responds to (i) uncertainty in the parameters specified in the atmosphere component of HadCM3 and their associated influence on predicted surface climate. We then enhance the land surface description and (ii) implement a multilayer canopy light interception model and compare with the simple 'big-leaf' approach used in the original simulations. Finally, (iii) we investigate the effect of changing the method of simulating vegetation dynamics from an area-based model (TRIFFID) to a more complex size- and age-structured approximation of an individual-based model (ecosystem demography). We find that the loss of Amazonian rainforest is robust across the climate uncertainty explored by perturbed physics simulations covering a wide range of global climate sensitivity. The introduction of the refined light interception model leads to an increase in simulated gross plant carbon uptake for the present day, but, with altered respiration, the net effect is a decrease in net primary productivity. However, this does not significantly affect the carbon loss from vegetation and soil as a consequence of future simulated depletion in soil moisture; the Amazon forest is still lost. The introduction of the more sophisticated dynamic vegetation model reduces but does not halt the rate of forest dieback. The potential for human-induced climate change to trigger the loss of Amazon rainforest appears robust within the context of the uncertainties explored in this paper. Some further uncertainties should be explored, particularly with respect to the

  3. Different methodologies to quantify uncertainties of air emissions.

    Science.gov (United States)

    Romano, Daniela; Bernetti, Antonella; De Lauretis, Riccardo

    2004-10-01

    Characterization of the uncertainty associated with air emission estimates is of critical importance especially in the compilation of air emission inventories. In this paper, two different theories are discussed and applied to evaluate air emissions uncertainty. In addition to numerical analysis, which is also recommended in the framework of the United Nations Convention on Climate Change guidelines with reference to Monte Carlo and Bootstrap simulation models, fuzzy analysis is also proposed. The methodologies are discussed and applied to an Italian example case study. Air concentration values are measured from two electric power plants: a coal plant consisting of two boilers, and a fuel oil plant of four boilers; the pollutants considered are sulphur dioxide (SO(2)), nitrogen oxides (NO(X)), carbon monoxide (CO) and particulate matter (PM). Monte Carlo, Bootstrap and fuzzy methods have been applied to estimate the uncertainty of these data. Regarding Monte Carlo, the most accurate results apply to Gaussian distributions; a good approximation is also observed for other distributions with almost regular features, either positively asymmetrical or negatively asymmetrical. Bootstrap, on the other hand, gives a good uncertainty estimation for irregular and asymmetrical distributions. The logic of fuzzy analysis, where data are represented as vague and indefinite in opposition to the traditional conception of neatness, certain classification and exactness of the data, follows a different description. In addition to randomness (stochastic variability) only, fuzzy theory deals with imprecision (vagueness) of data. The fuzzy variance of the data set was calculated; the results cannot be directly compared with empirical data but the overall performance of the theory is analysed. Fuzzy theory may appear more suitable for qualitative reasoning than for a quantitative estimation of uncertainty, but it suits well when little information and few measurements are available and when
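
    A compact sketch of the bootstrap part of such an analysis: resampling a small set of stack-concentration measurements with replacement to obtain a percentile confidence interval for the mean. The ten SO2 values below are invented for illustration and are not the Italian plant data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stack-concentration measurements (mg/m3) from one boiler;
# illustrative values, not the measurements used in the paper.
so2 = np.array([410., 395., 430., 388., 402., 415., 441., 399., 408., 420.])

n_boot = 5000
boot_means = np.array([rng.choice(so2, size=so2.size, replace=True).mean()
                       for _ in range(n_boot)])

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean SO2 concentration: {so2.mean():.1f} mg/m3")
print(f"bootstrap 95% interval: [{lo:.1f}, {hi:.1f}] mg/m3")
```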

  4. Quantifying uncertainty in NDSHA estimates due to earthquake catalogue

    Science.gov (United States)

    Magrin, Andrea; Peresan, Antonella; Vaccari, Franco; Panza, Giuliano

    2014-05-01

    The procedure for the neo-deterministic seismic zoning, NDSHA, is based on the calculation of synthetic seismograms by the modal summation technique. This approach makes use of information about the space distribution of large magnitude earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g., morphostructural features and ongoing deformation processes identified by earth observations). Hence the method does not make use of attenuation models (GMPE), which may be unable to account for the complexity of the product between seismic source tensor and medium Green function and are often poorly constrained by the available observations. NDSHA defines the hazard from the envelope of the values of ground motion parameters determined considering a wide set of scenario earthquakes; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated to each site. In NDSHA uncertainties are not statistically treated as in PSHA, where aleatory uncertainty is traditionally handled with probability density functions (e.g., for magnitude and distance random variables) and epistemic uncertainty is considered by applying logic trees that allow the use of alternative models and alternative parameter values of each model; rather, the treatment of uncertainties is performed by sensitivity analyses for key modelling parameters. Fixing the uncertainty related to a particular input parameter is an important component of the procedure. The input parameters must account for the uncertainty in the prediction of fault radiation and in the use of Green functions for a given medium. A key parameter is the magnitude of the sources used in the simulation, which is based on catalogue information, seismogenic zones and seismogenic nodes. Because the largest part of the existing catalogues is based on macroseismic intensity, a rough estimate

  5. The Uncertainty Multiplier and Business Cycles

    OpenAIRE

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  6. Quantifying Surface Energy Flux Estimation Uncertainty Using Land Surface Temperature Observations

    Science.gov (United States)

    French, A. N.; Hunsaker, D.; Thorp, K.; Bronson, K. F.

    2015-12-01

    Remote sensing with thermal infrared is widely recognized as a good way to estimate surface heat fluxes, map crop water use, and detect water-stressed vegetation. When combined with net radiation and soil heat flux data, observations of sensible heat fluxes derived from surface temperatures (LST) are indicative of instantaneous evapotranspiration (ET). There are, however, substantial reasons LST data may not provide the best way to estimate ET. For example, it is well known that observations and models of LST, air temperature, or estimates of transport resistances may be so inaccurate that physically based models nevertheless yield non-meaningful results. Furthermore, using visible and near infrared remote sensing observations collected at the same time as LST often yields physically plausible results because they are constrained by less dynamic surface conditions such as green fractional cover. Although sensitivity studies exist that help identify likely sources of error and uncertainty, ET studies typically do not provide a way to assess the relative importance of modeling ET with and without LST inputs. To better quantify model benefits and degradations due to LST observational inaccuracies, a Bayesian uncertainty study was undertaken using data collected in remote sensing experiments at Maricopa, Arizona. Visible, near infrared and thermal infrared data were obtained from an airborne platform. The prior probability distribution of ET estimates was modeled using fractional cover, local weather data and a Penman-Monteith model, while the likelihood of LST data was modeled from a two-source energy balance model. Thus the posterior probabilities of ET represented the value added by using LST data. Results from an ET study over cotton grown in 2014 and 2015 showed significantly reduced ET confidence intervals when LST data were incorporated.
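
    A grid-based sketch of the Bayesian logic described above: a prior ET estimate (standing in for the fractional-cover/Penman-Monteith model) is updated with an LST-based likelihood (standing in for the two-source energy balance), and the posterior interval narrows. The means and standard deviations are placeholders, not values from the Maricopa experiments.

```python
import numpy as np

# Grid-based Bayesian update: prior ET from fractional cover + Penman-Monteith,
# likelihood from an LST-based energy-balance estimate. All numbers illustrative.
et = np.linspace(0.0, 12.0, 1201)                       # candidate daily ET, mm/day

prior      = np.exp(-0.5 * ((et - 6.0) / 1.5) ** 2)     # prior: 6.0 +/- 1.5 mm/day
likelihood = np.exp(-0.5 * ((et - 4.8) / 1.0) ** 2)     # LST-based: 4.8 +/- 1.0 mm/day

posterior = prior * likelihood
posterior /= posterior.sum()                            # normalize on the uniform grid

mean = np.sum(et * posterior)
std = np.sqrt(np.sum((et - mean) ** 2 * posterior))
print(f"posterior ET: {mean:.2f} +/- {std:.2f} mm/day")  # narrower than the prior
```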

  7. Potential Carbon Transport: Linking Soil Aggregate Stability and Sediment Enrichment for Updating the Soil Active Layer within Intensely Managed Landscapes

    Science.gov (United States)

    Wacha, K.; Papanicolaou, T.; Abban, B. K.; Wilson, C. G.

    2014-12-01

    Currently, many biogeochemical models lack the mechanistic capacity to accurately simulate soil organic carbon (SOC) dynamics, especially within intensely managed landscapes (IMLs) such as those found in the U.S. Midwest. These modeling limitations originate from not accounting for downslope connectivity of flowpathways initiated and governed by landscape processes and hydrologic forcing, which induce dynamic updates to the soil active layer (generally the top 20-30 cm of soil) with various sediment size fractions and aggregates being transported and deposited along the downslope. These hydro-geomorphic processes, often amplified in IMLs by tillage events and seasonal canopy, can greatly impact biogeochemical cycles (e.g., enhanced mineralization during aggregate breakdown) and, in turn, have huge implications/uncertainty when determining SOC budgets. In this study, some of these limitations were addressed through a new concept, Potential Carbon Transport (PCT), a term which quantifies the maximum amount of material available for transport at various positions of the landscape, which was used to further refine a coupled modeling framework focused on SOC redistribution through downslope/lateral connectivity. Specifically, the size fractions slaked from large and small aggregates during raindrop-induced aggregate stability tests were used in conjunction with rainfall-simulated sediment enrichment ratio (ER) experiments to quantify the PCT under various management practices, soil types and landscape positions. Field samples used in determining aggregate stability and the ER experiments were collected/performed within the historic Clear Creek Watershed, home of the IML Critical Zone Observatory, located in Southeastern Iowa.

  8. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  9. Method for quantifying the uncertainty with the extraction of the raw data of a gamma ray spectrum by deconvolution software

    International Nuclear Information System (INIS)

    Vigineix, Thomas; Guillot, Nicolas; Saurel, Nicolas

    2013-06-01

    Gamma ray spectrometry is a passive non-destructive assay most commonly used to identify and quantify the radionuclides present in complex huge objects such as nuclear waste packages. The treatment of spectra from the measurement of nuclear waste is done in two steps: the first step is to extract the raw data from the spectra (energies and net photoelectric absorption peak areas) and the second step is to determine the detection efficiency of the measuring scene. Commercial software packages use different methods to extract the raw data from the spectrum, but none are optimal in the treatment of spectra containing actinides. Spectra must be handled individually and require settings and substantial feedback from the operator, which prevents automatic processing of spectra and increases the risk of human error. In this context the Nuclear Measurement and Valuation Laboratory (LMNE) in the Atomic Energy Commission Valduc (CEA Valduc) has developed a new methodology for quantifying the uncertainty associated with the extraction of the raw data from a spectrum. This methodology was applied with raw data and commercial software that needs configuration by the operator (GENIE2000, Interwinner...). This robust and fully automated uncertainty-calculation methodology is applied to the entire processing chain of the software. The methodology ensures, for all peaks processed by the deconvolution software, an extraction of peak energies to within 2 channels and an extraction of net areas with an uncertainty of less than 5 percent. The methodology was tested experimentally with actinide spectra. (authors)

  10. Quantifying uncertainty in coral Sr/Ca-based SST estimates from Orbicella faveolata: A basis for multi-colony SST reconstructions

    Science.gov (United States)

    Richey, J. N.; Flannery, J. A.; Toth, L. T.; Kuffner, I. B.; Poore, R. Z.

    2017-12-01

    The Sr/Ca in massive corals can be used as a proxy for sea surface temperature (SST) in shallow tropical to sub-tropical regions; however, the relationship between Sr/Ca and SST varies throughout the ocean, between different species of coral, and often between different colonies of the same species. We aimed to quantify the uncertainty associated with the Sr/Ca-SST proxy due to sample handling (e.g., micro-drilling or analytical error), vital effects (e.g., among-colony differences in coral growth), and local-scale variability in microhabitat. We examine the intra- and inter-colony reproducibility of Sr/Ca records extracted from five modern Orbicella faveolata colonies growing in the Dry Tortugas, Florida, USA. The average intra-colony absolute difference (AD) in Sr/Ca of the five colonies during an overlapping interval (1997-2008) was 0.055 ± 0.044 mmol mol-1 (0.96 ºC) and the average inter-colony Sr/Ca AD was 0.039 ± 0.01 mmol mol-1 (0.51 ºC). All available Sr/Ca-SST data pairs from 1997-2008 were combined and regressed against the HadISST1 gridded SST data set (24 ºN and 82 ºW) to produce a calibration equation that could be applied to O. faveolata specimens from throughout the Gulf of Mexico/Caribbean/Atlantic region after accounting for the potential uncertainties in Sr/Ca-derived SSTs. We quantified a combined error term for O. faveolata using the root-sum-square (RMS) of the analytical, intra-, and inter-colony uncertainties and suggest that an overall uncertainty of 0.046 mmol mol-1 (0.81 ºC, 1σ) should be used to interpret Sr/Ca records from O. faveolata specimens of unknown age or origin to reconstruct SST. We also explored how uncertainty is affected by the number of corals used in a reconstruction by iteratively calculating the RMS error for composite coral time-series using two, three, four, and five overlapping coral colonies. Our results indicate that the maximum RMS error at the 95% confidence interval on mean annual SST estimates is 1.4 ºC.

  11. Quantifying geomorphic change and characterizing uncertainty in repeat aerial lidar over an enormous area: Blue Earth County, MN

    Science.gov (United States)

    Schaffrath, K. R.; Belmont, P.; Wheaton, J. M.

    2013-12-01

    High-resolution topography data (lidar) are being collected over increasingly larger geographic areas. These data contain an immense amount of information regarding the topography of bare-earth and vegetated surfaces. Repeat lidar data (collected at multiple times for the same location) enables extraction of an unprecedented level of detailed information about landscape form and function and provides an opportunity to quantify volumetric change and identify hot spots of erosion and deposition. However, significant technological and scientific challenges remain in the analysis of repeat lidar data over enormous areas (>1000 square kilometers), not the least of which involves robust quantification of uncertainty. Excessive sedimentation has been documented in the Minnesota River and many reaches of the mainstem and tributaries are listed as impaired for turbidity and eutrophication under the Clean Water Act of 1972. The Blue Earth River and its tributaries (Greater Blue Earth basin) have been identified as one of the main sources of sediment to the Minnesota River. Much of the Greater Blue Earth basin is located in Blue Earth County (1,982 square kilometers) where airborne lidar data were collected in 2005 and 2012, with average bare-earth point densities of 1 point per square meter and closer to 2 points per square meter, respectively. One of the largest floods on record (100-year recurrence interval) occurred in September 2010. A sediment budget for the Greater Blue Earth basin is being developed to inform strategies to reduce current sediment loads and better predict how the basin may respond to changing climate and management practices. Here we evaluate the geomorphic changes that occurred between 2005 and 2012 to identify hotspots of erosion and deposition, and to quantify some of the terms in the sediment budget. To make meaningful interpretations of the differences between the 2005 and 2012 lidar digital elevation models (DEMs), total uncertainty must be

  12. Quantifying uncertainty in the measurement of arsenic in suspended particulate matter by Atomic Absorption Spectrometry with hydride generator

    Directory of Open Access Journals (Sweden)

    Ahuja Tarushee

    2011-04-01

    Arsenic is a toxic element that creates several problems in human beings, especially when inhaled through air. Accurate and precise measurement of arsenic in suspended particulate matter (SPM) is therefore of prime importance, as it gives information about the level of toxicity in the environment so that preventive measures can be taken in the affected areas. Quality assurance is equally important in the measurement of arsenic in SPM samples before making any decision. The quality and reliability of the data for such volatile elements depend upon the measurement of the uncertainty of each step involved, from sampling to analysis. Analytical results with a quantified uncertainty give a measure of the confidence level of the concerned laboratory. The main objective of this study was therefore to determine the arsenic content in SPM samples with an uncertainty budget and to find the various potential sources of uncertainty that affect the results. Keeping these facts in view, we selected seven diverse sites in Delhi (the National Capital of India) for quantification of the arsenic content in SPM samples with an uncertainty budget, from sampling by HVS to analysis by Atomic Absorption Spectrometry with Hydride Generator (AAS-HG). Many steps are involved in the measurement of arsenic in SPM samples, from sampling to the final result, and we have considered the various potential sources of uncertainty. The calculation of uncertainty is based on the ISO/IEC 17025:2005 document and the EURACHEM guideline. It was found that the final results depend mainly on the uncertainty due to repeatability, the final volume prepared for analysis, the weighing balance and sampling by HVS. After analysis of the data from the seven diverse sites in Delhi, it was concluded that during the period from 31st Jan. 2008 to 7th Feb. 2008 the arsenic concentration varied from 1.44 ± 0.25 to 5.58 ± 0.55 ng/m3 at the 95% confidence level (k = 2).
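
    A minimal sketch of the GUM/EURACHEM-style budget described above: relative standard uncertainties from the dominant sources are combined in quadrature and expanded with a coverage factor k = 2. The component values and the arsenic result are placeholders, not the Delhi measurements.

```python
import numpy as np

# Illustrative uncertainty budget for an arsenic result (ng/m3); the component
# values below are placeholders, not the paper's budget.
result = 3.2                      # measured As concentration, ng/m3

u_rel = {
    "repeatability":    0.04,
    "final volume":     0.02,
    "weighing balance": 0.01,
    "HVS sampling":     0.05,
}

u_c_rel = np.sqrt(sum(u ** 2 for u in u_rel.values()))   # combined relative std. unc.
U = 2 * u_c_rel * result                                  # expanded uncertainty, k = 2

print(f"combined relative standard uncertainty: {100 * u_c_rel:.1f}%")
print(f"result: {result:.2f} +/- {U:.2f} ng/m3 (k = 2, ~95% confidence)")
```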

  13. Ways forward in quantifying data uncertainty in geological databases

    Science.gov (United States)

    Kint, Lars; Chademenos, Vasileios; De Mol, Robin; Kapel, Michel; Lagring, Ruth; Stafleu, Jan; van Heteren, Sytze; Van Lancker, Vera

    2017-04-01

    Issues of compatibility of geological data resulting from the merging of many different data sources and time periods may jeopardize harmonization of data products. Important progress has been made due to increasing data standardization, e.g., at a European scale through the SeaDataNet and Geo-Seas data management infrastructures. Common geological data standards are unambiguously defined, avoiding semantic overlap in geological data and associated metadata. Quality flagging is also applied increasingly, though ways of further propagating this information into data products are still in their infancy. For the Belgian and southern Netherlands part of the North Sea, databases are now rigorously re-analyzed in view of quantifying quality flags in terms of uncertainty to be propagated through a 3D voxel model of the subsurface (https://odnature.naturalsciences.be/tiles/). An approach is worked out to consistently account for differences in positioning, sampling gear, analysis procedures and vintage. The flag scaling is used in the interpolation process of geological data, but will also be used when visualizing the suitability of geological resources in a decision support system. Expert knowledge is systematically revisited so as to avoid totally inappropriate use of the flag scaling process. The quality flagging is also important when communicating results to end-users. Therefore, an open data policy in combination with several processing tools will be at the heart of a new Belgian geological data portal as a platform for knowledge building (KB) and knowledge management (KM) serving the marine geoscience community, the policy community and the public at large.

  14. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations have an influential role in global economic policies for developed as well as emerging countries. I investigate the role of international oil prices, disintegrated into structural (i) oil supply shocks, (ii) aggregate demand shocks and (iii) oil market specific demand shocks based on the work of Kilian (2009), using a structural VAR framework, on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime switching framework with the disintegrated structural oil shocks. Our results highlight that Indian, Spanish and Japanese economic policy uncertainty responds to global oil price shocks; however, aggregate demand shocks fail to induce any change. Oil specific demand shocks are significant only for China and India in the high volatility state.

  15. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuhn, William L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rector, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Heredia-Langner, Alejandro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  16. Harnessing the theoretical foundations of the exponential and beta-Poisson dose-response models to quantify parameter uncertainty using Markov Chain Monte Carlo.

    Science.gov (United States)

    Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward

    2013-09-01

    Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent in computed risks, arising because the dose-response model parameters are estimated using limited epidemiological data, is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decision-makers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
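
    For reference, the two dose-response forms discussed above in their usual parameterizations: the exponential model and the conventional approximation to the beta-Poisson model. The parameter values in the example call are illustrative only, not posterior estimates from the paper.

```python
import numpy as np

def p_exponential(dose, r):
    """Exponential dose-response: each organism independently infects with probability r."""
    return 1.0 - np.exp(-r * dose)

def p_beta_poisson_approx(dose, alpha, beta):
    """Conventional approximation to the beta-Poisson dose-response model."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.array([1.0, 10.0, 100.0, 1000.0])
# Illustrative parameter values only.
print(p_exponential(doses, r=0.005))
print(p_beta_poisson_approx(doses, alpha=0.25, beta=40.0))
```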

  17. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  18. Uncertainties in fission-product decay-heat calculations

    Energy Technology Data Exchange (ETDEWEB)

    Oyamatsu, K.; Ohta, H.; Miyazono, T.; Tasaka, K. [Nagoya Univ. (Japan)

    1997-03-01

    The present precision of the aggregate decay heat calculations is studied quantitatively for 50 fissioning systems. In this evaluation, nuclear data and their uncertainty data are taken from the ENDF/B-VI nuclear data library, and those which are not available in this library are supplemented by a theoretical consideration. An approximate method is proposed to simplify the evaluation of the uncertainties in the aggregate decay heat calculations so that we can easily point out nuclei which cause large uncertainties in the calculated decay heat values. In this paper, we attempt to clarify the justification of the approximation, which was not very clear at the early stage of the study. We find that the aggregate decay heat uncertainties for minor actinides such as Am and Cm isotopes are 3-5 times as large as those for {sup 235}U and {sup 239}Pu. The recommended values by the Atomic Energy Society of Japan (AESJ) were given for 3 major fissioning systems, {sup 235}U(t), {sup 239}Pu(t) and {sup 238}U(f). The present results are consistent with the AESJ values for these systems although the two evaluations used different nuclear data libraries and approximations. Therefore, the present results can also be considered to supplement the uncertainty values for the remaining 17 fissioning systems in JNDC2, which were not treated in the AESJ evaluation. Furthermore, we attempt to list nuclear data which cause large uncertainties in decay heat calculations for the future revision of decay and yield data libraries. (author)

  19. Delayed neutron spectra and their uncertainties in fission product summation calculations

    Energy Technology Data Exchange (ETDEWEB)

    Miyazono, T.; Sagisaka, M.; Ohta, H.; Oyamatsu, K.; Tamaki, M. [Nagoya Univ. (Japan)

    1997-03-01

    Uncertainties in delayed neutron summation calculations are evaluated with ENDF/B-VI for 50 fissioning systems. As the first step, uncertainty calculations are performed for the aggregate delayed neutron activity with the same approximate method as proposed previously for the decay heat uncertainty analyses. Typical uncertainty values are about 6-14% for {sup 238}U(F) and about 13-23% for {sup 243}Am(F) at cooling times 0.1-100 (s). These values are typically 2-3 times larger than those in decay heat at the same cooling times. For aggregate delayed neutron spectra, the uncertainties would be larger than those for the delayed neutron activity because much more information about the nuclear structure is still necessary. (author)

  20. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-01-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to "address uncertainties and increase confidence in the projected, full-scale mixing performance and operations" in the Waste Treatment and Immobilization Plant (WTP).

  1. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. According to the idea of verification and validation, the space of the parameters is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolating method. Examples are presented to demonstrate and qualify the ideas, aimed at building a framework to quantify the uncertainties of M and S. (authors)

  2. Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit

    International Nuclear Information System (INIS)

    Tarantola, S.; Saltelli, A.; Draper, D.

    1999-01-01

    In the present study a process of model audit is addressed on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision making purposes

  3. Quantifying uncertainties in wind energy assessment

    Science.gov (United States)

    Patlakas, Platon; Galanis, George; Kallos, George

    2015-04-01

    The constant rise of wind energy production and its subsequent penetration of global energy markets during the last decades has resulted in the selection of new sites with various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution may support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or on periods below the energy production threshold, are necessary for a better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values of these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, the turbine failures, the time needed for repairing as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values will be discussed based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production based on a weather modeling system/hindcast/10-year dataset. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the extreme events' duration, combined with their intensity as well as the event frequency. The obtained results prove that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
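
    A short sketch of the Annual Maxima approach mentioned above: fit a Generalized Extreme Value distribution to yearly maximum wind speeds and read off return levels as upper quantiles. The synthetic 30-year record stands in for the hindcast dataset used in the study and relies on scipy.stats.

```python
from scipy import stats

# Synthetic annual maximum wind speeds (m/s) standing in for a multi-year record.
annual_maxima = stats.gumbel_r.rvs(loc=22.0, scale=2.5, size=30, random_state=3)

# Annual Maxima method: fit a Generalized Extreme Value distribution.
shape, loc, scale = stats.genextreme.fit(annual_maxima)

# Return level for a T-year return period = the (1 - 1/T) quantile of the fit.
for T in (10, 50, 100):
    level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:4d}-year return level: {level:.1f} m/s")
```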

  4. Forecast Accuracy Uncertainty and Momentum

    OpenAIRE

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...

  5. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)

  6. A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders

    Science.gov (United States)

    Malik, Mashkoor; Lurton, Xavier; Mayer, Larry

    2018-06-01

    Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally and hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation and full compensation) for seafloor insonified area, transmission losses and random fluctuations were modeled to estimate their uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, at least 20 samples are recommended to be used while computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
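
    A quick simulation of the random-fluctuation term quoted above: for Rayleigh-distributed echo amplitudes the intensity is exponentially distributed, a single sample fluctuates by roughly 5.5 dB, and averaging about 20 samples brings the level fluctuation below 1 dB. The simulation is only a sanity check of those figures, not MBES data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Backscatter intensity for a Rayleigh-distributed amplitude is exponentially
# distributed; simulate the dB-level fluctuation before and after averaging.
n_trials = 100_000
for n_samples in (1, 5, 10, 20, 40):
    intensity = rng.exponential(scale=1.0, size=(n_trials, n_samples)).mean(axis=1)
    level_db = 10.0 * np.log10(intensity)
    print(f"{n_samples:3d} samples averaged -> std = {level_db.std():.2f} dB")
```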

  7. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  8. Mechanisms and rates of bacterial colonization of sinking aggregates

    DEFF Research Database (Denmark)

    Kiørboe, Thomas; Grossart, H.P.; Ploug, H.

    2002-01-01

    Quantifying the rate at which bacteria colonize aggregates is a key to understanding microbial turnover of aggregates. We used encounter models based on random walk and advection-diffusion considerations to predict colonization rates from the bacteria's motility patterns (swimming speed, tumbling...

  9. Aggregate Measures of Watershed Health from Reconstructed ...

    Science.gov (United States)

    Risk-based indices such as reliability, resilience, and vulnerability (R-R-V) have the potential to serve as watershed health assessment tools. Recent research has demonstrated the applicability of such indices for water quality (WQ) constituents such as total suspended solids and nutrients on an individual basis. However, the calculations can become tedious when time-series data for several WQ constituents have to be evaluated individually. Also, comparisons between locations with different sets of constituent data can prove difficult. In this study, data reconstruction using a relevance vector machine algorithm was combined with dimensionality reduction via variational Bayesian noisy principal component analysis to reconstruct and condense sparse multidimensional WQ data sets into a single time series. The methodology allows incorporation of uncertainty in both the reconstruction and dimensionality-reduction steps. The R-R-V values were calculated using the aggregate time series at multiple locations within two Indiana watersheds, which serve as motivating examples. Results showed that uncertainty present in the reconstructed WQ data set propagates to the aggregate time series and subsequently to the aggregate R-R-V values as well. Locations with different WQ constituents and different standards for impairment were successfully combined to provide aggregate measures of R-R-V values. Comparisons with individual constituent R-R-V values showed that v
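
    A minimal sketch of one common set of reliability-resilience-vulnerability definitions applied to a single aggregate time series; the threshold, the gamma-distributed synthetic series and the exact definitions (e.g. mean rather than maximum exceedance for vulnerability) are illustrative assumptions, not the study's formulation.

```python
import numpy as np

def r_r_v(series, threshold):
    """Reliability, resilience, vulnerability of a water-quality time series
    relative to a not-to-exceed threshold (one common set of definitions)."""
    fail = series > threshold
    reliability = 1.0 - fail.mean()

    # Resilience: probability that a failed step is followed by a compliant step.
    recoveries = np.sum(fail[:-1] & ~fail[1:])
    resilience = recoveries / max(fail[:-1].sum(), 1)

    # Vulnerability: mean exceedance magnitude during failure periods.
    vulnerability = float(np.mean(series[fail] - threshold)) if fail.any() else 0.0
    return reliability, resilience, vulnerability

# Hypothetical aggregate water-quality time series (arbitrary units).
rng = np.random.default_rng(11)
series = rng.gamma(shape=2.0, scale=15.0, size=365)
print(r_r_v(series, threshold=60.0))
```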

  10. Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.

    Science.gov (United States)

    Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin

    2016-11-01

    This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task: rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Aggregation and pH-temperature phase behavior for aggregates of an IgG2 antibody.

    Science.gov (United States)

    Sahin, Erinc; Weiss, William F; Kroetsch, Andrew M; King, Kevin R; Kessler, R Kendall; Das, Tapan K; Roberts, Christopher J

    2012-05-01

    Monomer unfolding and thermally accelerated aggregation kinetics to produce soluble oligomers or insoluble macroscopic aggregates were characterized as a function of pH for an IgG2 antibody using differential scanning calorimetry (DSC) and size-exclusion chromatography (SEC). Aggregate size was quantified via laser light scattering, and aggregate solubility via turbidity and visual inspection. Interestingly, nonnative oligomers were soluble at pH 5.5 above approximately 15°C, but converted reversibly to visible/insoluble particles at lower temperatures. Lower pH values yielded only soluble aggregates, whereas higher pH resulted in insoluble aggregates, regardless of the solution temperature. Unlike the growing body of literature that supports the three-endotherm model of IgG1 unfolding in DSC, the results here also illustrate limitations of that model for other monoclonal antibodies. Comparison of DSC with monomer loss (via SEC) from samples during thermal scanning indicates that the least conformationally stable domain is not the most aggregation prone, and that a number of the domains remain intact within the constituent monomers of the resulting aggregates. This highlights continued challenges with predicting a priori which domain(s) or thermal transition(s) is(are) most relevant for product stability with respect to aggregation. Copyright © 2012 Wiley Periodicals, Inc.

  12. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    Science.gov (United States)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
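    The root-sum-square combination described above can be illustrated with a short calculation. The component names and relative uncertainties below are hypothetical placeholders, not values from the report; the coverage factor k = 2 is a common convention, assumed here.

```python
import math

# Illustrative uncertainty components for a helium leak rate measurement,
# all expressed as relative (1-sigma) standard uncertainties. The names and
# values are hypothetical, not taken from the report.
components = {
    "calibration_standard": 0.030,
    "k_factor":             0.020,
    "resolution":           0.010,
    "repeatability":        0.025,
    "hysteresis":           0.008,
    "drift":                0.012,
}

# Root-sum-square combination of independent contributions
u_combined = math.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined   # expanded uncertainty with coverage factor k = 2

print(f"combined standard uncertainty: {u_combined:.3%}")
print(f"expanded uncertainty (k=2):    {U_expanded:.3%}")
```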

  13. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.

  14. Forecasting Uncertainty in Electricity Smart Meter Data by Boosting Additive Quantile Regression

    KAUST Repository

    Taieb, Souhaib Ben

    2016-03-02

    Smart electricity meters are currently deployed in millions of households to collect detailed individual electricity consumption data. Compared with traditional electricity data based on aggregated consumption, smart meter data are much more volatile and less predictable. There is a need within the energy industry for probabilistic forecasts of household electricity consumption to quantify the uncertainty of future electricity demand in order to undertake appropriate planning of generation and distribution. We propose to estimate an additive quantile regression model for a set of quantiles of the future distribution using a boosting procedure. By doing so, we can benefit from flexible and interpretable models, which include an automatic variable selection. We compare our approach with three benchmark methods on both aggregated and disaggregated scales using a smart meter data set collected from 3639 households in Ireland at 30-min intervals over a period of 1.5 years. The empirical results demonstrate that our approach based on quantile regression provides better forecast accuracy for disaggregated demand, while the traditional approach based on a normality assumption (possibly after an appropriate Box-Cox transformation) is a better approximation for aggregated demand. These results are particularly useful since more energy data will become available at the disaggregated level in the future.
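    As a rough illustration of quantile forecasting and the pinball (quantile) loss used to score it, the sketch below fits gradient boosting with a quantile loss to synthetic half-hourly demand. This is only a stand-in for the paper's boosted additive quantile regression, which uses different base learners; the data, features and quantile levels are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Toy stand-in for half-hourly household demand (kWh); values are synthetic.
rng = np.random.default_rng(1)
hour = rng.uniform(0, 24, 2000)
X = np.column_stack([hour, np.sin(2 * np.pi * hour / 24)])
y = 0.3 + 0.2 * np.sin(2 * np.pi * hour / 24) + rng.gamma(2.0, 0.1, size=hour.size)

quantiles = [0.1, 0.5, 0.9]
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, y)
          for q in quantiles}

def pinball_loss(y_true, y_pred, q):
    """Average pinball (quantile) loss used to score probabilistic forecasts."""
    diff = y_true - y_pred
    return np.mean(np.maximum(q * diff, (q - 1) * diff))

for q, model in models.items():
    print(q, round(pinball_loss(y, model.predict(X), q), 4))
```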

  15. Quantifying uncertainty in observational rainfall datasets

    Science.gov (United States)

    Lennard, Chris; Dosio, Alessandro; Nikulin, Grigory; Pinto, Izidine; Seid, Hussen

    2015-04-01

    The CO-ordinated Regional Downscaling Experiment (CORDEX) has to date seen the publication of at least ten journal papers that examine the African domain during 2012 and 2013. Five of these papers consider Africa generally (Nikulin et al. 2012, Kim et al. 2013, Hernandes-Dias et al. 2013, Laprise et al. 2013, Panitz et al. 2013) and five have regional foci: Tramblay et al. (2013) on Northern Africa, Mariotti et al. (2014) and Gbobaniyi et al. (2013) on West Africa, Endris et al. (2013) on East Africa and Kalagnoumou et al. (2013) on southern Africa. There are also a further three papers that the authors know to be under review. These papers all use observed rainfall and/or temperature data to evaluate/validate the regional model output and often proceed to assess projected changes in these variables due to climate change in the context of these observations. The most popular reference rainfall data used are the CRU, GPCP, GPCC, TRMM and UDEL datasets. However, as Kalagnoumou et al. (2013) point out, there are many other rainfall datasets available for consideration, for example, CMORPH, FEWS, TAMSAT & RIANNAA, TAMORA and the WATCH & WATCH-DEI data. They, with others (Nikulin et al. 2012, Sylla et al. 2012), show that the observed datasets can have a very wide spread at a particular space-time coordinate. As more ground, space and reanalysis-based rainfall products become available, all of which use different methods to produce precipitation data, the selection of reference data is becoming an important factor in model evaluation. A number of factors can contribute to uncertainty in the reliability and validity of the datasets, such as radiance conversion algorithms, the quantity and quality of available station data, interpolation techniques and blending methods used to combine satellite- and gauge-based products. However, to date no comprehensive study has been performed to evaluate the uncertainty in these observational datasets. We assess 18 gridded

  16. Quantified Uncertainties in Comparative Life Cycle Assessment : What Can Be Concluded?

    NARCIS (Netherlands)

    Mendoza Beltran, Angelica; Prado, Valentina; Font Vivanco, David; Henriksson, Patrik J.G.; Guinée, Jeroen B.; Heijungs, Reinout

    2018-01-01

    Interpretation of comparative Life Cycle Assessment (LCA) results can be challenging in the presence of uncertainty. To aid in interpreting such results under the goal of any comparative LCA, we aim to provide guidance to practitioners by gaining insights into uncertainty-statistics methods (USMs).

  17. Aggregation in environmental systems - Part 1: Seasonal tracer cycles quantify young water fractions, but not mean transit times, in spatially heterogeneous catchments

    Science.gov (United States)

    Kirchner, J. W.

    2016-01-01

    Environmental heterogeneity is ubiquitous, but environmental systems are often analyzed as if they were homogeneous instead, resulting in aggregation errors that are rarely explored and almost never quantified. Here I use simple benchmark tests to explore this general problem in one specific context: the use of seasonal cycles in chemical or isotopic tracers (such as Cl-, δ18O, or δ2H) to estimate timescales of storage in catchments. Timescales of catchment storage are typically quantified by the mean transit time, meaning the average time that elapses between parcels of water entering as precipitation and leaving again as streamflow. Longer mean transit times imply greater damping of seasonal tracer cycles. Thus, the amplitudes of tracer cycles in precipitation and streamflow are commonly used to calculate catchment mean transit times. Here I show that these calculations will typically be wrong by several hundred percent, when applied to catchments with realistic degrees of spatial heterogeneity. This aggregation bias arises from the strong nonlinearity in the relationship between tracer cycle amplitude and mean travel time. I propose an alternative storage metric, the young water fraction in streamflow, defined as the fraction of runoff with transit times of less than roughly 0.2 years. I show that this young water fraction (not to be confused with event-based "new water" in hydrograph separations) is accurately predicted by seasonal tracer cycles within a precision of a few percent, across the entire range of mean transit times from almost zero to almost infinity. Importantly, this relationship is also virtually free from aggregation error. That is, seasonal tracer cycles also accurately predict the young water fraction in runoff from highly heterogeneous mixtures of subcatchments with strongly contrasting transit-time distributions. Thus, although tracer cycle amplitudes yield biased and unreliable estimates of catchment mean travel times in heterogeneous
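    The nonlinearity behind the aggregation bias can be illustrated with a small calculation. Assuming an exponential transit-time distribution, the seasonal tracer cycle is damped like a first-order low-pass filter; averaging the damped amplitudes of two contrasting subcatchments and inverting the relationship then yields an apparent mean transit time far below the true mean. The transit times and equal-area mixing below are illustrative choices, not Kirchner's benchmark values.

```python
import numpy as np

f = 1.0  # seasonal frequency, 1/yr

def amplitude_ratio(tau, f=f):
    """Damping of a seasonal tracer cycle for an exponential transit-time
    distribution with mean transit time tau (first-order low-pass filter)."""
    return 1.0 / np.sqrt(1.0 + (2.0 * np.pi * f * tau) ** 2)

def mtt_from_ratio(ratio, f=f):
    """Invert the amplitude ratio back to an apparent mean transit time."""
    return np.sqrt(1.0 / ratio**2 - 1.0) / (2.0 * np.pi * f)

# Two equal-area subcatchments with strongly contrasting mean transit times (yr)
tau = np.array([0.2, 5.0])
true_mean_tt = tau.mean()

# Streamflow from the mixture carries the average of the two damped cycles
mixed_ratio = amplitude_ratio(tau).mean()
apparent_tt = mtt_from_ratio(mixed_ratio)

print(f"true mean transit time:     {true_mean_tt:.2f} yr")
print(f"apparent (aggregated) MTT:  {apparent_tt:.2f} yr")  # strongly biased low
```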

  18. Uncertainty of block estimates introduced by mis-allocation of point samples: on the example of spatial indoor Radon data

    Energy Technology Data Exchange (ETDEWEB)

    Bossew, P. [European Commission (Ecuador), Joint Research Centre (JRC), Institute for Environment and Sustainability (IES), TP441, Via Fermi 1, I-21020 Ispra (Italy)], E-mail: peter.bossew@jrc.it

    2009-03-15

    The European indoor Radon map, which is currently under production, is based on gridded data supplied by the contributing countries. Each grid node represents the arithmetic mean (among other statistics) of the individual measurements within 10 x 10 km², called cells, pixels or blocks, which are aligned to a common metric coordinate system. During this work the question emerged whether uncertainty in the geo-referencing of individual data might affect the aggregated 'block' statistics to an extent that the statistics carry an unpredictably high additional uncertainty, which would make them unusable. In this note we try to quantify the effect, based on simulations. The overall result is that the relevant statistics should not be affected too badly in most cases, in particular if the rate of mis-allocations and the mean uncertainty of coordinates are not too high, so that even cell statistics which are to some degree distorted by mis-allocated data can still be used for the purpose of the European Radon map.

  19. Quantifying measurement uncertainties in ADCP measurements in non-steady, inhomogeneous flow

    Science.gov (United States)

    Schäfer, Stefan

    2017-04-01

    The author presents a laboratory study of fixed-platform four-beam ADCP and three-beam ADV measurements in the tailrace of a micro hydro power setup with a 35kW Kaplan-turbine and 2.5m head. The datasets discussed quantify measurement uncertainties of the ADCP measurement technique coming from non-steady, inhomogeneous flow. For a constant discharge of 1.5m3/s, two different flow scenarios were investigated: one being the regular tailrace flow downstream of the draft tube and the second being a straightened, less inhomogeneous flow, which was generated by the use of a flow straightening device: a rack of diameter 40mm pipe sections was mounted right behind the draft tube. ADCP measurements (sampling rate 1.35Hz) were conducted at three distances behind the draft tube and compared bin-wise to measurements of three simultaneously measuring ADV probes (sampling rate 64Hz). The ADV probes were aligned horizontally and the ADV bins were placed in the centers of two facing ADCP bins and in the vertical under the ADCP probe at the corresponding depth. Rotating the ADV probes by 90° allowed for measurements of the other two facing ADCP bins. For reasons of mutual probe interaction, ADCP and ADV measurements were not conducted at the same time. The datasets were evaluated by using mean and fluctuation velocities. Turbulence parameters were calculated and compared as far as applicable. Uncertainties coming from non-steady flow were estimated with the normalized mean square error and evaluated by comparing long-term measurements of 60 minutes to shorter measurement intervals. Uncertainties coming from inhomogeneous flow were evaluated by comparison of ADCP with ADV data along the ADCP beams where ADCP data were effectively measured and in the vertical under the ADCP probe where velocities of the ADCP measurements were displayed. Errors coming from non-steady flow could be compensated through sufficiently long measurement intervals with high enough sampling rates depending on the

  20. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and because the data needed to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than for higher flows because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  1. Quantifying uncertainty on sediment loads using bootstrap confidence intervals

    Science.gov (United States)

    Slaets, Johanna I. F.; Piepho, Hans-Peter; Schmitter, Petra; Hilger, Thomas; Cadisch, Georg

    2017-01-01

    Load estimates are more informative than constituent concentrations alone, as they allow quantification of on- and off-site impacts of environmental processes concerning pollutants, nutrients and sediment, such as soil fertility loss, reservoir sedimentation and irrigation channel siltation. While statistical models used to predict constituent concentrations have been developed considerably over the last few years, measures of uncertainty on constituent loads are rarely reported. Loads are the product of two predictions, constituent concentration and discharge, integrated over a time period, which does not make it straightforward to produce a standard error or a confidence interval. In this paper, a linear mixed model is used to estimate sediment concentrations. A bootstrap method is then developed that accounts for the uncertainty in the concentration and discharge predictions, allowing temporal correlation in the constituent data, and can be used when data transformations are required. The method was tested for a small watershed in Northwest Vietnam for the period 2010-2011. The results showed that confidence intervals were asymmetric, with the highest uncertainty in the upper limit, and that a load of 6262 Mg year-1 had a 95 % confidence interval of (4331, 12 267) in 2010 and a load of 5543 Mg year-1 an interval of (3593, 8975) in 2011. Additionally, the approach demonstrated that direct estimates from the data were biased downwards compared to bootstrap median estimates. These results imply that constituent loads predicted from regression-type water quality models could frequently be underestimating sediment yields and their environmental impact.
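    A minimal nonparametric bootstrap for an annual load is sketched below. The published method is more elaborate (a linear mixed model, resampling that respects temporal correlation, and handling of data transformations); the synthetic discharge and concentration data and the daily time step are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical paired observations: discharge (m3/s) and sediment
# concentration (g/L) for one year; values are synthetic, not the study data.
n = 365
discharge = rng.gamma(shape=2.0, scale=1.5, size=n)
concentration = 0.2 * discharge ** 1.3 * rng.lognormal(0.0, 0.4, size=n)

def annual_load(q, c):
    """Load as the sum of instantaneous fluxes over the year (Mg/yr),
    assuming one representative value per day (86400 s)."""
    return np.sum(q * c * 86400) / 1e6   # g -> Mg

point_estimate = annual_load(discharge, concentration)

# Simple nonparametric bootstrap over days; the published method instead
# resamples within a mixed model to respect temporal correlation.
boot = np.array([
    annual_load(discharge[idx], concentration[idx])
    for idx in (rng.integers(0, n, size=n) for _ in range(2000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"load = {point_estimate:.0f} Mg/yr, 95% CI ({lo:.0f}, {hi:.0f})")
```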

  2. How to quantify uncertainty and variability in life cycle assessment: the case of greenhouse gas emissions of gas power generation in the US

    Science.gov (United States)

    Hauck, M.; Steinmann, Z. J. N.; Laurenzi, I. J.; Karuppiah, R.; Huijbregts, M. A. J.

    2014-07-01

    This study quantified the contributions of uncertainty and variability to the range of life-cycle greenhouse gas (LCGHG) emissions associated with conventional gas-fired electricity generation in the US. Whereas uncertainty is defined as lack of knowledge and can potentially be reduced by additional research, variability is an inherent characteristic of supply chains and cannot be reduced without physically modifying the system. The life-cycle included four stages: production, processing, transmission and power generation, and utilized a functional unit of 1 kWh of electricity generated at the plant. Technological variability requires analyses of life cycles of individual power plants, e.g. combined cycle plants or boilers. Parameter uncertainty was modeled via Monte Carlo simulation. Our approach reveals that technological differences are the predominant cause for the range of LCGHG emissions associated with gas power, primarily due to variability in plant efficiencies. Uncertainties in model parameters played a minor role for a 100 year time horizon. Variability in LCGHG emissions was a factor of 1.4 for combined cycle plants, and a factor of 1.3 for simple cycle plants (95% CI, 100 year horizon). The results can be used to assist decision-makers in assessing factors that contribute to LCGHG emissions despite uncertainties in parameters employed to estimate those emissions.
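    The split between variability and parameter uncertainty can be mimicked with a short Monte Carlo calculation: plant efficiency is treated as a fixed technological characteristic per plant type, while upstream parameters are sampled. All distributions and emission factors below are illustrative placeholders, not the study's inventory data.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 10_000

# Parameter uncertainty (lack of knowledge), sampled by Monte Carlo.
# All numbers are illustrative placeholders, not the study's inputs.
ch4_leak_rate = rng.normal(0.015, 0.003, N)      # fraction of gas lost upstream
gwp100_ch4    = rng.normal(30.0, 3.0, N)         # kg CO2-eq per kg CH4 (100 yr)

def lcghg(efficiency):
    """Life-cycle GHG intensity (kg CO2-eq / kWh) for a plant of given
    net efficiency, combining combustion and upstream methane emissions."""
    gas_per_kwh = 3.6 / (efficiency * 50.0)       # kg gas per kWh (LHV ~50 MJ/kg)
    combustion  = gas_per_kwh * 2.75              # kg CO2 per kg gas burned
    upstream    = gas_per_kwh * ch4_leak_rate * gwp100_ch4
    return combustion + upstream

# Technological variability: two plant types with different efficiencies.
for name, eff in [("combined cycle", 0.52), ("simple cycle", 0.35)]:
    g = lcghg(eff)
    print(f"{name:15s} median {np.median(g):.2f}, 95% CI "
          f"({np.percentile(g, 2.5):.2f}, {np.percentile(g, 97.5):.2f}) kg CO2-eq/kWh")
```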

  3. How to quantify uncertainty and variability in life cycle assessment: the case of greenhouse gas emissions of gas power generation in the US

    International Nuclear Information System (INIS)

    Hauck, M; Steinmann, Z J N; Huijbregts, M A J; Laurenzi, I J; Karuppiah, R

    2014-01-01

    This study quantified the contributions of uncertainty and variability to the range of life-cycle greenhouse gas (LCGHG) emissions associated with conventional gas-fired electricity generation in the US. Whereas uncertainty is defined as lack of knowledge and can potentially be reduced by additional research, variability is an inherent characteristic of supply chains and cannot be reduced without physically modifying the system. The life-cycle included four stages: production, processing, transmission and power generation, and utilized a functional unit of 1 kWh of electricity generated at the plant. Technological variability requires analyses of life cycles of individual power plants, e.g. combined cycle plants or boilers. Parameter uncertainty was modeled via Monte Carlo simulation. Our approach reveals that technological differences are the predominant cause for the range of LCGHG emissions associated with gas power, primarily due to variability in plant efficiencies. Uncertainties in model parameters played a minor role for a 100 year time horizon. Variability in LCGHG emissions was a factor of 1.4 for combined cycle plants, and a factor of 1.3 for simple cycle plants (95% CI, 100 year horizon). The results can be used to assist decision-makers in assessing factors that contribute to LCGHG emissions despite uncertainties in parameters employed to estimate those emissions. (letter)

  4. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes Option Pricing Formula. CAPM gives a linear positive relationship between expected rate of return and risk but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introduction of investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests for the importance of uncertainty in the stock market for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs

  5. Uncertainty budget for k0-NAA

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) for chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique', described by KRAGTEN, is applied to the k0-NAA basic equations for the computation of uncertainties. The variance components - individual standard uncertainties - highlight the contribution and the importance of the different parameters to be taken into account. (author)
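    The Kragten spreadsheet technique mentioned above amounts to a simple numerical propagation: shift each input by its standard uncertainty, recompute the result, and combine the deviations in quadrature. The sketch below applies it to a toy measurement equation standing in for the k0-NAA formula; the parameter names and numbers are assumptions.

```python
import numpy as np

def kragten(f, values, uncertainties):
    """Kragten-style numerical uncertainty propagation: shift each input in
    turn by its standard uncertainty, recompute f, and combine the resulting
    deviations in quadrature. Returns (result, combined_u, contributions)."""
    values = dict(values)
    y0 = f(**values)
    contrib = {}
    for name, u in uncertainties.items():
        shifted = dict(values, **{name: values[name] + u})
        contrib[name] = f(**shifted) - y0
    u_c = np.sqrt(sum(c**2 for c in contrib.values()))
    return y0, u_c, contrib

# Toy measurement equation standing in for the k0-NAA concentration formula;
# the parameter names and numbers are illustrative only.
def concentration(net_counts, k0, efficiency, mass):
    return net_counts / (k0 * efficiency * mass)

result, u_c, contrib = kragten(
    concentration,
    values={"net_counts": 12000.0, "k0": 1.18, "efficiency": 0.032, "mass": 0.100},
    uncertainties={"net_counts": 150.0, "k0": 0.02, "efficiency": 0.001, "mass": 0.0005},
)
print(f"result = {result:.1f}, combined standard uncertainty = {u_c:.1f}")
for name, c in contrib.items():
    print(f"  {name:11s} contributes {abs(c):.1f}")
```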

  6. An uncertainty-based framework to quantifying climate change impacts on coastal flood vulnerability: case study of New York City.

    Science.gov (United States)

    Zahmatkesh, Zahra; Karamouz, Mohammad

    2017-10-17

    The continued development efforts around the world, growing population, and the increased probability of occurrence of extreme hydrologic events have adversely affected natural and built environments. Flood damages and loss of lives from the devastating storms, such as Irene and Sandy on the East Coast of the USA, are examples of the vulnerability to flooding that even developed countries have to face. The odds of coastal flooding disasters have been increased due to accelerated sea level rise, climate change impacts, and communities' interest to live near the coastlines. Climate change, for instance, is becoming a major threat to sustainable development because of its adverse impacts on the hydrologic cycle. Effective management strategies are thus required for flood vulnerability reduction and disaster preparedness. This paper is an extension to the flood resilience studies in the New York City coastal watershed. Here, a framework is proposed to quantify coastal flood vulnerability while accounting for climate change impacts. To do so, a multi-criteria decision making (MCDM) approach that combines watershed characteristics (factors) and their weights is proposed to quantify flood vulnerability. Among the watershed characteristics, potential variation in the hydrologic factors under climate change impacts is modeled utilizing the general circulation models' (GCMs) outputs. The considered factors include rainfall, extreme water level, and sea level rise that exacerbate flood vulnerability through increasing exposure and susceptibility to flooding. Uncertainty in the weights as well as values of factors is incorporated in the analysis using the Monte Carlo (MC) sampling method by selecting the best-fitted distributions to the parameters with random nature. A number of low impact development (LID) measures are then proposed to improve watershed adaptive capacity to deal with coastal flooding. Potential range of current and future vulnerability to flooding is
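    A minimal version of the weighted multi-criteria index with uncertain weights is sketched below. It samples both normalized factor values and weights by Monte Carlo and reports the spread of the resulting vulnerability score; the factor set, distributions and weight ranges are assumptions, not the study's calibrated inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 5000

# Normalized flood-vulnerability factors (0 = low, 1 = high). Distributions
# are illustrative stand-ins for rainfall, extreme water level and sea level rise.
factors = {
    "rainfall":       rng.beta(4, 2, N),
    "extreme_level":  rng.beta(3, 3, N),
    "sea_level_rise": rng.beta(5, 2, N),
}

# Uncertain weights: sample, then renormalize so each draw sums to one.
raw_w = np.column_stack([
    rng.triangular(0.2, 0.4, 0.6, N),   # rainfall
    rng.triangular(0.1, 0.3, 0.5, N),   # extreme water level
    rng.triangular(0.1, 0.3, 0.5, N),   # sea level rise
])
weights = raw_w / raw_w.sum(axis=1, keepdims=True)

values = np.column_stack(list(factors.values()))
vulnerability = np.sum(weights * values, axis=1)   # weighted-sum MCDM score

print(f"median vulnerability index: {np.median(vulnerability):.2f}")
print(f"90% range: ({np.percentile(vulnerability, 5):.2f}, "
      f"{np.percentile(vulnerability, 95):.2f})")
```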

  7. Three-dimensional shape analysis of coarse aggregates: New techniques for and preliminary results on several different coarse aggregates and reference rocks

    International Nuclear Information System (INIS)

    Erdogan, S.T.; Quiroga, P.N.; Fowler, D.W.; Saleh, H.A.; Livingston, R.A.; Garboczi, E.J.; Ketcham, P.M.; Hagedorn, J.G.; Satterfield, S.G.

    2006-01-01

    The shape of aggregates used in concrete is an important parameter that helps determine many concrete properties, especially the rheology of fresh concrete and early-age mechanical properties. This paper discusses the sample preparation and image analysis techniques necessary for obtaining an aggregate particle image in 3-D, using X-ray computed tomography, which is then suitable for spherical harmonic analysis. The shapes of three reference rocks are analyzed for uncertainty determination via direct comparison to the geometry of their reconstructed images. A Virtual Reality Modeling Language technique is demonstrated that can give quick and accurate 3-D views of aggregates. Shape data on several different kinds of coarse aggregates are compared and used to illustrate potential mathematical shape analyses made possible by the spherical harmonic information

  8. Quantifying Uncertainty in Near Surface Electromagnetic Imaging Using Bayesian Methods

    Science.gov (United States)

    Blatter, D. B.; Ray, A.; Key, K.

    2017-12-01

    Geoscientists commonly use electromagnetic methods to image the Earth's near surface. Field measurements of EM fields are made (often with the aid of an artificial EM source) and then used to infer near surface electrical conductivity via a process known as inversion. In geophysics, the standard inversion tool kit is robust and can provide an estimate of the Earth's near surface conductivity that is both geologically reasonable and compatible with the measured field data. However, standard inverse methods struggle to provide a sense of the uncertainty in the estimate they provide. This is because the task of finding an Earth model that explains the data to within measurement error is non-unique - that is, there are many, many such models; but the standard methods provide only one "answer." An alternative method, known as Bayesian inversion, seeks to explore the full range of Earth model parameters that can adequately explain the measured data, rather than attempting to find a single, "ideal" model. Bayesian inverse methods can therefore provide a quantitative assessment of the uncertainty inherent in trying to infer near surface conductivity from noisy, measured field data. This study applies a Bayesian inverse method (called trans-dimensional Markov chain Monte Carlo) to transient airborne EM data previously collected over Taylor Valley - one of the McMurdo Dry Valleys in Antarctica. Our results confirm the reasonableness of previous estimates (made using standard methods) of near surface conductivity beneath Taylor Valley. In addition, we demonstrate quantitatively the uncertainty associated with those estimates. We demonstrate that Bayesian inverse methods can provide quantitative uncertainty to estimates of near surface conductivity.
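    For intuition, the sketch below runs a fixed-dimension Metropolis sampler on a one-parameter toy conductivity problem. The study's trans-dimensional MCMC additionally proposes changes in the number of model layers; the forward model, noise level and flat prior here are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic "measured" EM response of a half-space with log-conductivity m_true,
# using a deliberately simple forward model (illustrative, not a real EM kernel).
def forward(log_sigma, times=np.logspace(-5, -3, 15)):
    return np.exp(log_sigma) * times ** -1.5 * 1e-6

m_true = np.log(0.05)            # S/m
noise = 0.05
data = forward(m_true) * (1 + noise * rng.standard_normal(15))

def log_likelihood(m):
    resid = (data - forward(m)) / (noise * np.abs(data))
    return -0.5 * np.sum(resid ** 2)

# Fixed-dimension Metropolis sampler over log-conductivity with a flat prior.
samples, m = [], np.log(0.01)
ll = log_likelihood(m)
for _ in range(20000):
    prop = m + 0.1 * rng.standard_normal()
    ll_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        m, ll = prop, ll_prop
    samples.append(m)

post = np.exp(samples[5000:])    # discard burn-in, back to S/m
print(f"posterior median {np.median(post):.3f} S/m, 95% credible interval "
      f"({np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f})")
```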

  9. Advanced probabilistic methods for quantifying the effects of various uncertainties in structural response

    Science.gov (United States)

    Nagpal, Vinod K.

    1988-01-01

    The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, and fully correlated. The magnitudes of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and the probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.

  10. Marketable pollution permits with uncertainty and transaction costs

    International Nuclear Information System (INIS)

    Montero, Juan-Pablo

    1998-01-01

    Increasing interest in the use of marketable permits for pollution control has become evident in recent years. Concern regarding their performance still remains because empirical evidence has shown transaction costs and uncertainty to be significant in past and existing marketable permits programs. In this paper we develop theoretical and numerical models that include transaction costs and uncertainty (in trade approval) to show their effects on market performance (i.e., equilibrium price of permits and trading volume) and aggregate control costs. We also show that in the presence of transaction costs and uncertainty the initial allocation of permits may not be neutral in terms of efficiency. Furthermore, using a numerical model for a hypothetical NOx trading program in which participants have discrete control technology choices, we find that aggregate control costs and the equilibrium price of permits are sensitive to the initial allocation of permits, even for constant marginal transaction costs and certainty.

  11. Suspensions of colloidal particles and aggregates

    CERN Document Server

    Babick, Frank

    2016-01-01

    This book addresses the properties of particles in colloidal suspensions. It has a focus on particle aggregates and the dependency of their physical behaviour on morphological parameters. For this purpose, relevant theories and methodological tools are reviewed and applied to selected examples. The book is divided into four main chapters. The first of them introduces important measurement techniques for the determination of particle size and interfacial properties in colloidal suspensions. A further chapter is devoted to the physico-chemical properties of colloidal particles—highlighting the interfacial phenomena and the corresponding interactions between particles. The book’s central chapter examines the structure-property relations of colloidal aggregates. This comprises concepts to quantify size and structure of aggregates, models and numerical tools for calculating the (light) scattering and hydrodynamic properties of aggregates, and a discussion on van-der-Waals and double layer interactions between ...

  12. Piecewise Polynomial Aggregation as Preprocessing for Data Numerical Modeling

    Science.gov (United States)

    Dobronets, B. S.; Popova, O. A.

    2018-05-01

    Data aggregation issues for numerical modeling are reviewed in the present study. The authors discuss data aggregation procedures as preprocessing for subsequent numerical modeling. To calculate the data aggregation, the authors propose using numerical probabilistic analysis (NPA). An important feature of this study is how the authors represent the aggregated data. The study shows that the proposed approach to data aggregation can be interpreted as the frequency distribution of a variable. To study its properties, the density function is used. For this purpose, the authors propose using piecewise polynomial models. A suitable example of such an approach is the spline. The authors show that their approach to data aggregation allows reducing the level of data uncertainty and significantly increasing the efficiency of numerical calculations. To demonstrate the degree of the correspondence of the proposed methods to reality, the authors developed a theoretical framework and considered numerical examples devoted to time series aggregation.
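    One plausible reading of this aggregation step is sketched below: raw observations are aggregated into a density histogram, and the density is then represented by a piecewise cubic polynomial (a spline), giving a compact functional form for downstream numerical modeling. The data and binning choices are assumptions.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(5)
raw = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)   # raw data to be aggregated

# Aggregate into a frequency distribution (density histogram) ...
counts, edges = np.histogram(raw, bins=30, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# ... and represent the density with a piecewise cubic polynomial (spline).
density = CubicSpline(centers, counts)

# Evaluate on a fine grid; renormalize because the spline is only an
# approximation of a proper probability density.
grid = np.linspace(centers[0], centers[-1], 400)
pdf = np.clip(density(grid), 0.0, None)
dx = grid[1] - grid[0]
pdf /= pdf.sum() * dx

print(f"mean of raw data:          {raw.mean():.3f}")
print(f"mean from spline density:  {np.sum(grid * pdf) * dx:.3f}")
```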

  13. Quantifying uncertainties of climate signals related to the 11-year solar cycle

    Science.gov (United States)

    Kruschke, T.; Kunze, M.; Matthes, K. B.; Langematz, U.; Wahl, S.

    2017-12-01

    Although state-of-the-art reconstructions based on proxies and (semi-)empirical models converge in terms of total solar irradiance, they still significantly differ in terms of spectral solar irradiance (SSI) with respect to the mean spectral distribution of energy input and temporal variability. This study aims at quantifying uncertainties for the Earth's climate related to the 11-year solar cycle by forcing two chemistry-climate models (CCMs) - CESM1(WACCM) and EMAC - with five different SSI reconstructions (NRLSSI1, NRLSSI2, SATIRE-T, SATIRE-S, CMIP6-SSI) and the reference spectrum RSSV1-ATLAS3, derived from observations. We conduct a unique set of timeslice experiments. External forcings and boundary conditions are fixed and identical for all experiments, except for the solar forcing. The set of analyzed simulations consists of one solar minimum simulation employing RSSV1-ATLAS3, and five solar maximum experiments. The latter are a result of adding the amplitude of solar cycle 22 according to the five reconstructions to RSSV1-ATLAS3. Our results show that the climate response to the 11-year solar cycle is generally robust across CCMs and SSI forcings. However, analyzing the variance of the solar maximum ensemble by means of ANOVA-statistics reveals additional information on the uncertainties of the mean climate signals. The annual mean response agrees very well between the two CCMs for most parts of the lower and middle atmosphere. Only the upper mesosphere is subject to significant differences related to the choice of the model. However, the different SSI forcings lead to significant differences in ozone concentrations, shortwave heating rates, and temperature throughout large parts of the mesosphere and upper stratosphere. Regarding the seasonal evolution of the climate signals, our findings for shortwave heating rates and temperature are similar to the annual means with respect to the relative importance of the choice of the model or the SSI forcing for the

  14. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion.

  15. Aggregated wind power generation probabilistic forecasting based on particle filter

    International Nuclear Information System (INIS)

    Li, Pai; Guan, Xiaohong; Wu, Jiang

    2015-01-01

    Highlights: • A new method for probabilistic forecasting of aggregated wind power generation. • A dynamic system is established based on a numerical weather prediction model. • The new method handles the non-Gaussian and time-varying wind power uncertainties. • Particle filter is applied to forecast predictive densities of wind generation. - Abstract: The probability distribution of aggregated wind power generation in a region is one of the important issues for power system daily operation. This paper presents a novel method to forecast the predictive densities of the aggregated wind power generation from several geographically distributed wind farms, considering the non-Gaussian and non-stationary characteristics in wind power uncertainties. Based on a mesoscale numerical weather prediction model, a dynamic system is established to formulate the relationship between the atmospheric and near-surface wind fields of geographically distributed wind farms. A recursively backtracking framework based on the particle filter is applied to estimate the atmospheric state with the near-surface wind power generation measurements, and to forecast the possible samples of the aggregated wind power generation. The predictive densities of the aggregated wind power generation are then estimated based on these predicted samples by a kernel density estimator. In case studies, the new method is tested on a system of 9 wind farms in the Midwestern United States. The testing results show that the new method can provide competitive interval forecasts for the aggregated wind power generation compared with conventional statistics-based models, which validates the effectiveness of the new method.
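    A stripped-down bootstrap particle filter with a kernel density estimate of the forecast output is sketched below. It replaces the paper's mesoscale weather model and multi-farm setup with a toy mean-reverting wind state and a crude power curve; every model element and number here is an assumption for illustration only.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
n_particles, n_steps = 2000, 48

# Toy latent wind-speed state with mean-reverting dynamics; observations are
# noisy aggregated power via a crude power curve. All values are illustrative.
def power_curve(w):
    return np.clip((w / 12.0) ** 3, 0.0, 1.0)     # normalized aggregate output

true_w = 8.0
particles = rng.normal(8.0, 2.0, n_particles)
weights = np.full(n_particles, 1.0 / n_particles)

for _ in range(n_steps):
    # propagate the true state (process noise) and simulate one noisy observation
    true_w += 0.1 * (8.0 - true_w) + rng.normal(0, 0.5)
    obs = power_curve(true_w) + rng.normal(0, 0.05)

    # propagate particles, reweight by the observation likelihood
    particles += 0.1 * (8.0 - particles) + rng.normal(0, 0.5, n_particles)
    weights *= np.exp(-0.5 * ((obs - power_curve(particles)) / 0.05) ** 2)
    weights /= weights.sum()

    # bootstrap resampling to avoid weight degeneracy
    idx = rng.choice(n_particles, n_particles, p=weights)
    particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)

# Predictive density of next-step aggregated power from the particle cloud
forecast_power = power_curve(particles + rng.normal(0, 0.5, n_particles))
kde = gaussian_kde(forecast_power)
grid = np.linspace(0, 1, 101)
print("most likely aggregated output:", grid[np.argmax(kde(grid))])
```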

  16. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  17. Single particle detection and characterization of synuclein co-aggregation

    International Nuclear Information System (INIS)

    Giese, Armin; Bader, Benedikt; Bieschke, Jan; Schaffar, Gregor; Odoy, Sabine; Kahle, Philipp J.; Haass, Christian; Kretzschmar, Hans

    2005-01-01

    Protein aggregation is the key event in a number of human diseases such as Alzheimer's and Parkinson's disease. We present a general method to quantify and characterize protein aggregates by dual-colour scanning for intensely fluorescent targets (SIFT). In addition to high sensitivity, this approach offers a unique opportunity to study co-aggregation processes. As the ratio of two fluorescently labelled components can be analysed for each aggregate separately in a homogeneous assay, the molecular composition of aggregates can be studied even in samples containing a mixture of different types of aggregates. Using this method, we could show that wild-type α-synuclein forms co-aggregates with a mutant variant found in familial Parkinson's disease. Moreover, we found a striking increase in aggregate formation at non-equimolar mixing ratios, which may have important therapeutic implications, as lowering the relative amount of aberrant protein may cause an increase of protein aggregation leading to adverse effects

  18. On quantifying uncertainty for project selection: the case of renewable energy sources' investment

    International Nuclear Information System (INIS)

    Kirytopoulos, Konstantinos; Rentizelas, Athanassios; Tziralis, Georgios

    2006-01-01

    The selection of a project among different alternatives, considering the limited resources of a company (organisation), is an added-value process that determines the prosperity of an undertaken project (investment). This applies also to the 'booming' Renewable Energy Sector, especially under the circumstances established by the recent activation of the Kyoto protocol and by the plethora of available choices for renewable energy sources (RES) projects. The need for a reliable project selection method among the various alternatives is, therefore, highlighted and, in this context, the paper proposes the NPV function as one of the possible criteria for the selection of a RES project. Furthermore, it differentiates from the typical NPV calculation process by adding the concept of a probabilistic NPV approach through Monte Carlo simulation. Reality is non-deterministic, so any attempt to model it by using a deterministic approach is by definition erroneous. The paper ultimately proposes a process of substituting the point estimate with a range estimate, capable of quantifying the various uncertainty factors and in this way elucidating the accomplishment possibilities of eligible scenarios. The paper is enhanced by a case study showing how the proposed method can be practically applied to support the investment decision, thus enabling the decision makers to judge its effectiveness and usefulness. (Author)
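    The probabilistic NPV idea can be illustrated with a few lines of Monte Carlo simulation: sample the uncertain inputs, compute the discounted cash flows per draw, and report the resulting NPV distribution rather than a single point estimate. The project figures and distributions below are hypothetical, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(21)
N = 20_000
r = 0.08                  # discount rate
years = np.arange(1, 21)  # 20-year project life

# Illustrative uncertain inputs for a small RES investment; the distributions
# and figures are hypothetical, not from the paper's case study.
capex      = rng.triangular(0.9e6, 1.0e6, 1.3e6, N)        # EUR
annual_mwh = rng.normal(2200, 250, N)                      # production per year
price      = rng.normal(60, 10, (N, years.size))           # EUR/MWh, per year
opex       = rng.normal(30e3, 5e3, N)                      # EUR/yr

cashflows = annual_mwh[:, None] * price - opex[:, None]    # N x 20 matrix
discount  = (1 + r) ** -years
npv = -capex + cashflows @ discount

print(f"P(NPV > 0)       = {np.mean(npv > 0):.2f}")
print(f"median NPV       = {np.median(npv) / 1e3:.0f} kEUR")
print(f"5th-95th pct NPV = ({np.percentile(npv, 5) / 1e3:.0f}, "
      f"{np.percentile(npv, 95) / 1e3:.0f}) kEUR")
```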

  19. Quantifying interspecific coagulation efficiency of phytoplankton

    DEFF Research Database (Denmark)

    Hansen, J.L.S.; Kiørboe, Thomas

    1997-01-01

    . nordenskjoeldii. Mutual coagulation between Skeletonema costatum and the non-sticky cells of Ditylum brightwellii also proceeded with half the efficiency of S. costatum alone. The latex beads were suitable to be used as 'standard particles' to quantify the ability of phytoplankton to prime aggregation...

  20. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to keff sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.

  1. Uncertainty and sampling issues in tank characterization

    International Nuclear Information System (INIS)

    Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M.

    1997-06-01

    A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk and therefore key decisions must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.

  2. Optimal trading of plug-in electric vehicle aggregation agents in a market environment for sustainability

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Heydarian-Forushani, E.; Golshan, M.E.H.; Siano, P.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2016-01-01

    Highlights: • Proposing a multi-stage stochastic model of a PEV aggregation agent. • Reflecting several uncertainties using a stochastic model and appropriate scenarios. • Updating bids/offers of PEV aggregation agents by taking part in the intraday market. • Taking part in Demand Response eXchange (DRX) markets. - Abstract: As energy sustainability is an emerging concern, Plug-in Electric Vehicles (PEVs) will significantly affect the approaching smart grids. Indeed, Demand Response (DR) has a positive effect on managing the uncertainties of renewable energy sources, improving market efficiency and enhancing system reliability. This paper proposes a multi-stage stochastic model of a PEV aggregation agent to participate in day-ahead and intraday electricity markets. The stochastic model reflects several uncertainties such as the behaviour of PEV owners, electricity market prices, and the quantity of reserve activated by the system operator. For this purpose, appropriate scenarios are utilized to realize the uncertain features of the problem. Furthermore, in the proposed model, the PEV aggregation agents can update their bids/offers by taking part in the intraday market. To this end, these aggregation agents take part in Demand Response eXchange (DRX) markets designed in the intraday session by employing DR resources. The numerical results show that DR provides a perfect opportunity for PEV aggregation agents to increase their profit. In addition, the results reveal that the PEV aggregation agent not only can increase its profit by participating in the DRX market, but also can become an important player in the mentioned market.

  3. The Stock Market: Risk vs. Uncertainty.

    Science.gov (United States)

    Griffitts, Dawn

    2002-01-01

    This economics education publication focuses on the U.S. stock market and the risk and uncertainty that an individual faces when investing in the market. The material explains that risk and uncertainty relate to the same underlying concept: randomness. It defines and discusses both concepts and notes that although risk is quantifiable, uncertainty…

  4. An inexact fuzzy two-stage stochastic model for quantifying the efficiency of nonpoint source effluent trading under uncertainty

    International Nuclear Information System (INIS)

    Luo, B.; Maqsood, I.; Huang, G.H.; Yin, Y.Y.; Han, D.J.

    2005-01-01

    Reduction of nonpoint source (NPS) pollution from agricultural lands is a major concern in most countries. One method to reduce NPS pollution is through land retirement programs. This method, however, may result in enormous economic costs especially when large sums of croplands need to be retired. To reduce the cost, effluent trading can be employed to couple with land retirement programs. However, the trading efforts can also become inefficient due to various uncertainties existing in stochastic, interval, and fuzzy formats in agricultural systems. Thus, it is desired to develop improved methods to effectively quantify the efficiency of potential trading efforts by considering those uncertainties. In this respect, this paper presents an inexact fuzzy two-stage stochastic programming model to tackle such problems. The proposed model can facilitate decision-making to implement trading efforts for agricultural NPS pollution reduction through land retirement programs. The applicability of the model is demonstrated through a hypothetical effluent trading program within a subcatchment of the Lake Tai Basin in China. The study results indicate that the efficiency of the trading program is significantly influenced by precipitation amount, agricultural activities, and level of discharge limits of pollutants. The results also show that the trading program will be more effective for low precipitation years and with stricter discharge limits

  5. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    International Nuclear Information System (INIS)

    Wagner, Ryan; Raman, Arvind; Moon, Robert; Pratt, Jon; Shaw, Gordon

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7–20 GPa. A key result is that multiple replicates of force–distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.

  6. Uncertainty estimation and risk prediction in air quality

    International Nuclear Information System (INIS)

    Garaud, Damien

    2011-01-01

    This work is about uncertainty estimation and risk prediction in air quality. Firstly, we build a multi-model ensemble of air quality simulations which can take into account all uncertainty sources related to air quality modeling. Ensembles of photochemical simulations at continental and regional scales are automatically generated. Then, these ensembles are calibrated with a combinatorial optimization method. It selects a sub-ensemble which is representative of uncertainty or shows good resolution and reliability for probabilistic forecasting. This work shows that it is possible to estimate and forecast uncertainty fields related to ozone and nitrogen dioxide concentrations or to improve the reliability of threshold exceedance predictions. The approach is compared with Monte Carlo simulations, calibrated or not. The Monte Carlo approach appears to be less representative of the uncertainties than the multi-model approach. Finally, we quantify the observational error, the representativeness error and the modeling errors. The work is applied to the impact of thermal power plants, in order to quantify the uncertainty on the impact estimates. (author) [fr

  7. A model for bacterial colonization of sinking aggregates.

    Science.gov (United States)

    Bearon, R N

    2007-01-01

    Sinking aggregates provide important nutrient-rich environments for marine bacteria. Quantifying the rate at which motile bacteria colonize such aggregations is important in understanding the microbial loop in the pelagic food web. In this paper, a simple analytical model is presented to predict the rate at which bacteria undergoing a random walk encounter a sinking aggregate. The model incorporates the flow field generated by the sinking aggregate, the swimming behavior of the bacteria, and the interaction of the flow with the swimming behavior. An expression for the encounter rate is computed in the limit of large Péclet number when the random walk can be approximated by a diffusion process. Comparison with an individual-based numerical simulation is also given.
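    A back-of-envelope version of such an encounter-rate estimate is sketched below. It approximates run-and-tumble swimming by an effective diffusivity and uses the Smoluchowski kernel for an absorbing sphere, scaled by an assumed flow-enhancement factor for the sinking aggregate; it is not the paper's analytical result, and all parameter values are assumptions.

```python
import math

# Illustrative parameters (all assumed, not from the paper)
v   = 25e-6      # bacterial swimming speed, m/s
tau = 1.0        # mean run time between tumbles, s
a   = 0.5e-3     # aggregate radius, m
C   = 1e12       # bacterial concentration, cells per m^3 (~10^6 per mL)

# Run-and-tumble motility approximated as an effective diffusion process
D = v**2 * tau / 3.0                      # effective diffusivity, m^2/s

# Smoluchowski encounter kernel for an absorbing sphere in still water,
# multiplied by an assumed flow-enhancement factor Sh >= 1 for the sinking aggregate.
Sh = 2.0
encounter_rate = 4.0 * math.pi * D * a * Sh * C   # cells encountered per second

print(f"effective diffusivity: {D:.2e} m^2/s")
print(f"colonization rate:     {encounter_rate:.2f} cells/s")
```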

  8. Quantifying remarks to the question of uncertainties of the 'general dose assessment fundamentals'

    International Nuclear Information System (INIS)

    Brenk, H.D.; Vogt, K.J.

    1982-12-01

    Dose prediction models are always subject to uncertainties due to a number of factors including deficiencies in the model structure and uncertainties of the model input parameter values. In lieu of validation experiments the evaluation of these uncertainties is restricted to scientific judgement. Several attempts have been made in the literature to evaluate the uncertainties of the current dose assessment models resulting from uncertainties of the model input parameter values using stochastic approaches. Less attention, however, has been paid to potential sources of systematic over- and underestimations of the predicted doses due to deficiencies in the model structure. The present study addresses this aspect with regard to dose assessment models currently used for regulatory purposes. The influence of a number of basic simplifications and conservative assumptions has been investigated. Our systematic approach is exemplified by a comparison of doses evaluated on the basis of the regulatory guide model and a more realistic model, respectively. This is done for 3 critical exposure pathways. As a result of this comparison it can be concluded that the currently used regulatory-type models include significant safety factors resulting in a systematic overprediction of dose to man of up to two orders of magnitude. For this reason there are some indications that these models usually more than compensate for the bulk of the stochastic uncertainties caused by the variability of the input parameter values. (orig.) [de

  9. Summary from the epistemic uncertainty workshop: consensus amid diversity

    International Nuclear Information System (INIS)

    Ferson, Scott; Joslyn, Cliff A.; Helton, Jon C.; Oberkampf, William L.; Sentz, Kari

    2004-01-01

    The 'Epistemic Uncertainty Workshop' sponsored by Sandia National Laboratories was held in Albuquerque, New Mexico, on 6-7 August 2002. The workshop was organized around a set of Challenge Problems involving both epistemic and aleatory uncertainty that the workshop participants were invited to solve and discuss. This concluding article in a special issue of Reliability Engineering and System Safety based on the workshop discusses the intent of the Challenge Problems, summarizes some discussions from the workshop, and provides a technical comparison among the papers in this special issue. The Challenge Problems were computationally simple models that were intended as vehicles for the illustration and comparison of conceptual and numerical techniques for use in analyses that involve: (i) epistemic uncertainty, (ii) aggregation of multiple characterizations of epistemic uncertainty, (iii) combination of epistemic and aleatory uncertainty, and (iv) models with repeated parameters. There was considerable diversity of opinion at the workshop about both methods and fundamental issues, and yet substantial consensus about what the answers to the problems were, and even about how each of the four issues should be addressed. Among the technical approaches advanced were probability theory, Dempster-Shafer evidence theory, random sets, sets of probability measures, imprecise coherent probabilities, coherent lower previsions, probability boxes, possibility theory, fuzzy sets, joint distribution tableaux, polynomial chaos expansions, and info-gap models. Although some participants maintained that a purely probabilistic approach is fully capable of accounting for all forms of uncertainty, most agreed that the treatment of epistemic uncertainty introduces important considerations and that the issues underlying the Challenge Problems are legitimate and significant. Topics identified as meriting additional research include elicitation of uncertainty representations, aggregation of
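
    As one hedged illustration of issue (ii) above (aggregating multiple characterizations of epistemic uncertainty), the sketch below forms a probability box, i.e. the pointwise envelope of several hypothetical expert distributions, and reads off bounds on an event probability; the experts and all numbers are invented.

    ```python
    import numpy as np
    from math import erf

    # Three hypothetical experts characterize the same epistemically uncertain
    # parameter with different CDFs; a probability box (p-box) keeps the pointwise
    # envelope instead of forcing a single aggregate distribution.
    x = np.linspace(0.0, 10.0, 201)

    def normal_cdf(x, mu, sigma):
        return 0.5 * (1.0 + np.vectorize(erf)((x - mu) / (sigma * np.sqrt(2.0))))

    cdfs = np.vstack([
        normal_cdf(x, 4.0, 1.0),             # expert 1
        normal_cdf(x, 5.0, 0.5),             # expert 2
        np.clip((x - 2.0) / 5.0, 0.0, 1.0),  # expert 3: uniform on [2, 7]
    ])

    lower_cdf = cdfs.min(axis=0)   # lower probability bound
    upper_cdf = cdfs.max(axis=0)   # upper probability bound

    # Bounds on P(X <= 6): the width reflects unresolved epistemic disagreement.
    i = np.searchsorted(x, 6.0)
    print(f"P(X <= 6) lies in [{lower_cdf[i]:.2f}, {upper_cdf[i]:.2f}]")
    ```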

  10. Aggregate surface areas quantified through laser measurements for South African asphalt mixtures

    CSIR Research Space (South Africa)

    Anochie-Boateng, Joseph

    2012-02-01

    Full Text Available design. This paper introduces the use of a three-dimensional (3D) laser scanning method to directly measure the surface area of aggregates used in road pavements in South Africa. As an application of the laser-based measurements, the asphalt film...

  11. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  12. Uncertainty management in Real Estate Development: Studying the potential of SCRUM design methodology

    NARCIS (Netherlands)

    Blokpoel, S.B.; Reymen, Isabelle; Dewulf, Geert P.M.R.; Sariyildiz, S.; Tuncer, B.

    2005-01-01

    Real estate development is all about assessing and controlling risks and uncertainties. Risk management implies making decisions based on quantified risks in order to execute risk-response measures. Uncertainties, on the other hand, cannot be quantified and are therefore unpredictable. In the literature, much

  13. How should epistemic uncertainty in modelling water resources management problems shape evaluations of their operations?

    Science.gov (United States)

    Dobson, B.; Pianosi, F.; Reed, P. M.; Wagener, T.

    2017-12-01

    In previous work, we have found that water supply companies are typically hesitant to use reservoir operation tools to inform their release decisions. We believe that this is, in part, due to a lack of faith in the fidelity of the optimization exercise with regard to its ability to represent the real world. In an attempt to quantify this, recent literature has studied the impact on performance of uncertainty arising in: forcing (e.g. reservoir inflows), parameters (e.g. parameters for the estimation of evaporation rate) and objectives (e.g. worst first percentile or worst case). We suggest that there is also epistemic uncertainty in the choices made during model creation, for example in the formulation of an evaporation model or in aggregating regional storages. We create 'rival framings' (a methodology originally developed to demonstrate the impact of uncertainty arising from alternate objective formulations), each with different modelling choices, and determine their performance impacts. We identify the Pareto-approximate set of policies for several candidate formulations and then make them compete with one another in a large ensemble re-evaluation in each other's modelled spaces. This enables us to distinguish the impacts of different structural changes in the model used to evaluate system performance, in an effort to generalize the validity of the optimized performance expectations.

  14. How to: understanding SWAT model uncertainty relative to measured results

    Science.gov (United States)

    Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
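
    Since the record above is truncated, the following is only a generic sketch of the GLUE idea it refers to, with a toy linear model standing in for SWAT: sample parameter sets, score them with an informal likelihood, retain the behavioural sets, and derive likelihood-weighted prediction bounds. All parameter names and numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy "watershed model" (a stand-in for SWAT): runoff = p1 * rain - p2.
    rain = rng.gamma(2.0, 5.0, size=100)
    observed = 0.6 * rain - 1.0 + rng.normal(0, 0.8, size=100)   # synthetic observations

    def model(rain, p1, p2):
        return p1 * rain - p2

    # GLUE: sample parameter sets and score each with an informal likelihood.
    n = 5000
    p1 = rng.uniform(0.2, 1.0, n)
    p2 = rng.uniform(0.0, 3.0, n)
    sims = model(rain, p1[:, None], p2[:, None])                 # shape (n, 100)
    sse = ((sims - observed) ** 2).sum(axis=1)
    likelihood = 1.0 / sse                                       # informal likelihood measure

    behavioural = likelihood > np.quantile(likelihood, 0.9)      # keep the best 10%
    w = likelihood[behavioural] / likelihood[behavioural].sum()
    pred = sims[behavioural]

    # Likelihood-weighted 5-95% uncertainty band (reported for the first time step).
    p = pred[:, 0]
    idx = np.argsort(p)
    cum = np.cumsum(w[idx])
    lo = p[idx][np.searchsorted(cum, 0.05)]
    hi = p[idx][np.searchsorted(cum, 0.95)]
    print(f"GLUE 5-95% band at t=0: [{lo:.2f}, {hi:.2f}]")
    ```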

  15. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems
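
    A small, hedged sketch of the belief and plausibility measures mentioned above, for hypothetical interval-valued evidence (focal elements with basic probability assignments). Both measures are step functions of the threshold, which illustrates the discontinuity that motivates the surrogate-model step in the record.

    ```python
    # Hypothetical evidence for an epistemically uncertain quantity: intervals
    # (focal elements) with basic probability assignments (BPAs) summing to 1.
    focal_elements = [((0.0, 4.0), 0.3),   # (interval, BPA)
                      ((2.0, 6.0), 0.5),
                      ((5.0, 9.0), 0.2)]

    def belief_plausibility(threshold):
        """Bel and Pl of the event {x <= threshold} under the given evidence."""
        bel = sum(m for (lo, hi), m in focal_elements if hi <= threshold)  # intervals fully inside
        pl = sum(m for (lo, hi), m in focal_elements if lo <= threshold)   # intervals overlapping
        return bel, pl

    for t in (4.0, 6.0, 9.0):
        bel, pl = belief_plausibility(t)
        print(f"event x <= {t}: belief = {bel:.2f}, plausibility = {pl:.2f}")
    ```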

  16. Quantifying uncertainty in soot volume fraction estimates using Bayesian inference of auto-correlated laser-induced incandescence measurements

    Science.gov (United States)

    Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.

    2016-01-01

    Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.
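
    A toy, hedged sketch of the Bayesian idea described above: a grid posterior over the soot volume fraction with the absorption function E(m) treated as a nuisance parameter under an informative prior and then marginalized out. The forward model and every number are invented stand-ins, not the AC-LII model itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy forward model: the measured signal is proportional to (soot volume
    # fraction) x (absorption function E(m)); all values are illustrative only.
    true_fv, true_Em = 2.0, 0.35                                  # ppm, dimensionless
    signal = true_fv * true_Em + rng.normal(0, 0.05, size=20)     # 20 noisy shots

    # Grid over the quantity of interest (fv) and the nuisance parameter E(m).
    fv = np.linspace(0.5, 4.0, 300)
    Em = np.linspace(0.2, 0.5, 200)
    FV, EM = np.meshgrid(fv, Em, indexing="ij")

    # Informative (maximum-entropy-style) prior on E(m); flat prior on fv.
    prior_Em = np.exp(-0.5 * ((EM - 0.35) / 0.04) ** 2)

    # Gaussian likelihood of the shots given (fv, E(m)), then the joint posterior.
    resid = signal[None, None, :] - (FV * EM)[:, :, None]
    loglike = -0.5 * np.sum((resid / 0.05) ** 2, axis=-1)
    post = np.exp(loglike - loglike.max()) * prior_Em

    # Marginalize out the nuisance parameter to get the posterior of fv.
    post_fv = post.sum(axis=1)
    post_fv /= post_fv.sum()
    mean = (fv * post_fv).sum()
    sd = np.sqrt(((fv - mean) ** 2 * post_fv).sum())
    print(f"posterior soot volume fraction: {mean:.2f} +/- {sd:.2f} ppm")
    ```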

  17. Uncertainties in life cycle assessment of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Christensen, Thomas Højlund

    2011-01-01

    Life cycle assessment has been used to assess the environmental performance of waste management systems in many studies. The uncertainties inherent to its results are often pointed out but not always quantified, which should be the case to ensure a good decision-making process. This paper proposes a method to assess all parameter uncertainties and quantify the overall uncertainty of the assessment. The method is exemplified in a case study, where the goal is to determine if anaerobic digestion of organic waste is more beneficial than incineration in Denmark, considering only the impact on global warming. The sensitivity analysis pointed out ten parameters as particularly influential on the result of the study. In the uncertainty analysis, the distributions of these ten parameters were used in a Monte Carlo analysis, which concluded that incineration appeared more favourable than anaerobic...
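
    A hedged Monte Carlo sketch in the spirit of the comparison above: the process names, distributions and numbers are invented placeholders for the influential parameters, and the output is the probability that one treatment option outperforms the other on global warming.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000

    # Hypothetical global-warming impacts (kg CO2-eq per tonne of organic waste);
    # the distributions below are illustrative stand-ins, not values from the study.
    def incineration(rng, n):
        energy_recovery = rng.normal(250.0, 40.0, n)       # avoided emissions
        fossil_co2 = rng.normal(330.0, 30.0, n)            # direct emissions
        return fossil_co2 - energy_recovery

    def anaerobic_digestion(rng, n):
        biogas_substitution = rng.normal(180.0, 50.0, n)   # avoided emissions
        ch4_leakage = rng.lognormal(np.log(60.0), 0.4, n)  # fugitive methane
        fertiliser_offset = rng.normal(40.0, 15.0, n)      # avoided mineral fertiliser
        return ch4_leakage - biogas_substitution - fertiliser_offset

    diff = incineration(rng, n) - anaerobic_digestion(rng, n)
    print(f"mean difference (incineration - AD): {diff.mean():.0f} kg CO2-eq/t")
    print(f"P(incineration better than AD): {(diff < 0).mean():.2f}")
    ```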

  18. Quantifying human behavior uncertainties in a coupled agent-based model for water resources management

    Science.gov (United States)

    Hyun, J. Y.; Yang, Y. C. E.; Tidwell, V. C.; Macknick, J.

    2017-12-01

    Modeling human behaviors and decisions in water resources management is a challenging issue due to its complexity and uncertain characteristics, which are affected by both internal factors (such as a stakeholder's beliefs about external information) and external factors (such as future policies and weather/climate forecasts). Stakeholders' decisions regarding how much water they need are usually not entirely rational in real-world cases, so it is not suitable to model their decisions with a centralized (top-down) approach that assumes everyone in a watershed follows the same order or pursues the same objective. Agent-based modeling (ABM) uses a decentralized (bottom-up) approach that allows each stakeholder to make his/her own decision based on his/her own objective and beliefs about the information acquired. In this study, we develop an ABM which incorporates the psychological human decision process through the theory of risk perception. The theory of risk perception quantifies uncertainties in human behaviors and decisions using two sequential methodologies: Bayesian inference and the cost-loss problem. The developed ABM is coupled with a regulation-based water system model, Riverware (RW), to evaluate different human decision uncertainties in water resources management. The San Juan River Basin in New Mexico (Figure 1) is chosen as the case study area, and we define 19 major irrigation districts as water-use agents whose primary decision is the irrigated area on an annual basis. This decision is affected by three external factors: 1) the upstream precipitation forecast (potential amount of water availability), 2) violation of the downstream minimum flow (required to support ecosystems), and 3) enforcement of a shortage sharing plan (a policy currently undertaken in the region for drought years). Three beliefs (as internal factors) that correspond to these three external factors are also considered in the modeling framework. The objective of this study is
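
    A minimal sketch of the cost-loss decision rule coupled with a Bayesian belief update, as one reading of the two methodologies named above; the Beta prior, forecast signals and cost/loss values are illustrative assumptions, not values from the study.

    ```python
    # Hypothetical cost-loss decision for one irrigation-district agent: reduce the
    # irrigated area (cost C) if the believed probability of a shortage exceeds C/L.
    C, L = 20.0, 100.0                 # protective cost vs. loss if a shortage occurs

    # Bayesian belief update: Beta prior on the shortage probability, updated with
    # the agent's (possibly imperfect) reading of the upstream precipitation forecast.
    alpha, beta = 2.0, 8.0             # prior: shortages in roughly 2 of the last 10 years
    dry_signals, wet_signals = 3, 1    # forecast information acquired this year
    alpha += dry_signals
    beta += wet_signals

    belief = alpha / (alpha + beta)    # posterior mean probability of a shortage
    act = belief > C / L               # classic cost-loss threshold

    print(f"believed shortage probability: {belief:.2f}, threshold C/L: {C / L:.2f}")
    print("decision:", "reduce irrigated area" if act else "plant the full area")
    ```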

  19. Evaluation of uncertainty associated with parameters for long-term safety assessments of geological disposal

    International Nuclear Information System (INIS)

    Yamaguchi, Tetsuji; Minase, Naofumi; Iida, Yoshihisa; Tanaka, Tadao; Nakayama, Shinichi

    2005-01-01

    This paper describes the current status of our data acquisition for quantifying uncertainties associated with parameters for the safety assessment of groundwater scenarios for geological disposal of radioactive wastes. First, the sources of uncertainty and the resulting priorities in data acquisition are briefly described. Then, the current status of data acquisition for quantifying the uncertainties in assessing solubility, diffusivity in the bentonite buffer and the distribution coefficient on rocks is introduced. The uncertainty in the solubility estimation is quantified from the uncertainty associated with thermodynamic data and the uncertainty in estimating groundwater chemistry. The uncertainty associated with the diffusivity in the bentonite buffer is composed of variations in relevant factors such as the porosity of the bentonite buffer, montmorillonite content, chemical composition of pore water and temperature. Uncertainties in factors such as the specific surface area of the rock, pH, ionic strength and carbonate concentration in groundwater make up the uncertainty of the distribution coefficient of radionuclides on rocks. Based on these investigations, problems to be solved in future studies are summarized. (author)

  20. Hail formation triggers rapid ash aggregation in volcanic plumes.

    Science.gov (United States)

    Van Eaton, Alexa R; Mastin, Larry G; Herzog, Michael; Schwaiger, Hans F; Schneider, David J; Wallace, Kristi L; Clarke, Amanda B

    2015-08-03

    During explosive eruptions, airborne particles collide and stick together, accelerating the fallout of volcanic ash and climate-forcing aerosols. This aggregation process remains a major source of uncertainty both in ash dispersal forecasting and interpretation of eruptions from the geological record. Here we illuminate the mechanisms and timescales of particle aggregation from a well-characterized 'wet' eruption. The 2009 eruption of Redoubt Volcano, Alaska, incorporated water from the surface (in this case, a glacier), which is a common occurrence during explosive volcanism worldwide. Observations from C-band weather radar, fall deposits and numerical modelling demonstrate that hail-forming processes in the eruption plume triggered aggregation of ∼95% of the fine ash and stripped much of the erupted mass out of the atmosphere within 30 min. Based on these findings, we propose a mechanism of hail-like ash aggregation that contributes to the anomalously rapid fallout of fine ash and occurrence of concentrically layered aggregates in volcanic deposits.

  1. Quantifying Uncertainty in Estimation of Potential Recharge in Tropical and Temperate Catchments using a Crop Model and Microwave Remote Sensing

    Science.gov (United States)

    Krishnan Kutty, S.; Sekhar, M.; Ruiz, L.; Tomer, S. K.; Bandyopadhyay, S.; Buis, S.; Guerif, M.; Gascuel-odoux, C.

    2012-12-01

    Groundwater recharge in a semi-arid region is generally low, but can exhibit high spatial variability depending on the soil type and plant cover. The potential recharge (the drainage flux just beneath the root zone) is found to be sensitive to water-holding capacity and rooting depth (Rushton, 2003). Simple water balance approaches for recharge estimation often fail to consider the effect of plant cover, growth phases and rooting depth. Hence a crop-model-based approach might be better suited to assess the sensitivity of recharge for various crop-soil combinations in agricultural catchments. Martinez et al. (2009), using a root zone modelling approach to estimate groundwater recharge, stressed that future studies should focus on quantifying the uncertainty in recharge estimates due to uncertainty in soil water parameters such as soil layers, field capacity, rooting depth etc. Uncertainty in the parameters may arise from uncertainties in the variables retrieved from satellite (surface soil moisture and leaf area index). Hence a good estimate of the parameters, as well as of their uncertainty, is essential for a reliable estimate of the potential recharge. In this study we focus on assessing the sensitivity of crop and soil types on the potential recharge by using the generic crop model STICS. The effect of uncertainty in the soil parameters on the estimates of recharge and its uncertainty is investigated. The multi-layer soil water parameters and their uncertainty are estimated by inversion of the STICS model using the GLUE approach. Surface soil moisture and LAI, either retrieved from microwave remote sensing data or measured in field plots (Sreelash et al., 2012), were found to provide good estimates of the soil water properties, and therefore both these data sets were used in this study to estimate the parameters and the potential recharge for a combination of soil-crop systems. These investigations were made in two field experimental catchments. The first one is in the tropical semi

  2. Quantifying Uncertainty in Instantaneous Orbital Data Products of TRMM over Indian Subcontinent

    Science.gov (United States)

    Jayaluxmi, I.; Nagesh, D.

    2013-12-01

    In the last 20 years, microwave radiometers have taken satellite images of earth's weather, proving to be a valuable tool for quantitative estimation of precipitation from space. However, along with the widespread acceptance of microwave-based precipitation products, it has also been recognized that they contain large uncertainties. While most uncertainty evaluation studies focus on the accuracy of rainfall accumulated over time (e.g., a season or year), evaluations of instantaneous rainfall intensities from satellite orbital data products are relatively rare. These instantaneous products are known to potentially cause large uncertainties during real-time flood forecasting studies at the watershed scale, especially over land regions, where the highly varying land surface emissivity offers a myriad of complications hindering accurate rainfall estimation. The error components of orbital data products also tend to interact nonlinearly with hydrologic modeling uncertainty. Keeping these in mind, the present study fosters the development of uncertainty analysis using instantaneous satellite orbital data products (version 7 of 1B11, 2A25, 2A23) derived from the passive and active sensors onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the TRMM microwave imager (TMI) and Precipitation Radar (PR). The study utilizes 11 years of orbital data from 2002 to 2012 over the Indian subcontinent and examines the influence of various error sources on the convective and stratiform precipitation types. The analysis, conducted over the land regions of India, investigates three sources of uncertainty in detail: 1) errors due to improper delineation of the rainfall signature within the microwave footprint (rain/no-rain classification), 2) uncertainty in the transfer function linking rainfall with TMI low-frequency channels, and 3) sampling errors owing to the narrow swath and infrequent visits of the TRMM sensors. Case study results obtained during the Indian summer

  3. Uncertainty in hydrological signatures for gauged and ungauged catchments

    Science.gov (United States)

    Westerberg, Ida K.; Wagener, Thorsten; Coxon, Gemma; McMillan, Hilary K.; Castellarin, Attilio; Montanari, Alberto; Freer, Jim

    2016-03-01

    Reliable information about hydrological behavior is needed for water-resource management and scientific investigations. Hydrological signatures quantify catchment behavior as index values, and can be predicted for ungauged catchments using a regionalization procedure. The prediction reliability is affected by data uncertainties for the gauged catchments used in prediction and by uncertainties in the regionalization procedure. We quantified signature uncertainty stemming from discharge data uncertainty for 43 UK catchments and propagated these uncertainties in signature regionalization, while accounting for regionalization uncertainty with a weighted-pooling-group approach. Discharge uncertainty was estimated using Monte Carlo sampling of multiple feasible rating curves. For each sampled rating curve, a discharge time series was calculated and used in deriving the gauged signature uncertainty distribution. We found that the gauged uncertainty varied with signature type, local measurement conditions and catchment behavior, with the highest uncertainties (median relative uncertainty ±30-40% across all catchments) for signatures measuring high- and low-flow magnitude and dynamics. Our regionalization method allowed assessing the role and relative magnitudes of the gauged and regionalized uncertainty sources in shaping the signature uncertainty distributions predicted for catchments treated as ungauged. We found that (1) if the gauged uncertainties were neglected there was a clear risk of overconditioning the regionalization inference, e.g., by attributing catchment differences resulting from gauged uncertainty to differences in catchment behavior, and (2) uncertainty in the regionalization results was lower for signatures measuring flow distribution (e.g., mean flow) than flow dynamics (e.g., autocorrelation), and for average flows (and then high flows) compared to low flows.
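
    A hedged sketch of the rating-curve sampling step described above, using a synthetic stage record and an assumed power-law rating curve: each sampled curve yields a discharge series and hence one sample of a low-flow signature (here Q95), from which its uncertainty distribution follows.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic daily stage record (m) and an assumed rating curve Q = a * (h - h0)^b.
    stage = 0.5 + 1.5 * rng.beta(2.0, 5.0, size=365)
    a_ref, b_ref, h0 = 12.0, 1.8, 0.2

    # Monte Carlo sampling of feasible rating curves (parameter uncertainty around
    # the reference curve); each sample gives one discharge series and one signature.
    n_curves = 1000
    a = a_ref * rng.lognormal(0.0, 0.05, n_curves)
    b = b_ref + rng.normal(0.0, 0.05, n_curves)

    q95 = np.empty(n_curves)           # low-flow signature: flow exceeded 95% of the time
    for i in range(n_curves):
        q = a[i] * np.maximum(stage - h0, 0.0) ** b[i]
        q95[i] = np.percentile(q, 5)

    lo, med, hi = np.percentile(q95, [2.5, 50, 97.5])
    print(f"Q95: {med:.2f} m3/s, 95% interval [{lo:.2f}, {hi:.2f}]"
          f" (~±{100 * (hi - lo) / (2 * med):.0f}%)")
    ```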

  4. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the wideness of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Method Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP 5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences of the upper limit of the uncertainty ranges.

  5. Uncertainty in Simulating Wheat Yields Under Climate Change

    Science.gov (United States)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.

  6. Quantify Risk to Manage Cost and Schedule

    National Research Council Canada - National Science Library

    Raymond, Fred

    1999-01-01

    Too many projects suffer from unachievable budget and schedule goals, caused by unrealistic estimates and the failure to quantify and communicate the uncertainty of these estimates to managers and sponsoring executives...

  7. Optical network scaling: roles of spectral and spatial aggregation.

    Science.gov (United States)

    Arık, Sercan Ö; Ho, Keang-Po; Kahn, Joseph M

    2014-12-01

    As the bit rates of routed data streams exceed the throughput of single wavelength-division multiplexing channels, spectral and spatial traffic aggregation become essential for optical network scaling. These aggregation techniques reduce network routing complexity by increasing spectral efficiency to decrease the number of fibers, and by increasing switching granularity to decrease the number of switching components. Spectral aggregation yields a modest decrease in the number of fibers but a substantial decrease in the number of switching components. Spatial aggregation yields a substantial decrease in both the number of fibers and the number of switching components. To quantify routing complexity reduction, we analyze the number of multi-cast and wavelength-selective switches required in a colorless, directionless and contentionless reconfigurable optical add-drop multiplexer architecture. Traffic aggregation has two potential drawbacks: reduced routing power and increased switching component size.

  8. How incorporating more data reduces uncertainty in recovery predictions

    Energy Technology Data Exchange (ETDEWEB)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  9. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da; Samaan, Nader A.; Makarov, Yuri V.; Huang, Zhenyu

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
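
    A simplified, hedged sketch of the general idea: fit an AR(1) model per area (a stand-in for the full ARIMA treatment), then simulate forecast-error realizations jointly from the empirical residual covariance so that the inter-area dependency is maintained; all data and dimensions are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Synthetic hourly load (MW) for two geographically related, correlated areas.
    t = np.arange(24 * 60)
    base = 100 + 20 * np.sin(2 * np.pi * t / 24)
    load = np.vstack([base + rng.normal(0, 3, t.size),
                      0.8 * base + rng.normal(0, 3, t.size) + 10])

    def fit_ar1(x):
        """Least-squares AR(1) fit: x_t = c + phi * x_{t-1} + e_t."""
        X = np.vstack([np.ones(x.size - 1), x[:-1]]).T
        c, phi = np.linalg.lstsq(X, x[1:], rcond=None)[0]
        return c, phi, x[1:] - (c + phi * x[:-1])

    fits = [fit_ar1(x) for x in load]
    resids = np.vstack([f[2] for f in fits])

    # Preserve the inter-area dependency: draw forecast errors jointly from the
    # empirical residual covariance (a simple stand-in for sequential Gaussian simulation).
    Lchol = np.linalg.cholesky(np.cov(resids))
    n_real, horizon = 200, 24
    sims = np.zeros((n_real, 2, horizon))
    for r in range(n_real):
        eps = Lchol @ rng.standard_normal((2, horizon))
        for area, (c, phi, _) in enumerate(fits):
            x = load[area, -1]
            for h in range(horizon):
                x = c + phi * x + eps[area, h]
                sims[r, area, h] = x

    q05, q95 = np.percentile(sims[:, 0, :], [5, 95], axis=0)
    print("area-1 day-ahead 5-95% band width (MW):", np.round((q95 - q05).mean(), 1))
    ```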

  10. Optimal planning and operation of aggregated distributed energy resources with market participation

    International Nuclear Information System (INIS)

    Calvillo, C.F.; Sánchez-Miralles, A.; Villar, J.; Martín, F.

    2016-01-01

    Highlights: • Price-maker optimization model for planning and operation of aggregated DER. • 3 Case studies are proposed, considering different electricity pricing scenarios. • Analysis of benefits and effect on electricity prices produced by DER aggregation. • Results showed considerable benefits even for relatively small aggregations. • Results suggest that the impact on prices should not be overlooked. - Abstract: This paper analyzes the optimal planning and operation of aggregated distributed energy resources (DER) with participation in the electricity market. Aggregators manage their portfolio of resources in order to obtain the maximum benefit from the grid, while participating in the day-ahead wholesale electricity market. The goal of this paper is to propose a model for aggregated DER systems planning, considering its participation in the electricity market and its impact on the market price. The results are the optimal planning and management of DER systems, and the appropriate energy transactions for the aggregator in the wholesale day-ahead market according to the size of its aggregated resources. A price-maker approach based on representing the market competitors with residual demand curves is followed, and the impact on the price is assessed to help in the decision of using price-maker or price-taker approaches depending on the size of the aggregated resources. A deterministic programming problem with two case studies (the average scenario and the most likely scenario from the stochastic ones), and a stochastic one with a case study to account for the market uncertainty are described. For both models, market scenarios have been built from historical data of the Spanish system. The results suggest that when the aggregated resources have enough size to follow a price-maker approach and the uncertainty of the markets is considered in the planning process, the DER systems can achieve up to 50% extra economic benefits, depending on the market

  11. Uncertainties in scaling factors for ab initio vibrational zero-point energies

    Science.gov (United States)

    Irikura, Karl K.; Johnson, Russell D.; Kacker, Raghu N.; Kessel, Rüdiger

    2009-03-01

    Vibrational zero-point energies (ZPEs) determined from ab initio calculations are often scaled by empirical factors. An empirical scaling factor partially compensates for the effects arising from vibrational anharmonicity and incomplete treatment of electron correlation. These effects are not random but are systematic. We report scaling factors for 32 combinations of theory and basis set, intended for predicting ZPEs from computed harmonic frequencies. An empirical scaling factor carries uncertainty. We quantify and report, for the first time, the uncertainties associated with scaling factors for ZPE. The uncertainties are larger than generally acknowledged; the scaling factors have only two significant digits. For example, the scaling factor for B3LYP/6-31G(d) is 0.9757±0.0224 (standard uncertainty). The uncertainties in the scaling factors lead to corresponding uncertainties in predicted ZPEs. The proposed method for quantifying the uncertainties associated with scaling factors is based upon the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. We also present a new reference set of 60 diatomic and 15 polyatomic "experimental" ZPEs that includes estimated uncertainties.
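
    A short GUM-style propagation sketch using the scaling factor and standard uncertainty quoted above; the harmonic ZPE value and its uncertainty are assumed purely for illustration.

    ```python
    import math

    # GUM-style propagation for a scaled zero-point energy, ZPE = c * ZPE_harm.
    # The scaling factor is the B3LYP/6-31G(d) value quoted in the abstract; the
    # harmonic ZPE and its uncertainty below are illustrative assumptions.
    c, u_c = 0.9757, 0.0224          # scaling factor and its standard uncertainty
    zpe_harm, u_zpe = 60.0, 0.1      # harmonic ZPE [kcal/mol] and its uncertainty

    zpe = c * zpe_harm
    # Combined standard uncertainty for a product of uncorrelated inputs.
    u = math.sqrt((zpe_harm * u_c) ** 2 + (c * u_zpe) ** 2)

    print(f"scaled ZPE = {zpe:.2f} kcal/mol, u = {u:.2f} kcal/mol")
    print(f"relative uncertainty ~ {100 * u / zpe:.1f}% (dominated by the scaling factor)")
    ```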

  12. Managing uncertainty in flood protection planning with climate projections

    Directory of Open Access Journals (Sweden)

    B. Dittes

    2018-04-01

    Full Text Available Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is
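
    A hedged numerical sketch of the visible/hidden decomposition described above: the spread of a finite projection ensemble supplies the visible variance, an assumed literature-based term supplies the hidden variance, and the two are combined into a predictive distribution for a design discharge. All discharge values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Hypothetical ensemble of projected 100-year discharges (m3/s) from a finite
    # set of climate projections: this spread is the "visible" uncertainty.
    projections = np.array([410.0, 455.0, 430.0, 470.0, 445.0, 500.0, 425.0])
    mean_vis = projections.mean()
    var_vis = projections.var(ddof=1)

    # The "hidden" uncertainty cannot be quantified from the ensemble itself and is
    # added from outside information (a literature-style estimate; value assumed here).
    sd_hidden = 40.0
    var_total = var_vis + sd_hidden ** 2

    # Predictive distribution after combining both terms; a planner might size the
    # protection for, e.g., the 90th percentile of this distribution.
    samples = rng.normal(mean_vis, np.sqrt(var_total), size=100_000)
    print(f"visible sd: {np.sqrt(var_vis):.0f} m3/s, total sd: {np.sqrt(var_total):.0f} m3/s")
    print(f"90th-percentile design discharge: {np.percentile(samples, 90):.0f} m3/s")
    ```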

  13. Managing uncertainty in flood protection planning with climate projections

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is robust to moderate changes in

  14. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  15. How to live with uncertainties?

    International Nuclear Information System (INIS)

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information, as well as the approach to quantify uncertainty in metrology, is addressed. A brief history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its achievements to date. Then, the potential future of the AK SIGMA is discussed based on its current tasks and on open scientific questions and future topics. (orig.)

  16. Role of uncertainty in the basalt waste isolation project

    International Nuclear Information System (INIS)

    Knepp, A.J.; Dahlem, D.H.

    1989-01-01

    The current national Civilian Radioactive Waste Management (CRWM) Program to select a mined geologic repository will likely require the extensive use of probabilistic techniques to quantify uncertainty in predictions of repository isolation performance. The performance of nonhomogeneous geologic, hydrologic, and chemical systems must be predicted over time frames of thousands of years and therefore will likely contain significant uncertainty. A qualitative assessment of our limited ability to interrogate the site in a nondestructive manner, coupled with the early stage of development of the pertinent geosciences, supports this statement. The success of the approach to incorporate what currently appears to be an appreciable element of uncertainty into the predictions of repository performance will play an important role in acquiring a license to operate and in establishing the level of safety associated with the concept of long-term geologic storage of nuclear waste. This paper presents a brief background on the Hanford Site and the repository program, references the sources that establish the legislative requirement to quantify uncertainties in performance predictions, and summarizes the present and future program at the Hanford Site in this area. The decision to quantify significant sources of uncertainties has had a major impact on the direction of the site characterization program at Hanford. The paper concludes with a number of observations on the impacts of this decision

  17. GENERAL: Kinetic Behaviors of Catalysis-Driven Growth of Three-Species Aggregates on Base of Exchange-Driven Aggregations

    Science.gov (United States)

    Sun, Yun-Fei; Chen, Dan; Lin, Zhen-Quan; Ke, Jian-Hong

    2009-06-01

    We propose a solvable aggregation model to mimic the evolution of population A, asset B, and the quantifiable resource C in a society. In this system, the population and asset aggregates themselves grow through self-exchanges with the rate kernels K1(k, j) = K1 k j and K2(k, j) = K2 k j, respectively. The actions of the population and asset aggregations on the aggregation evolution of resource aggregates are described by the population-catalyzed monomer death of resource aggregates and the asset-catalyzed monomer birth of resource aggregates with the rate kernels J1(k, j) = J1 k and J2(k, j) = J2 k, respectively. Meanwhile, the asset and resource aggregates conjunctly catalyze the monomer birth of population aggregates with the rate kernel I1(k, i, j) = I1 k i^μ j^η, and the population and resource aggregates conjunctly catalyze the monomer birth of asset aggregates with the rate kernel I2(k, i, j) = I2 k i^ν j^η. The kinetic behaviors of species A, B, and C are investigated by means of the mean-field rate equation approach. The effects of the population-catalyzed death and asset-catalyzed birth on the evolution of resource aggregates based on the self-exchanges of population and asset appear in effective forms. The coefficients of the effective population-catalyzed death and the asset-catalyzed birth are expressed as J1e = J1/K1 and J2e = J2/K2, respectively. The aggregate size distribution of the C species is found to be crucially dominated by the competition between the effective death and the effective birth. It satisfies the conventional scaling form, the generalized scaling form, and the modified scaling form in the cases of J1e < J2e, J1e = J2e, and J1e > J2e, respectively. Meanwhile, we also find the aggregate size distributions of populations and assets both fall into two distinct categories for different parameters μ, ν, and η: (i) when μ = ν = η = 0 and μ = ν = 0, η = 1, the population and asset aggregates obey the generalized scaling forms; and (ii) when μ = ν = 1, η = 0, and μ = ν = η = 1, the

  18. CSAU (Code Scaling, Applicability and Uncertainty)

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1989-01-01

    Best Estimate computer codes have been accepted by the U.S. Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, provided that their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs

  19. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  20. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  1. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as computational simulation codes to which they are applied.
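
    A hedged illustration of the embedded-propagation idea (uncertainty samples carried through the lowest-level operations) using a toy 1-D diffusion kernel rather than a stochastic Galerkin expansion; it contrasts an outer sampling loop with propagating the sample dimension inside the stencil update. Sizes, timings and the kernel itself are illustrative.

    ```python
    import time
    import numpy as np

    # Toy setting: a 1-D explicit diffusion solver with an uncertain diffusivity.
    n_cells, n_samples, n_steps = 2000, 32, 500
    rng = np.random.default_rng(7)
    diffusivity = rng.lognormal(0.0, 0.3, n_samples)   # one uncertain input per sample

    # Non-embedded: one full simulation per sample (outer sampling loop).
    u = np.zeros((n_samples, n_cells)); u[:, n_cells // 2] = 1.0
    t0 = time.perf_counter()
    for s in range(n_samples):
        x = u[s].copy()
        for _ in range(n_steps):
            x[1:-1] += 0.1 * diffusivity[s] * (x[2:] - 2 * x[1:-1] + x[:-2])
        u[s] = x
    t_outer = time.perf_counter() - t0

    # Embedded: the sample dimension is carried through every stencil update,
    # improving memory access and exposing fine-grained parallelism.
    v = np.zeros((n_samples, n_cells)); v[:, n_cells // 2] = 1.0
    t0 = time.perf_counter()
    for _ in range(n_steps):
        v[:, 1:-1] += 0.1 * diffusivity[:, None] * (v[:, 2:] - 2 * v[:, 1:-1] + v[:, :-2])
    t_embedded = time.perf_counter() - t0

    print(f"outer-loop sampling: {t_outer:.2f} s, embedded propagation: {t_embedded:.2f} s")
    print("results match:", np.allclose(u, v))
    ```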

  2. Estimates of bias and uncertainty in recorded external dose

    International Nuclear Information System (INIS)

    Fix, J.J.; Gilbert, E.S.; Baumgartner, W.V.

    1994-10-01

    A study is underway to develop an approach to quantify bias and uncertainty in recorded dose estimates for workers at the Hanford Site based on personnel dosimeter results. This paper focuses on selected experimental studies conducted to better define response characteristics of Hanford dosimeters. The study is more extensive than the experimental studies presented in this paper and includes detailed consideration and evaluation of other sources of bias and uncertainty. Hanford worker dose estimates are used in epidemiologic studies of nuclear workers. A major objective of these studies is to provide a direct assessment of the carcinogenic risk of exposure to ionizing radiation at low doses and dose rates. Considerations of bias and uncertainty in the recorded dose estimates are important in the conduct of this work. The method developed for use with Hanford workers can be considered an elaboration of the approach used to quantify bias and uncertainty in estimated doses for personnel exposed to radiation as a result of atmospheric testing of nuclear weapons between 1945 and 1962. This approach was first developed by a National Research Council (NRC) committee examining uncertainty in recorded film badge doses during atmospheric tests (NRC 1989). It involved quantifying both bias and uncertainty from three sources (i.e., laboratory, radiological, and environmental) and then combining them to obtain an overall assessment. Sources of uncertainty have been evaluated for each of three specific Hanford dosimetry systems (i.e., the Hanford two-element film dosimeter, 1944-1956; the Hanford multi-element film dosimeter, 1957-1971; and the Hanford multi-element TLD, 1972-1993) used to estimate personnel dose throughout the history of Hanford operations. Laboratory, radiological, and environmental sources of bias and uncertainty have been estimated based on historical documentation and, for angular response, on selected laboratory measurements

  3. Drivers and uncertainties of forecasted range shifts for warm-water fishes under climate and land cover change

    Science.gov (United States)

    Bouska, Kristen; Whitledge, Gregory W.; Lant, Christopher; Schoof, Justin

    2018-01-01

    Land cover is an important determinant of aquatic habitat and is projected to shift with climate changes, yet climate-driven land cover changes are rarely factored into climate assessments. To quantify impacts and uncertainty of coupled climate and land cover change on warm-water fish species’ distributions, we used an ensemble model approach to project distributions of 14 species. For each species, current range projections were compared to 27 scenario-based projections and aggregated to visualize uncertainty. Multiple regression and model selection techniques were used to identify drivers of range change. Novel, or no-analogue, climates were assessed to evaluate transferability of models. Changes in total probability of occurrence ranged widely across species, from a 63% increase to a 65% decrease. Distributional gains and losses were largely driven by temperature and flow variables and underscore the importance of habitat heterogeneity and connectivity to facilitate adaptation to changing conditions. Finally, novel climate conditions were driven by mean annual maximum temperature, which stresses the importance of understanding the role of temperature on fish physiology and the role of temperature-mitigating management practices.

  4. Feasibility Assessment of the Use of Recycled Aggregates for Asphalt Mixtures

    Directory of Open Access Journals (Sweden)

    F. C. G. Martinho

    2018-05-01

    Full Text Available The use of recycled aggregates, manufactured from several by-products, to replace virgin aggregates in the production of pavement asphalt mixtures needs to be encouraged. Nevertheless, there are some concerns and uncertainties about the actual environmental, economic and mechanical performance resulting from the incorporation of recycled aggregates in asphalt mixtures. Therefore, this paper has the goal of discussing important features to help decision makers select recycled aggregates as raw materials for asphalt mixtures. Based on the literature review carried out and the authors' own previous experience, the article's main findings reveal that incorporating some of the most common recycled aggregates into asphalt mixtures is feasible, even from a life-cycle analysis perspective. Although some specific technical operations are sometimes necessary when using recycled aggregates in asphalt mixtures, some benefits in terms of environmental impacts, energy use and costs are likely to be achieved, as well as with regard to the mechanical performance of the asphalt mixtures.

  5. Uncertainty in simulating wheat yields under climate change : Letter

    NARCIS (Netherlands)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Supit, I.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic

  6. Kinetic Behaviors of Catalysis-Driven Growth of Three-Species Aggregates on Base of Exchange-Driven Aggregations

    International Nuclear Information System (INIS)

    Sun Yunfei; Chen Dan; Lin Zhenquan; Ke Jianhong

    2009-01-01

    We propose a solvable aggregation model to mimic the evolution of population A, asset B, and the quantifiable resource C in a society. In this system, the population and asset aggregates themselves grow through self-exchanges with the rate kernels K1(k, j) = K1 k j and K2(k, j) = K2 k j, respectively. The actions of the population and asset aggregations on the aggregation evolution of resource aggregates are described by the population-catalyzed monomer death of resource aggregates and the asset-catalyzed monomer birth of resource aggregates with the rate kernels J1(k, j) = J1 k and J2(k, j) = J2 k, respectively. Meanwhile, the asset and resource aggregates conjunctly catalyze the monomer birth of population aggregates with the rate kernel I1(k, i, j) = I1 k i^μ j^η, and the population and resource aggregates conjunctly catalyze the monomer birth of asset aggregates with the rate kernel I2(k, i, j) = I2 k i^ν j^η. The kinetic behaviors of species A, B, and C are investigated by means of the mean-field rate equation approach. The effects of the population-catalyzed death and asset-catalyzed birth on the evolution of resource aggregates based on the self-exchanges of population and asset appear in effective forms. The coefficients of the effective population-catalyzed death and the asset-catalyzed birth are expressed as J1e = J1/K1 and J2e = J2/K2, respectively. The aggregate size distribution of the C species is found to be crucially dominated by the competition between the effective death and the effective birth. It satisfies the conventional scaling form, the generalized scaling form, and the modified scaling form in the cases of J1e < J2e, J1e = J2e, and J1e > J2e, respectively. Meanwhile, we also find the aggregate size distributions of populations and assets both fall into two distinct categories for different parameters μ, ν, and η: (i) when μ = ν = η = 0 and μ = ν = 0, η = 1, the population and asset aggregates obey the generalized

  7. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address those needs. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  8. Long Memory, Fractional Integration, and Cross-Sectional Aggregation

    DEFF Research Database (Denmark)

    Haldrup, Niels; Vera-Valdés, Eduardo

    under certain conditions and that the aggregated series will have an autocorrelation function that exhibits hyperbolic decay. In this paper, we further analyze this phenomenon. We demonstrate that the aggregation argument leading to long memory is consistent with a wide range of definitions of long memory. In a simulation study we seek to quantify Granger's result and find that indeed both the time series and cross-sectional dimensions have to be rather significant to reflect the theoretical asymptotic results. Long memory can result even for moderate T,N dimensions but can vary considerably from...
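
    A small simulation sketch of Granger's aggregation argument referred to above: averaging many AR(1) series whose coefficients concentrate near one yields an aggregate whose autocorrelations decay slowly; the Beta parameters and the T, N dimensions are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Cross-sectional aggregation: N AR(1) series with coefficients drawn so that
    # phi^2 ~ Beta(2, 1), putting substantial mass near one (illustrative choice).
    N, T, burn = 2000, 5000, 500
    phi = np.sqrt(rng.beta(2.0, 1.0, N))

    x = np.zeros(N)
    agg = np.empty(T)
    for t in range(burn + T):
        x = phi * x + rng.standard_normal(N)
        if t >= burn:
            agg[t - burn] = x.mean()       # cross-sectional aggregate

    def acf(y, lags):
        y = y - y.mean()
        denom = np.dot(y, y)
        return np.array([np.dot(y[:-k], y[k:]) / denom for k in lags])

    lags = [1, 10, 50, 100, 200]
    print(dict(zip(lags, np.round(acf(agg, lags), 3))))   # slow, hyperbolic-like decay
    ```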

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses

  11. Laser tracker TSPI uncertainty quantification via centrifuge trajectory

    Science.gov (United States)

    Romero, Edward; Paez, Thomas; Brown, Timothy; Miller, Timothy

    2009-08-01

    Sandia National Laboratories currently utilizes two laser tracking systems to provide time-space-position-information (TSPI) and high speed digital imaging of test units under flight. These laser trackers have been in operation for decades under the premise of theoretical accuracies based on system design and operator estimates. Advances in optical imaging and atmospheric tracking technology have enabled opportunities to provide more precise six degree of freedom measurements from these trackers. Applying these technologies to the laser trackers requires a quantified understanding of their current errors and uncertainty. It was well understood that an assortment of variables contributed to laser tracker uncertainty, but the magnitude of these contributions was not quantified and documented. A series of experiments was performed at Sandia National Laboratories' large centrifuge complex to quantify TSPI uncertainties of Sandia National Laboratories' laser tracker III. The centrifuge was used to provide repeatable and economical test unit trajectories for TSPI comparison and uncertainty analysis. On a centrifuge, test units undergo a known trajectory continuously with a known angular velocity. Each revolution may represent an independent test, which may be repeated many times over to obtain quantities of data practical for statistical analysis. Previously these tests were performed at Sandia's rocket sled track facility but were found to be costly, with challenges in measuring ground-truth TSPI. The centrifuge, along with on-board measurement equipment, was used to provide known ground-truth positions of test units. This paper discusses the experimental design and techniques used to arrive at measures of laser tracker error and uncertainty.
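
    As a hedged illustration of the statistical idea described above (treating each centrifuge revolution as an independent test), the sketch below compares simulated tracker readings against a known circular ground-truth trajectory and summarises the error per revolution. The radius, noise level, and bias are hypothetical placeholders, not Sandia's measured values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: ground-truth circular trajectory of the centrifuge arm and
# tracker readings for many revolutions (angles in radians, positions in metres).
radius = 8.0
n_revs, n_pts = 50, 360
theta = np.linspace(0.0, 2.0 * np.pi, n_pts, endpoint=False)
truth = radius * np.stack([np.cos(theta), np.sin(theta)], axis=-1)

# Simulated tracker measurements with a small bias and random noise.
measured = truth + 0.002 + rng.normal(0.0, 0.005, size=(n_revs, n_pts, 2))

# Radial error per point, treating each revolution as an independent test.
errors = np.linalg.norm(measured - truth, axis=-1)
per_rev_rms = np.sqrt((errors ** 2).mean(axis=1))

print("mean RMS error [m]:", per_rev_rms.mean())
print("std of RMS across revolutions [m]:", per_rev_rms.std(ddof=1))
```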

  12. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    Full Text Available For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
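
    The following is a minimal sketch of the stochastic-sampling idea behind the XSUSA-style approach described above: draw perturbed cross sections from an assumed covariance matrix, evaluate a (here, toy) neutronics model for each sample, and summarise the spread of the multiplication factor. The few-group data, covariance, and infinite-medium formula are illustrative assumptions, not the benchmark models from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical nominal few-group cross sections and their covariance (the real
# XSUSA workflow perturbs full nuclear data libraries and reruns a core simulator).
nominal = np.array([0.010, 0.100, 0.012])   # absorption, scattering, nu-fission
cov = np.diag((0.03 * nominal) ** 2)        # assumed 3 % relative standard deviation

def keff(xs):
    sig_a, sig_s, nu_sig_f = xs
    # Toy infinite-medium multiplication factor; scattering is carried but unused here.
    return nu_sig_f / sig_a

samples = rng.multivariate_normal(nominal, cov, size=500)
k_samples = np.array([keff(x) for x in samples])

print("mean k:", k_samples.mean())
print("relative k uncertainty [%]:", 100 * k_samples.std(ddof=1) / k_samples.mean())
```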

  13. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
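
    As a hedged sketch of the single-loop idea described above (an auxiliary uniform variable carries the aleatory randomness while the epistemic distribution parameters are sampled in the same loop), the example below estimates a failure probability for a toy limit state. The limit state, distributions, and parameter values are hypothetical and far simpler than the paper's Gaussian-process-based treatment.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Limit state g = capacity - load; failure when g < 0.
# Aleatory: load ~ Normal(mu, sigma); epistemic: mu itself uncertain from sparse data.
n = 100_000
mu_load = rng.normal(50.0, 2.0, size=n)            # epistemic uncertainty in the mean
u = rng.uniform(size=n)                            # auxiliary variable (probability integral transform)
load = stats.norm.ppf(u, loc=mu_load, scale=5.0)   # aleatory sample conditioned on mu

capacity = 70.0
pf = np.mean(capacity - load < 0.0)
print("failure probability (single-loop MCS):", pf)
```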

  14. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence.

    Directory of Open Access Journals (Sweden)

    Andrea L Araujo Navas

    2016-12-01

    Full Text Available Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections we identified: (1) the main uncertainty sources, their definition and quantification and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. None of the studies considered adequately all sources of uncertainties. We highlighted the need for uncertainty in the morbidity indicator and predictor variable to be incorporated into the modelling framework. Study design and spatial support require further attention and uncertainty associated with Earth observation data should be quantified. Finally, more attention should be given to mapping and interpreting

  15. Human perception and the uncertainty principle

    International Nuclear Information System (INIS)

    Harney, R.C.

    1976-01-01

    The concept of the uncertainty principle that position and momentum cannot be simultaneously specified to arbitrary accuracy is somewhat difficult to reconcile with experience. This note describes order-of-magnitude calculations which quantify the inadequacy of human perception with regard to direct observation of the breakdown of the trajectory concept implied by the uncertainty principle. Even with the best optical microscope, human vision is inadequate by three orders of magnitude. 1 figure
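
    A back-of-the-envelope calculation in the spirit of the note, with purely illustrative numbers: the momentum uncertainty implied by localizing a just-resolvable particle with an optical microscope is far too small for human perception to notice.

```python
# Order-of-magnitude check: momentum uncertainty implied by resolving a particle's
# position to the diffraction limit of an optical microscope (values illustrative).
hbar = 1.055e-34          # J*s
delta_x = 2.0e-7          # m, ~0.2 micrometre optical resolution
mass = 1.0e-15            # kg, a small dust grain

delta_p = hbar / (2.0 * delta_x)          # Heisenberg lower bound
delta_v = delta_p / mass                  # implied velocity uncertainty

print(f"delta_p >= {delta_p:.2e} kg m/s")
print(f"delta_v >= {delta_v:.2e} m/s")    # far below anything the eye could notice
```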

  16. Collaborative Research: Quantifying the Uncertainties of Aerosol Indirect Effects and Impacts on Decadal-Scale Climate Variability in NCAR CAM5 and CESM1

    Energy Technology Data Exchange (ETDEWEB)

    Nenes, Athanasios [Georgia Inst. of Technology, Atlanta, GA (United States)

    2017-06-23

    The goal of this proposed project is to assess the climatic importance and sensitivity of the aerosol indirect effect (AIE) to cloud and aerosol processes and feedbacks, which include organic aerosol hygroscopicity, cloud condensation nuclei (CCN) activation kinetics, giant CCN, cloud-scale entrainment, ice nucleation in mixed-phase and cirrus clouds, and treatment of subgrid variability of vertical velocity. A key objective was to link aerosol, cloud microphysics and dynamics feedbacks in CAM5 with a suite of internally consistent and integrated parameterizations that provide the appropriate degrees of freedom to capture the various aspects of the aerosol indirect effect. The proposal integrated new parameterization elements into the cloud microphysics, moist turbulence and aerosol modules used by the NCAR Community Atmospheric Model version 5 (CAM5). The CAM5 model was then used to systematically quantify the uncertainties of aerosol indirect effects through a series of sensitivity tests with present-day and preindustrial aerosol emissions. New parameterization elements were developed as a result of these efforts, and new diagnostic tools and methodologies were also developed to quantify the impacts of aerosols on clouds and climate within fully coupled models. Observations were used to constrain key uncertainties in the aerosol-cloud links. Advanced sensitivity tools were developed and implemented to probe the drivers of cloud microphysical variability at unprecedented temporal and spatial scales. All these results have been published in top and high impact journals (or are in the final stages of publication). This proposal has also supported a number of outstanding graduate students.

  17. Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site

    Science.gov (United States)

    Wang, Yu; Aladejare, Adeyemi Emman

    2016-09-01

    Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from the quantitative GSI chart, as an alternative to the direct observational method, which requires vast geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplifications. The GSI chart thereby contains model uncertainty which arises from its development. The presence of such model uncertainty affects the GSI estimated from the GSI chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty during GSI estimation from the GSI chart. A major challenge for quantifying the GSI chart model uncertainty is a lack of the original datasets that were used to develop the GSI chart, since the GSI chart was developed from past experience without referring to specific datasets. This paper intends to tackle this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill-and-blast tunnel project. The proposed approach effectively tackles the problem of how to quantify the model uncertainty that arises from using the GSI chart for characterization of site-specific GSI in a transparent manner.
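
    As a hedged, highly simplified sketch of generating "equivalent samples" of a site GSI by Bayesian updating, the example below runs a random-walk Metropolis sampler with a Gaussian likelihood whose spread stands in for the chart model uncertainty. The observations, prior, and likelihood form are hypothetical and are not the equations derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical site observations of GSI read off the chart.
obs = np.array([55.0, 60.0, 52.0, 58.0, 61.0])
chart_sd = 5.0                     # assumed chart (model) uncertainty, GSI points
prior_mu, prior_sd = 50.0, 15.0    # vague prior on the site mean GSI

def log_post(mu):
    log_prior = -0.5 * ((mu - prior_mu) / prior_sd) ** 2
    log_like = -0.5 * np.sum(((obs - mu) / chart_sd) ** 2)
    return log_prior + log_like

# Random-walk Metropolis to draw equivalent samples of the site mean GSI.
mu, chain = 55.0, []
for _ in range(20_000):
    prop = mu + rng.normal(0.0, 2.0)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)

chain = np.array(chain[5_000:])    # discard burn-in
print("posterior mean GSI:", chain.mean(), "+/-", chain.std(ddof=1))
```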

  18. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  19. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
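
    The appendices above mention aggregated expert responses; as a hedged illustration of one common aggregation scheme (equal-weight pooling of distributions fitted to elicited quantiles), the sketch below pools three hypothetical experts for a single variable. The quantile values and the lognormal fit are illustrative assumptions, not the study's elicitation protocol.

```python
import numpy as np

# Hypothetical elicited 5th/50th/95th percentiles from three experts for one
# consequence variable; equal-weight pooling of their fitted distributions.
expert_quantiles = {
    "expert_1": (0.02, 0.10, 0.40),
    "expert_2": (0.05, 0.15, 0.60),
    "expert_3": (0.01, 0.08, 0.30),
}

rng = np.random.default_rng(5)
pooled = []
for q05, q50, q95 in expert_quantiles.values():
    # Two-point lognormal fit through the median and 95th percentile
    # (the elicited 5th percentile is shown for context but not used here).
    mu = np.log(q50)
    sigma = (np.log(q95) - mu) / 1.645
    pooled.append(rng.lognormal(mu, sigma, size=10_000))

pooled = np.concatenate(pooled)        # equal-weight mixture of the three experts
print("pooled 5th/50th/95th:", np.percentile(pooled, [5, 50, 95]))
```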

  20. A Multi-objective Model for Transmission Planning Under Uncertainties

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Wang, Qi; Ding, Yi

    2014-01-01

    The significant growth of distributed energy resources (DERs) associated with smart grid technologies has prompted excessive uncertainties in the transmission system. The most representative is the novel notation of commercial aggregator who has lighted a bright way for DERs to participate power...

  1. What do recent advances in quantifying climate and carbon cycle uncertainties mean for climate policy?

    International Nuclear Information System (INIS)

    House, Joanna I; Knorr, Wolfgang; Cornell, Sarah E; Prentice, I Colin; Huntingford, Chris; Cox, Peter M; Harris, Glen R; Jones, Chris D; Lowe, Jason A

    2008-01-01

    Global policy targets for greenhouse gas emissions reductions are being negotiated. The amount of emitted carbon dioxide remaining in the atmosphere is controlled by carbon cycle processes in the ocean and on land. These processes are themselves affected by climate. The resulting 'climate-carbon cycle feedback' has recently been quantified, but the policy implications have not. Using a scheme to emulate the range of state-of-the-art model results for climate feedback strength, including the modelled range of climate sensitivity and other key uncertainties, we analyse recent global targets. The G8 target of a 50% cut in emissions by 2050 leaves CO2 concentrations rising rapidly, approaching 1000 ppm by 2300. The Stern Review's proposed 25% cut in emissions by 2050, continuing to an 80% cut, does in fact approach stabilization of CO2 concentration on a policy-relevant (century) timescale, with most models projecting concentrations between 500 and 600 ppm by 2100. However concentrations continue to rise gradually. Long-term stabilization at 550 ppm CO2 requires cuts in emissions of 81 to 90% by 2300, and more beyond as a portion of the CO2 emitted persists for centuries to millennia. Reductions of other greenhouse gases cannot compensate for the long-term effects of emitting CO2.

  2. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  3. Managing structural uncertainty in health economic decision models: a discrepancy approach

    OpenAIRE

    Strong, M.; Oakley, J.; Chilcott, J.

    2012-01-01

    Healthcare resource allocation decisions are commonly informed by computer model predictions of population mean costs and health effects. It is common to quantify the uncertainty in the prediction due to uncertain model inputs, but methods for quantifying uncertainty due to inadequacies in model structure are less well developed. We introduce an example of a model that aims to predict the costs and health effects of a physical activity promoting intervention. Our goal is to develop a framewor...

  4. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
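
    To make the profile-to-steric-height propagation concrete, the sketch below runs a small Monte Carlo over per-level temperature and salinity perturbations using a linear equation of state. The profile shape, observational uncertainties, and expansion coefficients are illustrative assumptions; real analyses use the full (e.g., TEOS-10) equation of state and error covariances.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical profile: 50 layers over the top 2000 m with per-level observation uncertainty.
depth = np.linspace(0.0, 2000.0, 51)
dz = np.diff(depth)
temp = 15.0 * np.exp(-depth[:-1] / 500.0) + 2.0     # degC at layer mid-points
salt = np.full_like(temp, 35.0)                     # psu
sd_temp, sd_salt = 0.01, 0.01                       # assumed observational uncertainties

alpha, beta = 2.0e-4, 7.6e-4   # thermal expansion / haline contraction (linear EOS)

def steric_height(t, s):
    # Steric anomaly relative to a fixed reference state, in metres.
    return np.sum((alpha * (t - 10.0) - beta * (s - 35.0)) * dz)

draws = [
    steric_height(temp + rng.normal(0.0, sd_temp, temp.shape),
                  salt + rng.normal(0.0, sd_salt, salt.shape))
    for _ in range(2000)
]
print("steric height [m]:", np.mean(draws), "+/-", np.std(draws, ddof=1))
```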

  5. Reusing recycled aggregates in structural concrete

    Science.gov (United States)

    Kou, Shicong

    The utilization of recycled aggregates in concrete can minimize environmental impact and reduce the consumption of natural resources in concrete applications. The aim of this thesis is to provide a scientific basis for the possible use of recycled aggregates in structural concrete by conducting a comprehensive programme of laboratory study to gain a better understanding of the mechanical, microstructural and durability properties of concrete produced with recycled aggregates. The study also explored possible techniques to improve the properties of recycled aggregate concrete that is produced with high percentages (≧ 50%) of recycled aggregates. These techniques included: (a) using lower water-to-cement ratios in the concrete mix design; (b) using fly ash as a cement replacement or as an additional mineral admixture in the concrete mixes, and (c) precasting recycled aggregate concrete with steam curing regimes. The characteristics of the recycled aggregates produced both in the laboratory and at a commercially operated pilot construction and demolition (C&D) waste recycling plant were first studied. A mix proportioning procedure was then established to produce six series of concrete mixtures using different percentages of recycled coarse aggregates with and without the use of fly ash. Water-to-cement (binder) ratios of 0.55, 0.50, 0.45 and 0.40 were used. The fresh properties (including slump and bleeding) of recycled aggregate concrete (RAC) were then quantified. The effects of fly ash on the fresh and hardened properties of RAC were then studied and compared with those of RAC prepared with no fly ash addition. Furthermore, the effects of steam curing on the hardened properties of RAC were investigated. For micro-structural properties, the interfacial transition zones of the aggregates and the mortar/cement paste were analyzed by SEM and EDX-mapping. Moreover, a detailed set of results on the fracture properties for RAC were obtained. Based on the experimental

  6. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  7. A fuzzy stochastic framework for managing hydro-environmental and socio-economic interactions under uncertainty

    Science.gov (United States)

    Subagadis, Yohannes Hagos; Schütze, Niels; Grundmann, Jens

    2014-05-01

    An amplified interconnectedness between hydro-environmental and socio-economic systems brings about profound challenges for water management decision making. In this contribution, we present a fuzzy stochastic approach to solve a set of decision making problems, which involve hydrologically, environmentally, and socio-economically motivated criteria subjected to uncertainty and ambiguity. The proposed methodological framework combines objective and subjective criteria in a decision making procedure for obtaining an acceptable ranking of water resources management alternatives under different types of uncertainty (subjective/objective) and heterogeneous information (quantitative/qualitative) simultaneously. The first step of the proposed approach involves evaluating the performance of alternatives with respect to different types of criteria. The ratings of alternatives with respect to objective and subjective criteria are evaluated by simulation-based optimization and fuzzy linguistic quantifiers, respectively. Subjective and objective uncertainties related to the input information are handled by linking fuzziness and randomness together. Fuzzy decision making helps capture the linguistic uncertainty, and a Monte Carlo simulation process is used to map stochastic uncertainty. With this framework, the overall performance of each alternative is calculated using an Ordered Weighted Averaging (OWA) aggregation operator accounting for decision makers' experience and opinions. Finally, ranking is achieved by conducting pair-wise comparison of management alternatives. This has been done on the basis of the risk defined by the probability of obtaining an acceptable ranking and the mean difference in total performance for the pair of management alternatives. The proposed methodology is tested in a real-world hydrosystem, to find effective and robust intervention strategies for the management of a coastal aquifer system affected by saltwater intrusion due to excessive groundwater
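
    As a hedged sketch of the Ordered Weighted Averaging aggregation step mentioned above, the example below applies OWA weights to the sorted criterion scores of one hypothetical alternative. The ratings and weights are placeholders, not values from the coastal-aquifer case study.

```python
import numpy as np

def owa(scores, weights):
    # Ordered Weighted Averaging: weights are applied to the sorted scores,
    # not to particular criteria, so they encode the decision maker's attitude.
    ordered = np.sort(scores)[::-1]            # descending order
    return float(np.dot(ordered, weights))

# Hypothetical normalized ratings of one management alternative on four criteria
# (hydrological, environmental, economic, social) and "most"-type quantifier weights.
scores = np.array([0.8, 0.6, 0.9, 0.5])
weights = np.array([0.1, 0.2, 0.3, 0.4])       # must sum to 1

print("overall performance:", owa(scores, weights))
```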

  8. Evaluation of uncertainty of adaptive radiation therapy

    International Nuclear Information System (INIS)

    Garcia Molla, R.; Gomez Martin, C.; Vidueira, L.; Juan-Senabre, X.; Garcia Gomez, R.

    2013-01-01

    This work is part of the testing required for acceptance into clinical practice. The uncertainties of adaptive radiotherapy, around which this study is organised, can be divided into two main parts: dosimetry on the CBCT and RDI. At each stage the uncertainties are quantified, and from the total an action level can be obtained above which it would be reasonable to adapt the plan. (Author)

  9. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  10. Uncertainty Analysis of Spectral Irradiance Reference Standards Used for NREL Calibrations

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Andreas, A.; Reda, I.; Campanelli, M.; Stoffel, T.

    2013-05-01

    Spectral irradiance produced by lamp standards such as the National Institute of Standards and Technology (NIST) FEL-type tungsten halogen lamps is used to calibrate spectroradiometers at the National Renewable Energy Laboratory. Spectroradiometers are often used to characterize the spectral irradiance of solar simulators, which in turn are used to characterize photovoltaic device performance, e.g., power output and spectral response. Therefore, quantifying the calibration uncertainty of spectroradiometers is critical to understanding photovoltaic system performance. In this study, we attempted to reproduce the NIST-reported input variables, including the calibration uncertainty in spectral irradiance for a standard NIST lamp, and to quantify the uncertainty of the measurement setup at the Optical Metrology Laboratory at the National Renewable Energy Laboratory.
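
    The abstract does not list the uncertainty budget, so the following is only a generic, hedged sketch of combining calibration uncertainty components in the usual GUM root-sum-square way. The component names and magnitudes are illustrative assumptions, not NIST or NREL figures.

```python
import numpy as np

# Hypothetical relative standard uncertainties (in %) for a spectroradiometer
# calibration at one wavelength; values are illustrative, not NIST/NREL figures.
components = {
    "lamp_irradiance (certificate)": 0.6,
    "lamp_current_regulation": 0.2,
    "distance_alignment": 0.3,
    "detector_noise": 0.15,
    "stray_light": 0.1,
}

combined = np.sqrt(sum(u ** 2 for u in components.values()))   # root-sum-square
expanded = 2.0 * combined                                      # k = 2 coverage (~95 %)

print(f"combined standard uncertainty: {combined:.2f} %")
print(f"expanded uncertainty (k=2):    {expanded:.2f} %")
```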

  11. Quantifying uncertainty for predictions with model error in non-Gaussian systems with intermittency

    International Nuclear Information System (INIS)

    Branicki, Michal; Majda, Andrew J

    2012-01-01

    This paper discusses a range of important mathematical issues arising in applications of a newly emerging stochastic-statistical framework for quantifying and mitigating uncertainties associated with prediction of partially observed and imperfectly modelled complex turbulent dynamical systems. The need for such a framework is particularly severe in climate science where the true climate system is vastly more complicated than any conceivable model; however, applications in other areas, such as neural networks and materials science, are just as important. The mathematical tools employed here rely on empirical information theory and fluctuation–dissipation theorems (FDTs) and it is shown that they seamlessly combine into a concise systematic framework for measuring and optimizing consistency and sensitivity of imperfect models. Here, we utilize a simple statistically exactly solvable ‘perfect’ system with intermittent hidden instabilities and with time-periodic features to address a number of important issues encountered in prediction of much more complex dynamical systems. These problems include the role and mitigation of model error due to coarse-graining, moment closure approximations, and the memory of initial conditions in producing short, medium and long-range predictions. Importantly, based on a suite of increasingly complex imperfect models of the perfect test system, we show that the predictive skill of the imperfect models and their sensitivity to external perturbations is improved by ensuring their consistency on the statistical attractor (i.e. the climate) with the perfect system. Furthermore, the discussed link between climate fidelity and sensitivity via the FDT opens up an enticing prospect of developing techniques for improving imperfect model sensitivity based on specific tests carried out in the training phase of the unperturbed statistical equilibrium/climate. (paper)

  12. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

    of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models...... than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi...

  13. Optimised performance of a plug-in electric vehicle aggregator in energy and reserve markets

    International Nuclear Information System (INIS)

    Shafie-khah, M.; Moghaddam, M.P.; Sheikh-El-Eslami, M.K.; Catalão, J.P.S.

    2015-01-01

    Highlights: • A new model is developed to optimise the performance of a PEV aggregator in the power market. • The PEV aggregator can combine the PEVs and manage the charge/discharge of their batteries. • A new approach to calculate the satisfaction/motivation of PEV owners is proposed. • Several uncertainties are taken into account using a two-stage stochastic programming approach. • The proposed model is proficient in significantly improving the short- and long-term behaviour. - Abstract: In this paper, a new model is developed to optimise the performance of a plug-in Electric Vehicle (EV) aggregator in electricity markets, considering both short- and long-term horizons. The EV aggregator, as a new player in the power market, can aggregate the EVs and manage the charge/discharge of their batteries. The aggregator maximises the profit and optimises EV owners' revenue by applying changes in tariffs to compete with other market players for retaining current customers and acquiring new owners. On this basis, a new approach to calculate the satisfaction/motivation of EV owners and their market participation is proposed in this paper. Moreover, the behaviour of owners in selecting their supplying company is considered. The aggregator optimises the self-scheduling programme and submits the best bidding/offering strategies to the day-ahead and real-time markets. To achieve this purpose, the day-ahead and real-time energy and reserve markets are modelled as oligopoly markets, in contrast with previous works that utilised perfectly competitive ones. Furthermore, several uncertainties and constraints are taken into account using a two-stage stochastic programming approach, which have not been addressed in previous works. The numerical studies show the effectiveness of the proposed model

  14. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.

  15. Bifunctional fluorescent probes for detection of amyloid aggregates and reactive oxygen species.

    Science.gov (United States)

    Needham, Lisa-Maria; Weber, Judith; Fyfe, James W B; Kabia, Omaru M; Do, Dung T; Klimont, Ewa; Zhang, Yu; Rodrigues, Margarida; Dobson, Christopher M; Ghandi, Sonia; Bohndiek, Sarah E; Snaddon, Thomas N; Lee, Steven F

    2018-02-01

    Protein aggregation into amyloid deposits and oxidative stress are key features of many neurodegenerative disorders including Parkinson's and Alzheimer's disease. We report here the creation of four highly sensitive bifunctional fluorescent probes, capable of H2O2 and/or amyloid aggregate detection. These bifunctional sensors use a benzothiazole core for amyloid localization and boronic ester oxidation to specifically detect H2O2. We characterized the optical properties of these probes using both bulk fluorescence measurements and single-aggregate fluorescence imaging, and quantify changes in their fluorescence properties upon addition of amyloid aggregates of α-synuclein and pathophysiological H2O2 concentrations. Our results indicate these new probes will be useful to detect and monitor neurodegenerative disease.

  16. Bifunctional fluorescent probes for detection of amyloid aggregates and reactive oxygen species

    Science.gov (United States)

    Needham, Lisa-Maria; Weber, Judith; Fyfe, James W. B.; Kabia, Omaru M.; Do, Dung T.; Klimont, Ewa; Zhang, Yu; Rodrigues, Margarida; Dobson, Christopher M.; Ghandi, Sonia; Bohndiek, Sarah E.; Snaddon, Thomas N.; Lee, Steven F.

    2018-02-01

    Protein aggregation into amyloid deposits and oxidative stress are key features of many neurodegenerative disorders including Parkinson's and Alzheimer's disease. We report here the creation of four highly sensitive bifunctional fluorescent probes, capable of H2O2 and/or amyloid aggregate detection. These bifunctional sensors use a benzothiazole core for amyloid localization and boronic ester oxidation to specifically detect H2O2. We characterized the optical properties of these probes using both bulk fluorescence measurements and single-aggregate fluorescence imaging, and quantify changes in their fluorescence properties upon addition of amyloid aggregates of α-synuclein and pathophysiological H2O2 concentrations. Our results indicate these new probes will be useful to detect and monitor neurodegenerative disease.

  17. Uncertainty contributions to low flow projections in Austria

    Science.gov (United States)

    Parajka, J.; Blaschke, A. P.; Blöschl, G.; Haslinger, K.; Hepp, G.; Laaha, G.; Schöner, W.; Trautvetter, H.; Viglione, A.; Zessner, M.

    2015-11-01

    The main objective of the paper is to understand the contributions to the uncertainty in low flow projections resulting from hydrological model uncertainty and climate projection uncertainty. Model uncertainty is quantified by different parameterizations of a conceptual semi-distributed hydrologic model (TUWmodel) using 11 objective functions in three different decades (1976-1986, 1987-1997, 1998-2008), which allows disentangling the effect of modeling uncertainty and temporal stability of model parameters. Climate projection uncertainty is quantified by four future climate scenarios (ECHAM5-A1B, A2, B1 and HADCM3-A1B) using a delta change approach. The approach is tested for 262 basins in Austria. The results indicate that the seasonality of the low flow regime is an important factor affecting the performance of model calibration in the reference period and the uncertainty of Q95 low flow projections in the future period. In Austria, the calibration uncertainty in terms of Q95 is larger in basins with a summer low flow regime than in basins with a winter low flow regime. Using different calibration periods may result in a range of up to 60 % in simulated Q95 low flows. The low flow projections show an increase of low flows in the Alps, typically in the range of 10-30 % and a decrease in the south-eastern part of Austria mostly in the range -5 to -20 % for the period 2021-2050 relative to the reference period 1976-2008. The change in seasonality varies between scenarios, but there is a tendency for earlier low flows in the Northern Alps and later low flows in Eastern Austria. In 85 % of the basins, the uncertainty in Q95 from model calibration is larger than the uncertainty from different climate scenarios. The total uncertainty of Q95 projections is the largest in basins with a winter low flow regime and, in some basins, exceeds 60 %. In basins with summer low flows, the total uncertainty is mostly less than 20 %. While the calibration uncertainty dominates over climate
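
    As a hedged sketch of the delta change approach mentioned above, the example below perturbs an observed daily forcing series with monthly GCM change factors and recomputes a Q95 low-flow statistic with a deliberately crude placeholder runoff model (nothing like TUWmodel). All numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic observed daily precipitation/temperature and GCM-derived monthly changes.
days = 365
month_of_day = np.repeat(np.arange(12), [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
precip_obs = rng.gamma(2.0, 2.0, size=days)                      # mm/day
temp_obs = 10.0 + 10.0 * np.sin(2 * np.pi * np.arange(days) / 365.0)

delta_p = np.array([1.05] * 3 + [0.95] * 6 + [1.05] * 3)          # multiplicative, per month
delta_t = np.full(12, 1.5)                                        # additive warming, K

precip_fut = precip_obs * delta_p[month_of_day]
temp_fut = temp_obs + delta_t[month_of_day]

def q95_low_flow(precip, temp):
    # Placeholder "hydrologic model": runoff after a crude temperature-driven loss.
    runoff = np.maximum(precip - 0.1 * np.maximum(temp, 0.0), 0.0)
    return np.percentile(runoff, 5)            # flow exceeded 95 % of the time

print("Q95 reference:", q95_low_flow(precip_obs, temp_obs))
print("Q95 scenario: ", q95_low_flow(precip_fut, temp_fut))
```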

  18. Economics of climate change under uncertainty. Benefits of flexibility

    International Nuclear Information System (INIS)

    Anda, Jon; Golub, Alexander; Strukova, Elena

    2009-01-01

    The selection of climate policy has to be made in an extremely uncertain environment: both benefits and costs of a particular climate policy are unknown and in the best case can be described by probability distributions of various outcomes. The expected value approach to cost-benefit analysis of climate policy under uncertainty, which dominates the literature, relies on aggregated estimates of the various outcomes of climate policy weighted and averaged by their probabilities. The variance, skewness, and kurtosis are important characteristics of uncertainties but can easily be lost in the process of aggregation. Real option analysis (ROA) explicitly accounts for both the expected value of the underlying assets and the variance of the expected value (as well as skewness and kurtosis, which are important for describing fat-tail phenomena). In the paper, we propose an application of real option analysis in order to formulate rules for the selection of a climate policy (emission target) and to estimate the economic value of the future flexibility created by an interim climate policy, which may be corrected in the future in response to new knowledge that hopefully reduces uncertainties. The initially selected interim policy has an option value, and a methodology for its valuation is presented in the paper. (author)
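
    A minimal two-period sketch of the option value of flexibility described above: committing now locks in one action, whereas an interim policy lets the second-period action be chosen after uncertainty about damages resolves. All payoffs and probabilities are hypothetical.

```python
# Two-period sketch of the option value of a flexible interim climate policy.
# Payoffs (net benefits) are hypothetical; the probability describes how the
# uncertainty about climate damages resolves before the second-period decision.
p_high_damage = 0.4

# Commit now to a stringent target, regardless of what is learned later.
commit_stringent = p_high_damage * 100 + (1 - p_high_damage) * (-20)

# Interim policy now; after learning, pick the better second-period action.
value_if_high = max(90, 10)      # tighten is better if damages turn out high
value_if_low = max(-30, 25)      # relax is better if damages turn out low
flexible = p_high_damage * value_if_high + (1 - p_high_damage) * value_if_low

print("expected value, commit now:      ", commit_stringent)
print("expected value, flexible interim:", flexible)
print("option value of flexibility:     ", flexible - commit_stringent)
```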

  19. Efficient Multilevel and Multi-index Sampling Methods in Stochastic Differential Equations

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-01

    is to compute point or aggregate values, called “quantities of interest”. A rapidly growing research area that tries to tackle this problem is Uncertainty Quantification (UQ). As the name suggests, UQ aims to accurately quantify the uncertainty in quantities

  20. Accounting for uncertainty in marine reserve design.

    Science.gov (United States)

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.

  1. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from 2014 PIV challenge. Thorough sensitivity analysis was performed to assess the relative impact of the various parameters to the overall uncertainty. The results suggest that in absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric

  2. Exchange rate uncertainty and deviations from Purchasing Power Parity: evidence from the G7 area

    OpenAIRE

    Arghyrou, Michael; Gregoriou, Andros; Pourpourides, Panayiotis; Cardiff University

    2009-01-01

    Arghyrou, Gregoriou and Pourpourides (2009) argue that exchange rate uncertainty causes deviations from the law of one price. We test this hypothesis on aggregate data from the G7-area. We find that exchange rate uncertainty explains to a significant degree deviations from Purchasing Power Parity.

  3. Use of Paired Simple and Complex Models to Reduce Predictive Bias and Quantify Uncertainty

    DEFF Research Database (Denmark)

    Doherty, John; Christensen, Steen

    2011-01-01

    -constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology...... of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration...... that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights...

  4. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Science.gov (United States)

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  5. Web-based access, aggregation, and visualization of future climate projections with emphasis on agricultural assessments

    Science.gov (United States)

    Villoria, Nelson B.; Elliott, Joshua; Müller, Christoph; Shin, Jaewoo; Zhao, Lan; Song, Carol

    2018-01-01

    Access to climate and spatial datasets by non-specialists is restricted by technical barriers involving hardware, software and data formats. We discuss an open-source online tool that facilitates downloading the climate data from the global circulation models used by the Inter-Sectoral Impacts Model Intercomparison Project. The tool also offers temporal and spatial aggregation capabilities for incorporating future climate scenarios in applications where spatial aggregation is important. We hope that streamlined access to these data facilitates analysis of climate related issues while considering the uncertainties derived from future climate projections and temporal aggregation choices.

  6. Stochastic scheduling of aggregators of plug-in electric vehicles for participation in energy and ancillary service markets

    International Nuclear Information System (INIS)

    Alipour, Manijeh; Mohammadi-Ivatloo, Behnam; Moradi-Dalvand, Mohammad; Zare, Kazem

    2017-01-01

    Plug-in electric vehicles are expected to play a major role in the transportation system as environmental problems and the energy crisis become increasingly urgent. Implementing a large number of vehicles with proper control could provide substantial storage and flexibility for power systems. The plug-in electric vehicle aggregator is responsible for providing power and controlling the charging pattern of the plug-in electric vehicles under its contracted area. This paper deals with the optimal scheduling problem of plug-in electric vehicle aggregators in the electricity market, considering the uncertainties of market prices, availability of vehicles and the status of being called by the ISO in the reserve market. The impact of the market price and reserve market uncertainties on the electric vehicle scheduling problem is characterized through a stochastic programming framework. The objective of the aggregator is to maximize its profit by charging the plug-in electric vehicles in the low-price time intervals as well as participating in ancillary service markets. The operational constraints of plug-in electric vehicles and constraints of vehicle to grid are modeled in the proposed framework. An illustrative example is provided to confirm the performance of the proposed model. - Highlights: • Optimal scheduling of vehicle aggregators in electricity market has been addressed. • The operational constraints of plug-in vehicle to grid are considered. • The uncertainties of calling status in reserve market and market prices are modeled. • Vehicles' driving patterns and availability uncertainty are modeled. • The effect of risk measure weight in the vehicle to grid model has been studied.

  7. Characterization of Diesel Soot Aggregates by Scattering and Extinction Methods

    Science.gov (United States)

    Kamimoto, Takeyuki

    2006-07-01

    Characteristics of diesel soot particles sampled from the exhaust of a common-rail turbo-charged diesel engine are quantified by scattering and extinction diagnostics using two newly built laser-based instruments. The radius of gyration representing the aggregate size is measured by the angular distribution of scattering intensity, while the soot mass concentration is measured by a two-wavelength extinction method. An approach to estimate the refractive index of diesel soot by an analysis of the extinction and scattering data using an aggregate scattering theory is proposed.
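
    As a hedged sketch of how a two-wavelength extinction measurement can be turned into a soot mass concentration, the example below applies Beer-Lambert and a Rayleigh-limit extinction relation. The path length, transmissions, refractive-index function E(m), and soot density are illustrative assumptions; the paper's aggregate scattering analysis is more involved.

```python
import numpy as np

# Hypothetical two-wavelength extinction measurement through a soot-laden flow.
# Beer-Lambert: I/I0 = exp(-Kext * L), and in the Rayleigh limit
# Kext ~ 6*pi*E(m)*fv/lambda, so the soot volume fraction fv follows from
# the measured transmission at each wavelength.
L = 0.10                                            # optical path length, m
wavelengths = np.array([632.8e-9, 1064e-9])         # m
transmission = np.array([0.82, 0.90])               # measured I/I0 (illustrative)
E_m = 0.26                                          # assumed refractive-index function
rho_soot = 1800.0                                   # kg/m^3

k_ext = -np.log(transmission) / L                   # extinction coefficient, 1/m
fv = k_ext * wavelengths / (6.0 * np.pi * E_m)      # volume fraction per wavelength
mass_conc = fv * rho_soot * 1e6                     # mg/m^3

for lam, c in zip(wavelengths, mass_conc):
    print(f"lambda = {lam * 1e9:.0f} nm -> soot mass concentration ~ {c:.1f} mg/m^3")
```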

  8. Characterization of Diesel Soot Aggregates by Scattering and Extinction Methods

    International Nuclear Information System (INIS)

    Kamimoto, Takeyuki

    2006-01-01

    Characteristics of diesel soot particles sampled from the exhaust of a common-rail turbo-charged diesel engine are quantified by scattering and extinction diagnostics using two newly built laser-based instruments. The radius of gyration representing the aggregate size is measured by the angular distribution of scattering intensity, while the soot mass concentration is measured by a two-wavelength extinction method. An approach to estimate the refractive index of diesel soot by an analysis of the extinction and scattering data using an aggregate scattering theory is proposed

  9. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  10. Aggregate size and architecture determine biomass activity for one-stage partial nitritation and anammox

    DEFF Research Database (Denmark)

    Vlaeminck, S.; Terada, Akihiko; Smets, Barth F.

    2010-01-01

    to the inoculation and operation of the reactors. Fluorescent in-situ hybridization (FISH) was applied on aggregate sections to quantify AerAOB and AnAOB, as well as to visualize the aggregate architecture. The activity balance of the aggregates was calculated as the nitrite accumulation rate ratio (NARR), i...... and nitrite sources (NARR, > 1.7). Large A and C aggregates were granules capable of autonomous nitrogen removal (NARR, 0.6 to 1.1) with internal AnAOB zones surrounded by an AerAOB rim. Around 50% of the autotrophic space in these granules consisted of AerAOB- and AnAOB-specific EPS. Large B aggregates were...... thin film-like nitrite sinks (NARR,

  11. Wind power forecasting accuracy and uncertainty in Finland

    Energy Technology Data Exchange (ETDEWEB)

    Holttinen, H.; Miettinen, J.; Sillanpaeae, S.

    2013-04-15

    Wind power cannot be dispatched, so production levels need to be forecasted for electricity market trading. Lower prediction errors mean lower regulation balancing costs, since relatively less energy needs to go through balance settlement. From the power system operator's point of view, wind power forecast errors will impact the system net imbalances when the share of wind power increases, and more accurate forecasts mean that less regulating capacity will be activated from the real-time Regulating Power Market. In this publication short-term forecasting of wind power is studied mainly from a wind power producer's point of view. The forecast errors and imbalance costs from the day-ahead Nordic electricity markets are calculated based on real data from distributed wind power plants. Improvements to forecasting accuracy are presented using several wind forecast providers, and measures for the uncertainty of the forecast are presented. Aggregation of sites lowers the relative share of prediction errors considerably, by up to 60%. The balancing costs were also reduced by up to 60%, from 3 euro/MWh for one site to 1-1.4 euro/MWh for an aggregate of 24 sites. Pooling wind power production for balance settlement will be very beneficial, and larger producers who can have sites over a larger geographical area will benefit from lower imbalance costs. The aggregation benefits were already significant for smaller areas, resulting in a 30-40% decrease in forecast errors and a 13-36% decrease in unit balancing costs, depending on the year. The resulting costs are strongly dependent on Regulating Market prices, which determine the prices for the imbalances. A similar level of forecast errors resulted in 40% higher imbalance costs for 2012 compared with 2011. Combining wind forecasts from different Numerical Weather Prediction providers was studied with different combination methods for 6 sites. Averaging different providers' forecasts will lower the forecast errors by 6% for day-ahead purposes. When combining
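    The error reduction from pooling sites can be illustrated with a small simulation. The sketch below draws partially correlated forecast errors for a hypothetical portfolio and compares single-site and aggregated mean absolute errors; the number of sites, capacities, error level and inter-site correlation are assumptions, not data from the publication.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sites, n_hours = 24, 8760
capacity_per_site = 10.0  # MW, illustrative

# Illustrative per-site forecast errors: zero mean, 15% of capacity, partially correlated.
site_std = 0.15 * capacity_per_site
rho = 0.3  # assumed inter-site error correlation
cov = site_std**2 * (rho * np.ones((n_sites, n_sites)) + (1 - rho) * np.eye(n_sites))
errors = rng.multivariate_normal(np.zeros(n_sites), cov, size=n_hours)

single_site_mae = np.abs(errors[:, 0]).mean() / capacity_per_site
portfolio_mae = np.abs(errors.sum(axis=1)).mean() / (n_sites * capacity_per_site)

print(f"single-site MAE:    {100 * single_site_mae:.1f}% of capacity")
print(f"aggregated MAE:     {100 * portfolio_mae:.1f}% of capacity")
print(f"relative reduction: {100 * (1 - portfolio_mae / single_site_mae):.0f}%")
```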

  12. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR) experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology for a small break (SB) LOCA in a PWR of B and W design using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code.

  13. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas alongside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and the monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
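    The Monte Carlo step described above can be sketched under strong simplifying assumptions (a synthetic DEM, spatially uncorrelated elevation errors and a single flat water level, unlike the river-based FLOODMAP setup): each realization perturbs the DEM and flags cells below the water level, and the per-cell flooding probability is the fraction of realizations in which the cell floods.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical DEM tile (elevations in m a.s.l.) and a flat water level plain.
dem = rng.normal(loc=50.0, scale=2.0, size=(200, 200))
water_level = 50.5
dem_vertical_std = 0.5  # assumed DEM error (m), e.g. from a producer accuracy report

n_runs = 500
flood_count = np.zeros_like(dem, dtype=int)
for _ in range(n_runs):
    noisy_dem = dem + rng.normal(0.0, dem_vertical_std, size=dem.shape)
    flood_count += (noisy_dem < water_level)

flood_probability = flood_count / n_runs   # per-cell probability of being flooded
print("mean flooded fraction:", flood_probability.mean().round(3))
```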

  14. Characterization uncertainty and its effects on models and performance

    International Nuclear Information System (INIS)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization

  15. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and of the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.
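    The statistical-sampling idea can be illustrated with a deliberately simplified stand-in for the lattice/core calculation: a two-group infinite-medium multiplication factor with independently perturbed group constants. The nominal values, relative standard deviations and the absence of covariances are all assumptions made here for illustration; the UNICORN workflow samples actual multigroup cross sections together with their covariance data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Nominal two-group macroscopic constants (illustrative values, not BEAVRS data).
nominal = {"nu_sf1": 0.008, "nu_sf2": 0.160, "sa1": 0.010, "sa2": 0.100, "s12": 0.018}
rel_std = {"nu_sf1": 0.01, "nu_sf2": 0.01, "sa1": 0.02, "sa2": 0.015, "s12": 0.03}

def k_inf(p):
    # Standard two-group infinite-medium multiplication factor (no up-scattering).
    return (p["nu_sf1"] + p["nu_sf2"] * p["s12"] / p["sa2"]) / (p["sa1"] + p["s12"])

samples = []
for _ in range(1000):
    # Independent relative perturbations; a covariance matrix would replace this
    # when correlated nuclear-data uncertainties are available.
    perturbed = {k: v * (1.0 + rng.normal(0.0, rel_std[k])) for k, v in nominal.items()}
    samples.append(k_inf(perturbed))

samples = np.array(samples)
print(f"k_inf = {samples.mean():.5f} +/- {samples.std(ddof=1):.5f}")
```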

  16. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and of the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  17. Resonance self-shielding effect in uncertainty quantification of fission reactor neutronics parameters

    International Nuclear Information System (INIS)

    Chiba, Go; Tsuji, Masashi; Narabayashi, Tadashi

    2014-01-01

    In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of neutron flux and neutron-nuclide reaction cross section representation, both the consistent methodologies give fair results with no such dependences.

  18. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    International Nuclear Information System (INIS)

    Gillenwater, M.; Sussman, F.; Cohen, J.

    2007-01-01

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  19. Practical Policy Applications of Uncertainty Analysis for National Greenhouse Gas Inventories

    Energy Technology Data Exchange (ETDEWEB)

    Gillenwater, M. [Environmental Resources Trust (United States)], E-mail: mgillenwater@ert.net; Sussman, F.; Cohen, J. [ICF International (United States)

    2007-09-15

    International policy makers and climate researchers use greenhouse gas emissions inventory estimates in a variety of ways. Because of the varied uses of the inventory data, as well as the high uncertainty surrounding some of the source category estimates, considerable effort has been devoted to understanding the causes and magnitude of uncertainty in national emissions inventories. In this paper, we focus on two aspects of the rationale for quantifying uncertainty: (1) the possible uses of the quantified uncertainty estimates for policy (e.g., as a means of adjusting inventories used to determine compliance with international commitments); and (2) the direct benefits of the process of investigating uncertainties in terms of improving inventory quality. We find that there are particular characteristics that an inventory uncertainty estimate should have if it is to be used for policy purposes: (1) it should be comparable across countries; (2) it should be relatively objective, or at least subject to review and verification; (3) it should not be subject to gaming by countries acting in their own self-interest; (4) it should be administratively feasible to estimate and use; (5) the quality of the uncertainty estimate should be high enough to warrant the additional compliance costs that its use in an adjustment factor may impose on countries; and (6) it should attempt to address all types of inventory uncertainty. Currently, inventory uncertainty estimates for national greenhouse gas inventories do not have these characteristics. For example, the information used to develop quantitative uncertainty estimates for national inventories is often based on expert judgments, which are, by definition, subjective rather than objective, and therefore difficult to review and compare. Further, the practical design of a potential factor to adjust inventory estimates using uncertainty estimates would require policy makers to (1) identify clear environmental goals; (2) define these

  20. Avoiding climate change uncertainties in Strategic Environmental Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Sanne Vammen, E-mail: sannevl@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark); Kørnøv, Lone, E-mail: lonek@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University, Skibbrogade 5, 1. Sal, 9000 Aalborg (Denmark); Driscoll, Patrick, E-mail: patrick@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark)

    2013-11-15

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.

  1. Avoiding climate change uncertainties in Strategic Environmental Assessment

    International Nuclear Information System (INIS)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick

    2013-01-01

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty

  2. The effect of short-range spatial variability on soil sampling uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Perk, Marcel van der [Department of Physical Geography, Utrecht University, P.O. Box 80115, 3508 TC Utrecht (Netherlands)], E-mail: m.vanderperk@geo.uu.nl; De Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria [Agenzia per la Protezione dell' Ambiente e per i Servizi Tecnici (APAT), Servizio Laboratori, Misure ed Attivita di Campo, Via di Castel Romano, 100-00128 Roma (Italy); Fajgelj, Ales; Sansone, Umberto [International Atomic Energy Agency (IAEA), Agency' s Laboratories Seibersdorf, A-1400 Vienna (Austria); Jeran, Zvonka; Jacimovic, Radojko [Jozef Stefan Institute, Jamova 39, 1000 Ljubljana (Slovenia)

    2008-11-15

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  3. The effect of short-range spatial variability on soil sampling uncertainty.

    Science.gov (United States)

    Van der Perk, Marcel; de Zorzi, Paolo; Barbizzi, Sabrina; Belli, Maria; Fajgelj, Ales; Sansone, Umberto; Jeran, Zvonka; Jaćimović, Radojko

    2008-11-01

    This paper aims to quantify the soil sampling uncertainty arising from the short-range spatial variability of elemental concentrations in the topsoils of agricultural, semi-natural, and contaminated environments. For the agricultural site, the relative standard sampling uncertainty ranges between 1% and 5.5%. For the semi-natural area, the sampling uncertainties are 2-4 times larger than in the agricultural area. The contaminated site exhibited significant short-range spatial variability in elemental composition, which resulted in sampling uncertainties of 20-30%.

  4. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation...... data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified...... uncertainties can be implemented in probabilistic reliability assessments....
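    The validation metrics named above can be computed in a few lines. The sketch below evaluates bias, RMSE and a scatter index (here defined as the centred RMSE normalised by the observed mean; other definitions are in use) for hypothetical model/observation pairs of significant wave height; the numbers are illustrative, not data from the cited study.

```python
import numpy as np

def wave_model_uncertainty(hs_model, hs_obs):
    """Bias, RMSE and scatter index for significant wave height (or wave period) pairs."""
    hs_model, hs_obs = np.asarray(hs_model, float), np.asarray(hs_obs, float)
    bias = np.mean(hs_model - hs_obs)
    rmse = np.sqrt(np.mean((hs_model - hs_obs) ** 2))
    scatter_index = np.sqrt(np.mean((hs_model - hs_obs - bias) ** 2)) / np.mean(hs_obs)
    return bias, rmse, scatter_index

# Illustrative validation pairs (m).
model = [1.2, 2.5, 3.1, 0.9, 1.8]
obs   = [1.0, 2.7, 2.9, 1.1, 1.7]
print(wave_model_uncertainty(model, obs))
```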

  5. Generation and exploration of aggregation abstractions for scheduling and resource allocation

    Science.gov (United States)

    Lowry, Michael R.; Linden, Theodore A.

    1993-01-01

    This paper presents research on the abstraction of computational theories for scheduling and resource allocation. The paper describes both theory and methods for the automated generation of aggregation abstractions and approximations in which detailed resource allocation constraints are replaced by constraints between aggregate demand and capacity. The interaction of aggregation abstraction generation with the more thoroughly investigated abstractions of weakening operator preconditions is briefly discussed. The purpose of generating abstract theories for aggregated demand and resources includes: answering queries about aggregate properties, such as gross feasibility; reducing computational costs by using the solution of aggregate problems to guide the solution of detailed problems; facilitating reformulating theories to approximate problems for which there are efficient problem-solving methods; and reducing computational costs of scheduling by providing more opportunities for variable and value-ordering heuristics to be effective. Experiments are being developed to characterize the properties of aggregations that make them cost effective. Both abstract and concrete theories are represented in a variant of first-order predicate calculus, which is a parameterized multi-sorted logic that facilitates specification of large problems. A particular problem is conceptually represented as a set of ground sentences that is consistent with a quantified theory.

  6. Short-lived, transitory cell-cell interactions foster migration-dependent aggregation.

    Directory of Open Access Journals (Sweden)

    Melissa D Pope

    Full Text Available During embryonic development, motile cells aggregate into cohesive groups, which give rise to tissues and organs. The role of cell migration in regulating aggregation is unclear. The current paradigm for aggregation is based on an equilibrium model of differential cell adhesivity to neighboring cells versus the underlying substratum. In many biological contexts, however, dynamics is critical. Here, we provide evidence that multicellular aggregation dynamics involves both local adhesive interactions and transport by cell migration. Using time-lapse video microscopy, we quantified the duration of cell-cell contacts among migrating cells that collided and adhered to another cell. This lifetime of cell-cell interactions exhibited a monotonic decreasing dependence on substratum adhesivity. Parallel quantitative measurements of cell migration speed revealed that across the tested range of adhesive substrata, the mean time needed for cells to migrate and encounter another cell was greater than the mean adhesion lifetime, suggesting that aggregation dynamics may depend on cell motility instead of the local differential adhesivity of cells. Consistent with this hypothesis, aggregate size exhibited a biphasic dependence on substratum adhesivity, matching the trend we observed for cell migration speed. Our findings suggest a new role for cell motility, alongside differential adhesion, in regulating developmental aggregation events and motivate new design principles for tuning aggregation dynamics in tissue engineering applications.

  7. Leaf area index uncertainty estimates for model-data fusion applications

    Science.gov (United States)

    Andrew D. Richardson; D. Bryan Dail; D.Y. Hollinger

    2011-01-01

    Estimates of data uncertainties are required to integrate different observational data streams as model constraints using model-data fusion. We describe an approach with which random and systematic uncertainties in optical measurements of leaf area index [LAI] can be quantified. We use data from a measurement campaign at the spruce-dominated Howland Forest AmeriFlux...

  8. Uncertainties in projecting climate-change impacts in marine ecosystems

    DEFF Research Database (Denmark)

    Payne, Mark; Barange, Manuel; Cheung, William W. L.

    2016-01-01

    with a projection and building confidence in its robustness. We review how uncertainties in such projections are handled in marine science. We employ an approach developed in climate modelling by breaking uncertainty down into (i) structural (model) uncertainty, (ii) initialization and internal variability......Projections of the impacts of climate change on marine ecosystems are a key prerequisite for the planning of adaptation strategies, yet they are inevitably associated with uncertainty. Identifying, quantifying, and communicating this uncertainty is key to both evaluating the risk associated...... and highlight the opportunities and challenges associated with doing a better job. We find that even within a relatively small field such as marine science, there are substantial differences between subdisciplines in the degree of attention given to each type of uncertainty. We find that initialization...

  9. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. Quantifying uncertainty in the impacts of climate change on river discharge in sub-catchments of the Yangtze and Yellow River Basins, China

    Directory of Open Access Journals (Sweden)

    H. Xu

    2011-01-01

    Full Text Available Quantitative evaluations of the impacts of climate change on water resources are primarily constrained by uncertainty in climate projections from GCMs. In this study we assess uncertainty in the impacts of climate change on river discharge in two catchments of the Yangtze and Yellow River Basins that feature contrasting climate regimes (humid and semi-arid). Specifically we quantify uncertainty associated with GCM structure from a subset of CMIP3 AR4 GCMs (HadCM3, HadGEM1, CCSM3.0, IPSL, ECHAM5, CSIRO, CGCM3.1), SRES emissions scenarios (A1B, A2, B1, B2) and prescribed increases in global mean air temperature (1 °C to 6 °C). Climate projections, applied to semi-distributed hydrological models (SWAT 2005) in both catchments, indicate trends toward warmer and wetter conditions. For prescribed warming scenarios of 1 °C to 6 °C, linear increases in mean annual river discharge, relative to baseline (1961–1990), for the River Xiangxi and River Huangfuchuan are +9% and 11% per +1 °C respectively. Intra-annual changes include increases in flood (Q05) discharges for both rivers as well as a shift in the timing of flood discharges from summer to autumn and a rise (24 to 93%) in dry season (Q95) discharge for the River Xiangxi. Differences in projections of mean annual river discharge between SRES emission scenarios using HadCM3 are comparatively minor for the River Xiangxi (13 to 17% rise from baseline) but substantial (73 to 121%) for the River Huangfuchuan. With one minor exception of a slight (−2%) decrease in river discharge projected using HadGEM1 for the River Xiangxi, mean annual river discharge is projected to increase in both catchments under both the SRES A1B emission scenario and a 2° rise in global mean air temperature using all AR4 GCMs in the CMIP3 subset. For the River Xiangxi, there is substantial uncertainty associated with GCM structure in the magnitude of the rise in flood (Q05) discharges (−1 to 41% under SRES A1B and −3 to 41% under 2

  11. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)

  12. Ignoring correlation in uncertainty and sensitivity analysis in life cycle assessment: what is the risk?

    Energy Technology Data Exchange (ETDEWEB)

    Groen, E.A., E-mail: Evelyne.Groen@gmail.com [Wageningen University, P.O. Box 338, Wageningen 6700 AH (Netherlands); Heijungs, R. [Vrije Universiteit Amsterdam, De Boelelaan 1105, Amsterdam 1081 HV (Netherlands); Leiden University, Einsteinweg 2, Leiden 2333 CC (Netherlands)

    2017-01-15

    Life cycle assessment (LCA) is an established tool to quantify the environmental impact of a product. A good assessment of uncertainty is important for making well-informed decisions in comparative LCA, as well as for correctly prioritising data collection efforts. Under- or overestimation of output uncertainty (e.g. output variance) will lead to incorrect decisions in such matters. The presence of correlations between input parameters during uncertainty propagation can increase or decrease the output variance. However, most LCA studies that include uncertainty analysis ignore correlations between input parameters during uncertainty propagation, which may lead to incorrect conclusions. Two approaches to include correlations between input parameters during uncertainty propagation and global sensitivity analysis were studied: an analytical approach and a sampling approach. The use of both approaches is illustrated for an artificial case study of electricity production. Results demonstrate that both approaches yield approximately the same output variance and sensitivity indices for this specific case study. Furthermore, we demonstrate that the analytical approach can be used to quantify the risk of ignoring correlations between input parameters during uncertainty propagation in LCA. We demonstrate that: (1) we can predict whether including correlations among input parameters in uncertainty propagation will increase or decrease output variance; (2) we can quantify the risk of ignoring correlations on the output variance and the global sensitivity indices. Moreover, this procedure requires only little data. - Highlights: • Ignoring correlation leads to under- or overestimation of the output variance. • We demonstrated that the risk of ignoring correlation can be quantified. • The procedure proposed is generally applicable in life cycle assessment. • In some cases, ignoring correlation has a minimal effect on decision-making tools.
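    The effect the authors quantify can be reproduced on a toy linear model: for impact = a·x1 + b·x2, the output variance ignoring correlation is (a·σ1)² + (b·σ2)², while the correlated case adds 2·a·b·ρ·σ1·σ2, which a correlated sampling run confirms. All coefficients and distributions below are illustrative assumptions, not the paper's electricity case study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy LCA-style model: impact = a * x1 + b * x2, with uncertain inputs x1, x2.
a, b = 2.0, 3.0
mean = np.array([1.0, 0.5])
std = np.array([0.2, 0.1])
rho = 0.8  # assumed correlation between the two input parameters

# Analytical variances (exact for this linear model):
var_ignoring = (a * std[0])**2 + (b * std[1])**2
var_with_corr = var_ignoring + 2 * a * b * rho * std[0] * std[1]

# Sampling approach with the correlation included:
cov = np.array([[std[0]**2, rho * std[0] * std[1]],
                [rho * std[0] * std[1], std[1]**2]])
x = rng.multivariate_normal(mean, cov, size=100_000)
impact = a * x[:, 0] + b * x[:, 1]

print("variance ignoring correlation :", round(var_ignoring, 4))
print("variance with correlation     :", round(var_with_corr, 4))
print("sampled variance              :", round(impact.var(ddof=1), 4))
```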

  13. Planning Under Uncertainty for Aggregated Electric Vehicle Charging with Renewable Energy Supply

    NARCIS (Netherlands)

    Walraven, E.M.P.; Spaan, M.T.J.; Kaminka, Gal A.; Fox, Maria; Bouquet, Paolo; Hüllermeier, Eyke; Dignum, Virginia; Dignum, Frank; van Harmelen, Frank

    2016-01-01

    Renewable energy sources introduce uncertainty regarding generated power in smart grids. For instance, power that is generated by wind turbines is time-varying and dependent on the weather. Electric vehicles will become increasingly important in the development of smart grids with a high penetration

  14. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    Science.gov (United States)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

    Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist to date when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed or will continue to change in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy data to present-day high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed from anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor

  15. RESONANCE SELF-SHIELDING EFFECT IN UNCERTAINTY QUANTIFICATION OF FISSION REACTOR NEUTRONICS PARAMETERS

    Directory of Open Access Journals (Sweden)

    GO CHIBA

    2014-06-01

    Full Text Available In order to properly quantify fission reactor neutronics parameter uncertainties, we have to use covariance data and sensitivity profiles consistently. In the present paper, we establish two consistent methodologies for uncertainty quantification: a self-shielded cross section-based consistent methodology and an infinitely-diluted cross section-based consistent methodology. With these methodologies and the covariance data of uranium-238 nuclear data given in JENDL-3.3, we quantify uncertainties of infinite neutron multiplication factors of light water reactor and fast reactor fuel cells. While an inconsistent methodology gives results which depend on the energy group structure of neutron flux and neutron-nuclide reaction cross section representation, both the consistent methodologies give fair results with no such dependences.

  16. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can produce relatively accurate results. However, it requires a long time because of the repetitive computation required by the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express with probability distributions. In order to reduce the computation time and quantify the uncertainties of top events when the fault tree contains basic events whose uncertainties are difficult to express with probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time, while covering the results obtained by the probabilistic uncertainty propagation.
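    A minimal sketch of fuzzy (alpha-cut) uncertainty propagation through a small coherent fault tree is given below; the tree structure TOP = (A AND B) OR C and the triangular fuzzy failure probabilities are hypothetical, not the LLOCA core-damage tree analysed in the paper.

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def top_event_interval(basic_events, alpha):
    """Toy fault tree: TOP = (A AND B) OR C, evaluated on alpha-cut intervals.
    For a monotone (coherent) tree this endpoint-based interval arithmetic is exact."""
    (a_lo, a_hi), (b_lo, b_hi), (c_lo, c_hi) = (alpha_cut(e, alpha) for e in basic_events)
    and_lo, and_hi = a_lo * b_lo, a_hi * b_hi                      # AND gate
    or_lo = 1 - (1 - and_lo) * (1 - c_lo)                          # OR gate
    or_hi = 1 - (1 - and_hi) * (1 - c_hi)
    return or_lo, or_hi

# Hypothetical triangular fuzzy failure probabilities for three basic events.
events = [(1e-3, 2e-3, 4e-3), (5e-3, 1e-2, 2e-2), (1e-4, 2e-4, 5e-4)]
for alpha in (0.0, 0.5, 1.0):
    lo, hi = top_event_interval(events, alpha)
    print(f"alpha={alpha:.1f}: top event probability in [{lo:.2e}, {hi:.2e}]")
```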

  17. Quantifying climate risk - the starting point

    International Nuclear Information System (INIS)

    Fairweather, Helen; Luo, Qunying; Liu, De Li; Wiles, Perry

    2007-01-01

    Full text: All natural systems have evolved to their current state as a result inter alia of the climate in which they developed. Similarly, man-made systems (such as agricultural production) have developed to suit the climate experienced over the last 100 or so years. The capacity of different systems to adapt to changes in climate that are outside those that have been experienced previously is largely unknown. This results in considerable uncertainty when predicting climate change impacts. However, it is possible to quantify the relative probabilities of a range of potential impacts of climate change. Quantifying current climate risks is an effective starting point for analysing the probable impacts of future climate change and guiding the selection of appropriate adaptation strategies. For a farming system to be viable within the current climate, its profitability must be sustained and, therefore, possible adaptation strategies need to be tested for continued viability in a changed climate. The methodology outlined in this paper examines historical patterns of key climate variables (rainfall and temperature) across the season and their influence on the productivity of wheat growing in NSW. This analysis is used to identify the time of year that the system is most vulnerable to climate variation, within the constraints of the current climate. Wheat yield is used as a measure of productivity, which is also assumed to be a surrogate for profitability. A time series of wheat yields is sorted into ascending order and categorised into five percentile groupings (i.e. 20th, 40th, 60th and 80th percentiles) for each shire across NSW (~100 years). Five time series of climate data (which are aggregated daily data from the years in each percentile) are analysed to determine the period that provides the greatest climate risk to the production system. Once this period has been determined, this risk is quantified in terms of the degree of separation of the time series

  18. Methodology for the assessment of measuring uncertainties of articulated arm coordinate measuring machines

    International Nuclear Information System (INIS)

    Romdhani, Fekria; Hennebelle, François; Ge, Min; Juillion, Patrick; Fontaine, Jean François; Coquet, Richard

    2014-01-01

    Articulated Arm Coordinate Measuring Machines (AACMMs) have gradually evolved and are increasingly used in the mechanical industry. At present, measurement uncertainties relating to the use of these devices are not yet well quantified. The work carried out consists of determining the measurement uncertainties of a mechanical part measured by an AACMM. The studies aiming to develop a model of measurement uncertainty are based on the Monte Carlo method described in Supplement 1 of the Guide to the Expression of Uncertainty in Measurement [1], as well as on identifying and characterizing the main sources of uncertainty. A multi-level Monte Carlo approach has been developed which characterizes the possible evolution of the AACMM during the measurement and, at a second level, quantifies the uncertainty of the considered measurand. The first Monte Carlo level is the most complex and is thus divided into three sub-levels, namely characterization of the positioning error of a point, estimation of calibration errors and evaluation of fluctuations of the ‘localization point’. The global method is thus presented and the results of the first sub-level are developed in particular. The main sources of uncertainty, including AACMM deformations, are exposed. (paper)
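    A single-level sketch of GUM Supplement 1-style Monte Carlo propagation is given below for a simple measurand (a point-to-point distance) with an assumed per-coordinate standard uncertainty; the paper's multi-level approach additionally simulates the arm's evolution during measurement, calibration errors and the 'localization point' fluctuations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two nominal points measured by the arm (mm) and an assumed per-coordinate
# standard uncertainty; the values are illustrative, not from the cited study.
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([120.0, 35.0, 10.0])
u_xyz = 0.02  # mm, combined effect of encoder and calibration errors (assumed)

n = 200_000
d = np.linalg.norm(
    (p2 + rng.normal(0, u_xyz, (n, 3))) - (p1 + rng.normal(0, u_xyz, (n, 3))),
    axis=1,
)
print(f"distance = {d.mean():.4f} mm, standard uncertainty = {d.std(ddof=1):.4f} mm")
print("95% coverage interval:", np.percentile(d, [2.5, 97.5]).round(4))
```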

  19. Mobility Analysis for Inter-Site Carrier Aggregation in LTE Heterogeneous Networks

    DEFF Research Database (Denmark)

    Barbera, Simone; Pedersen, Klaus I.; Michaelsen, Per Henrik

    2013-01-01

    In this paper we analyze the mobility performance for an LTE Heterogeneous Network with macro and pico cells deployed on different carriers. Cases with/without downlink inter-site carrier aggregation are investigated. Extensive system level simulations are exploited to quantify the performance...

  20. Habitable zone dependence on stellar parameter uncertainties

    International Nuclear Information System (INIS)

    Kane, Stephen R.

    2014-01-01

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.
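    The propagation from stellar parameter uncertainty to HZ boundary uncertainty can be sketched for the simplest case where only the luminosity is uncertain and the boundary distance follows d = sqrt(L / S_eff) (in AU, with L in solar units); the luminosity, its uncertainty and the effective flux values below are illustrative assumptions, and a fuller treatment would also propagate the effective-temperature dependence of S_eff.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative host-star luminosity and its uncertainty (solar units).
L, sigma_L = 0.65, 0.08
S_eff_inner, S_eff_outer = 1.10, 0.36  # assumed effective fluxes at the HZ edges

samples_L = rng.normal(L, sigma_L, 100_000)
samples_L = samples_L[samples_L > 0]           # guard against unphysical draws

inner = np.sqrt(samples_L / S_eff_inner)       # HZ edge distance in AU: d = sqrt(L / S_eff)
outer = np.sqrt(samples_L / S_eff_outer)

for name, d in (("inner", inner), ("outer", outer)):
    print(f"{name} HZ edge: {d.mean():.3f} +/- {d.std(ddof=1):.3f} AU")
```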

  1. Habitable zone dependence on stellar parameter uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Stephen R., E-mail: skane@sfsu.edu [Department of Physics and Astronomy, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132 (United States)

    2014-02-20

    An important property of exoplanetary systems is the extent of the Habitable Zone (HZ), defined as that region where water can exist in a liquid state on the surface of a planet with sufficient atmospheric pressure. Both ground- and space-based observations have revealed a plethora of confirmed exoplanets and exoplanetary candidates, most notably from the Kepler mission using the transit detection technique. Many of these detected planets lie within the predicted HZ of their host star. However, as is the case with the derived properties of the planets themselves, the HZ boundaries depend on how well we understand the host star. Here we quantify the uncertainties of HZ boundaries on the parameter uncertainties of the host star. We examine the distribution of stellar parameter uncertainties from confirmed exoplanet hosts and Kepler candidate hosts and translate these into HZ boundary uncertainties. We apply this to several known systems with an HZ planet to determine the uncertainty in their HZ status.

  2. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle; Iskandarani, Mohamad; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar

    2015-01-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
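    The interpolation-based workflow can be sketched for the polynomial case (the Gaussian-process alternative discussed in the review would need a GP library): a handful of "expensive" simulations are interpolated by a polynomial surrogate, which is then sampled cheaply to estimate the variability of the response. The stand-in model and input distribution are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(11)

def expensive_model(x):
    """Stand-in for a costly ocean/atmosphere simulation with one uncertain input."""
    return np.sin(1.5 * x) + 0.3 * x

# A handful of affordable simulations at chosen input values...
x_design = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y_design = expensive_model(x_design)

# ...interpolated by a polynomial surrogate (degree chosen to match the design size).
coeffs = np.polyfit(x_design, y_design, deg=4)
surrogate = np.poly1d(coeffs)

# Cheap uncertainty propagation: sample the uncertain input and evaluate the surrogate.
x_samples = rng.normal(0.0, 0.8, 50_000)
y_samples = surrogate(x_samples)
print(f"response mean {y_samples.mean():.3f}, std {y_samples.std(ddof=1):.3f}")
print(f"reference std (direct model): {expensive_model(x_samples).std(ddof=1):.3f}")
```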

  3. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W.Carlisle

    2015-09-11

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.

  4. Aggregations of brittle stars can perform similar ecological roles as mussel reefs

    KAUST Repository

    Geraldi, NR; Bertolini, C; Emmerson, MC; Roberts, D; Sigwart, JD; O’Connor, NE

    2016-01-01

    considered. We quantified the abundance of sessile horse mussels Modiolus modiolus and aggregating brittle stars Ophiothrix fragilis and tested for correlations between the density of mussels (live and dead) and brittle stars each with (1) abundance, biomass

  5. Managing project risks and uncertainties

    Directory of Open Access Journals (Sweden)

    Mike Mentis

    2015-01-01

    Full Text Available This article considers threats to a project slipping on budget, schedule and fit-for-purpose. Threat is used here as the collective for risks (quantifiable bad things that can happen) and uncertainties (poorly or not quantifiable bad possible events). Based on experience with projects in developing countries this review considers that (a) project slippage is due to uncertainties rather than risks, (b) while eventuation of some bad things is beyond control, managed execution and oversight are still the primary means to keeping within budget, on time and fit-for-purpose, (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics, and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties causing project slippage, but that they are insufficiently taken into account in project planning and execution that causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided with independent oversight by deeply experienced panelists who contribute technical insights and can potentially show that diligence is seen to be done.

  6. Uncertainty propagation through dynamic models of assemblies of mechanical structures

    International Nuclear Information System (INIS)

    Daouk, Sami

    2016-01-01

    When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Feedback from experience shows, however, that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies by setting up a dynamic connector model that takes account of different types and sources of uncertainty on stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R and D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)

  7. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
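    The Monte Carlo idea can be sketched for a single flow-data uncertainty source, rating-curve parameter uncertainty, and a single signature (the low-flow quantile Q95); the synthetic stage record, the power-law rating curve Q = a·h^b and the parameter uncertainties below are illustrative assumptions rather than Mahurangi or Brue data.

```python
import numpy as np

rng = np.random.default_rng(21)

# Synthetic daily stage record (m) and a nominal rating curve Q = a * h^b.
stage = rng.gamma(shape=2.0, scale=0.4, size=365)
a_nominal, b_nominal = 5.0, 1.8

def q95(flow):
    """Low-flow signature: discharge exceeded 95% of the time."""
    return np.percentile(flow, 5)

# Monte Carlo over rating-curve parameter uncertainty (assumed 5% and 3% standard deviations).
signatures = []
for _ in range(2000):
    a = a_nominal * (1 + rng.normal(0, 0.05))
    b = b_nominal * (1 + rng.normal(0, 0.03))
    signatures.append(q95(a * stage**b))

signatures = np.array(signatures)
print(f"Q95 = {signatures.mean():.3f} m3/s, std = {signatures.std(ddof=1):.3f}")
print("90% interval:", np.percentile(signatures, [5, 95]).round(3))
```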

  8. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  9. Stabilized fiber-reinforced pavement base course with recycled aggregate

    Science.gov (United States)

    Sobhan, Khaled

    This study evaluates the benefits to be gained by using a composite highway base course material consisting of recycled crushed concrete aggregate, portland cement, fly ash, and a modest amount of reinforcing fibers. The primary objectives of this research were to (a) quantify the improvement that is obtained by adding fibers to a lean concrete composite (made from recycled aggregate and low quantities of portland cement and/or fly ash), (b) evaluate the mechanical behavior of such a composite base course material under both static and repeated loads, and (c) utilize the laboratory-determined properties with a mechanistic design method to assess the potential advantages. The split tensile strength of a stabilized recycled aggregate base course material was found to be exponentially related to the compacted dry density of the mix. A lean mix containing 4% cement and 4% fly ash (by weight) develops sufficient unconfined compressive, split tensile, and flexural strengths to be used as a high quality stabilized base course. The addition of 4% (by weight) of hooked-end steel fibers significantly enhances the post-peak load-deformation response of the composite in both indirect tension and static flexure. The flexural fatigue behavior of the 4% cement-4% fly ash mix is comparable to all commonly used stabilized materials, including regular concrete; the inclusion of 4% hooked-end fibers in this mix significantly improves its resistance to fatigue failure. The resilient moduli of stabilized recycled aggregate in flexure are comparable to the values obtained for traditional soil-cement mixes. In general, the fibers are effective in retarding the rate of fatigue damage accumulation, which is quantified in terms of a damage index defined by an energy-based approach. The thickness design curves for a stabilized recycled aggregate base course, as developed by using an elastic layer approach, are shown to be in close agreement with a theoretical model (based on Westergaard

  10. Comparing Fuzzy Sets and Random Sets to Model the Uncertainty of Fuzzy Shorelines

    NARCIS (Netherlands)

    Dewi, Ratna Sari; Bijker, Wietske; Stein, Alfred

    2017-01-01

    This paper addresses uncertainty modelling of shorelines by comparing fuzzy sets and random sets. Both methods quantify extensional uncertainty of shorelines extracted from remote sensing images. Two datasets were tested: pan-sharpened Pleiades with four bands (Pleiades) and pan-sharpened Pleiades

  11. Characterization of dispersed and aggregated Al2O3 morphologies for predicting nanofluid thermal conductivities

    International Nuclear Information System (INIS)

    Feng Xuemei; Johnson, Drew W.

    2013-01-01

    Nanofluids are reported to have enhanced thermal conductivities resulting from nanoparticle aggregation. The goal of this study was to explore through experimental measurements, dispersed and aggregated morphology effects on enhanced thermal conductivities for Al2O3 nanoparticles with a primary size of 54.2 ± 2.0 nm. Aggregation effects were investigated by measuring thermal conductivity of different particle morphologies that occurred under different aggregation conditions. Fractal dimensions and aspect ratios were used to quantify the aggregation morphologies. Fractal dimensions were measured using static light scattering and imaging techniques. Aspect ratios were measured using dynamic light scattering, scanning electron microscopy, and atomic force microscopy. Results showed that the enhancements in thermal conductivity can be predicted with effective medium theory when aspect ratio was considered.

  12. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations. It is therefore very important to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from the reference documents of the computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether experimental responses such as the cladding temperature or pressure drop lie inside the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experimental tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  13. Quantification of aggregate grain shape characteristics using 3-D laser scanning technology

    CSIR Research Space (South Africa)

    Mgangira, Martin B

    2013-07-01

    Full Text Available to identify the differences between individual aggregates. It was possible to quantify differences in particle shape characteristics at the small particle scale. The study has demonstrated the advantages of the innovative 3-D laser scanning technology...

  14. Quantifying the Contribution of Post-Processing in Computed Tomography Measurement Uncertainty

    DEFF Research Database (Denmark)

    Stolfi, Alessandro; Thompson, Mary Kathryn; Carli, Lorenzo

    2016-01-01

    by calculating the standard deviation of 10 repeated measurement evaluations on the same data set. The evaluations were performed on an industrial assembly. Each evaluation includes several dimensional and geometrical measurands that were expected to have different responses to the various post-processing settings. It was found that the definition of the datum system had the largest impact on the uncertainty with a standard deviation of a few microns. The surface determination and data fitting had smaller contributions with sub-micron repeatability.

  15. Cellular Models of Aggregation-dependent Template-directed Proteolysis to Characterize Tau Aggregation Inhibitors for Treatment of Alzheimer Disease.

    Science.gov (United States)

    Harrington, Charles R; Storey, John M D; Clunas, Scott; Harrington, Kathleen A; Horsley, David; Ishaq, Ahtsham; Kemp, Steven J; Larch, Christopher P; Marshall, Colin; Nicoll, Sarah L; Rickard, Janet E; Simpson, Michael; Sinclair, James P; Storey, Lynda J; Wischik, Claude M

    2015-04-24

    Alzheimer disease (AD) is a degenerative tauopathy characterized by aggregation of Tau protein through the repeat domain to form intraneuronal paired helical filaments (PHFs). We report two cell models in which we control the inherent toxicity of the core Tau fragment. These models demonstrate the properties of prion-like recruitment of full-length Tau into an aggregation pathway in which template-directed, endogenous truncation propagates aggregation through the core Tau binding domain. We use these in combination with dissolution of native PHFs to quantify the activity of Tau aggregation inhibitors (TAIs). We report the synthesis of novel stable crystalline leucomethylthioninium salts (LMTX®), which overcome the pharmacokinetic limitations of methylthioninium chloride. LMTX®, as either a dihydromesylate or a dihydrobromide salt, retains TAI activity in vitro and disrupts PHFs isolated from AD brain tissues at 0.16 μM. The Ki value for intracellular TAI activity, which we have been able to determine for the first time, is 0.12 μM. These values are close to the steady state trough brain concentration of methylthioninium ion (0.18 μM) that is required to arrest progression of AD on clinical and imaging end points and the minimum brain concentration (0.13 μM) required to reverse behavioral deficits and pathology in Tau transgenic mice. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Practical Markov Logic Containing First-Order Quantifiers With Application to Identity Uncertainty

    National Research Council Canada - National Science Library

    Culotta, Aron; McCallum, Andrew

    2005-01-01

    In this paper, we present approximate inference and training methods that incrementally instantiate portions of the network as needed to enable first-order existential and universal quantifiers in Markov logic networks...

  17. Quantifying the uncertainties of China's emission inventory for industrial sources: From national to provincial and city scales

    Science.gov (United States)

    Zhao, Yu; Zhou, Yaduan; Qiu, Liping; Zhang, Jie

    2017-09-01

    A comprehensive uncertainty analysis was conducted on emission inventories for industrial sources at national (China), provincial (Jiangsu), and city (Nanjing) scales for 2012. Based on various methods and data sources, Monte-Carlo simulation was applied at sector level for the national inventory, and at plant level (whenever possible) for the provincial and city inventories. The uncertainties of the national inventory were estimated at -17-37% (expressed as 95% confidence intervals, CIs), -21-35%, -19-34%, -29-40%, -22-47%, -21-54%, -33-84%, and -32-92% for SO2, NOX, CO, TSP (total suspended particles), PM10, PM2.5, black carbon (BC), and organic carbon (OC) emissions respectively for the whole country. At provincial and city levels, the uncertainties of the corresponding pollutant emissions were estimated at -15-18%, -18-33%, -16-37%, -20-30%, -23-45%, -26-50%, -33-79%, and -33-71% for Jiangsu, and -17-22%, -10-33%, -23-75%, -19-36%, -23-41%, -28-48%, -45-82%, and -34-96% for Nanjing, respectively. Emission factors (or associated parameters) were identified as the biggest contributors to the uncertainties of emissions for most source categories except iron & steel production in the national inventory. Compared to the national one, the uncertainties of total emissions in the provincial and city-scale inventories were not significantly reduced for most species, with the exception of SO2. For power and other industrial boilers, the uncertainties were reduced, and the plant-specific parameters played more important roles in the uncertainties. Much larger PM10 and PM2.5 emissions for Jiangsu were estimated in this provincial inventory than in other studies, implying big discrepancies in the data sources of emission factors and activity data between local and national inventories. Although the uncertainty analysis of bottom-up emission inventories at national and local scales partly supported the "top-down" estimates using observation and/or chemistry transport models, detailed investigations and
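    At sector level, the Monte Carlo procedure described above amounts to repeatedly sampling activity data and emission factors and summing the resulting emissions. A minimal sketch with invented sectors and uncertainty magnitudes (not the study's data) is:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical sectors: (activity data, AD relative sd, emission factor, EF relative sd).
    sectors = {
        "power":          (3.0e8, 0.05, 8.0, 0.20),   # units arbitrary, for illustration
        "iron_and_steel": (7.0e7, 0.10, 15.0, 0.35),
        "cement":         (2.0e8, 0.08, 5.0, 0.30),
    }

    n_mc = 20000
    totals = np.zeros(n_mc)
    for ad, ad_cv, ef, ef_cv in sectors.values():
        # Lognormal sampling keeps values positive; the relative sd is used as the
        # log-space sigma, which is adequate for small-to-moderate uncertainties.
        totals += rng.lognormal(np.log(ad), ad_cv, n_mc) * rng.lognormal(np.log(ef), ef_cv, n_mc)

    nominal = sum(ad * ef for ad, _, ef, _ in sectors.values())   # deterministic estimate
    lo, hi = np.percentile(totals, [2.5, 97.5])
    print(f"95% CI relative to the nominal estimate: "
          f"{(lo / nominal - 1) * 100:+.0f}% to {(hi / nominal - 1) * 100:+.0f}%")
    ```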

  18. Evaluation of uncertainties in benefit-cost studies of electrical power plants. II. Development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant. Final report

    International Nuclear Information System (INIS)

    Sullivan, W.G.

    1977-07-01

    Steam-electric generation plants are evaluated on a benefit-cost basis. Non-economic factors in the development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant are discussed. By comparing monetary costs of a particular power plant assessed in Part 1 with non-monetary values arrived at in Part 2 and using an evaluation procedure developed in this study, a proposed power plant can be selected as a preferred alternative. This procedure enables policymakers to identify the incremental advantages and disadvantages of different power plants in view of their geographic locations. The report presents the evaluation procedure on a task by task basis and shows how it can be applied to a particular power plant. Because of the lack of objective data, it draws heavily on subjectively-derived inputs of individuals who are knowledgeable about the plant being investigated. An abbreviated study at another power plant demonstrated the transferability of the general evaluation procedure. Included in the appendices are techniques for developing scoring functions and a user's manual for the Fortran IV Program

  19. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    KAUST Repository

    Zhang, Xuesong; Liang, Faming; Yu, Beibei; Zong, Ziliang

    2011-01-01

    Estimating the uncertainty of hydrologic forecasting is valuable to water resources management and other relevant decision-making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying the uncertainty of streamflow

  20. Probabilistic Accident Consequence Uncertainty Analysis of the Food Chain Module in the COSYMA Package (invited paper)

    International Nuclear Information System (INIS)

    Brown, J.; Jones, J.A.

    2000-01-01

    This paper describes the uncertainty analysis of the food chain module of COSYMA and the uncertainty distributions on the input parameter values for the food chain model provided by the expert panels that were used for the analysis. Two expert panels were convened, covering the areas of soil and plant transfer processes and transfer to and through animals. The aggregated uncertainty distributions from the experts for the elicited variables were used in an uncertainty analysis of the food chain module of COSYMA. The main aim of the module analysis was to identify those parameters whose uncertainty makes large contributions to the overall uncertainty and so should be included in the overall analysis. (author)

  1. On Hesitant Fuzzy Reducible Weighted Bonferroni Mean and Its Generalized Form for Multicriteria Aggregation

    Directory of Open Access Journals (Sweden)

    Wei Zhou

    2014-01-01

    Full Text Available Due to its convenience and power in dealing with the vagueness and uncertainty of real situations, the hesitant fuzzy set has received increasing attention and has recently become a hot research topic. To differentially process and effectively aggregate hesitant fuzzy information and capture its interrelationships, in this paper we propose the hesitant fuzzy reducible weighted Bonferroni mean (HFRWBM) and present its four prominent characteristics, namely reducibility, monotonicity, boundedness, and idempotency. Then, we further investigate its generalized form, that is, the generalized hesitant fuzzy reducible weighted Bonferroni mean (GHFRWBM). Based on the discussion of model parameters, some special cases of the HFRWBM and GHFRWBM are studied in detail. In addition, to deal with situations in which the criteria are interconnected in hesitant fuzzy information aggregation, a three-step aggregation approach is proposed on the basis of the HFRWBM and GHFRWBM. In the end, we apply the proposed aggregation operators to multicriteria aggregation and give an example to illustrate our results.
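    The HFRWBM builds on the classical Bonferroni mean, BM^{p,q}(a_1, ..., a_n) = ((1/(n(n-1))) Σ_{i≠j} a_i^p a_j^q)^{1/(p+q)}. The sketch below implements only this classical operator as a reference point; the hesitant fuzzy, reducible weighted extensions proposed in the paper act on hesitant fuzzy elements and are not reproduced here.

    ```python
    import numpy as np

    def bonferroni_mean(a, p=1.0, q=1.0):
        """Classical Bonferroni mean BM^{p,q} of non-negative inputs a_1..a_n."""
        a = np.asarray(a, dtype=float)
        n = a.size
        ap, aq = a ** p, a ** q
        # Sum of a_i^p * a_j^q over all ordered pairs with i != j.
        pair_sum = ap.sum() * aq.sum() - (ap * aq).sum()
        return (pair_sum / (n * (n - 1))) ** (1.0 / (p + q))

    # Example: aggregating criteria satisfaction degrees in [0, 1]; the pairwise
    # products are what let the operator capture interrelationships among criteria.
    scores = [0.6, 0.8, 0.4, 0.7]
    print(bonferroni_mean(scores, p=1, q=1))
    print(bonferroni_mean(scores, p=2, q=1))
    ```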

  2. Uncertainty in soil-structure interaction analysis arising from differences in analytical techniques

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Chen, J.C.; Johnson, J.J.

    1982-07-01

    This study addresses uncertainties arising from variations in different modeling approaches to soil-structure interaction of massive structures at a nuclear power plant. To perform a comprehensive systems analysis, it is necessary to quantify, for each phase of the traditional analysis procedure, both the realistic seismic response and the uncertainties associated with them. In this study two linear soil-structure interaction techniques were used to analyze the Zion, Illinois nuclear power plant: a direct method using the FLUSH computer program and a substructure approach using the CLASSI family of computer programs. In-structure response from two earthquakes, one real and one synthetic, was compared. Structure configurations from relatively simple to complicated multi-structure cases were analyzed. The resulting variations help quantify uncertainty in structure response due to analysis procedures

  3. Uncertainty propagation in a multiscale model of nanocrystalline plasticity

    International Nuclear Information System (INIS)

    Koslowski, M.; Strachan, Alejandro

    2011-01-01

    We characterize how uncertainties propagate across spatial and temporal scales in a physics-based model of nanocrystalline plasticity of fcc metals. Our model combines molecular dynamics (MD) simulations to characterize atomic-level processes that govern dislocation-based plastic deformation with a phase field approach to dislocation dynamics (PFDD) that describes how an ensemble of dislocations evolve and interact to determine the mechanical response of the material. We apply this approach to a nanocrystalline Ni specimen of interest in micro-electromechanical (MEMS) switches. Our approach enables us to quantify how internal stresses that result from the fabrication process affect the properties of dislocations (using MD) and how these properties, in turn, affect the yield stress of the metallic membrane (using the PFDD model). Our predictions show that, for a nanocrystalline sample with small grain size (4 nm), a variation in residual stress of 20 MPa (typical in today's microfabrication techniques) would result in a variation in the critical resolved shear yield stress of approximately 15 MPa, a very small fraction of the nominal value of approximately 9 GPa. - Highlights: → Quantify how fabrication uncertainties affect yield stress in a microswitch component. → Propagate uncertainties in a multiscale model of single crystal plasticity. → Molecular dynamics quantifies how fabrication variations affect dislocations. → Dislocation dynamics relate variations in dislocation properties to yield stress.

  4. Uncertainty in greenhouse-gas emission scenario projections: Experiences from Mexico and South Africa

    DEFF Research Database (Denmark)

    Puig, Daniel

    This report outlines approaches to quantify the uncertainty associated with national greenhouse-gas emission scenario projections. It does so by describing practical applications of those approaches in two countries – Mexico and South Africa. The goal of the report is to promote uncertainty...

  5. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  6. Evaluation of uncertainty sources and propagation from irradiance sensors to PV yield

    OpenAIRE

    Mariottini, Francesco; Gottschalg, Ralph; Betts, Tom; Zhu, Jiang

    2018-01-01

    This work quantifies the uncertainties of a pyranometer. Sensitivity to errors is analysed with respect to the effects generated by adopting different time resolutions. Estimation of the irradiance measurand and its error is extended throughout an annual data set. This study represents an attempt to provide a more exhaustive overview of both systematic (i.e. physical) and random uncertainties in the evaluation of pyranometer measurements. Starting from expanded uncertainty in a monitored ...

  7. Uncertainties in extreme precipitation under climate change conditions

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia

    of adaptation strategies, but these changes are subject to uncertainties. The focus of this PhD thesis is the quantification of uncertainties in changes in extreme precipitation. It addresses two of the main sources of uncertainty in climate change impact studies: regional climate models (RCMs) and statistical downscaling methods (SDMs). RCMs provide information on climate change at the regional scale. SDMs are used to bias-correct and downscale the outputs of the RCMs to the local scale of interest in adaptation strategies. In the first part of the study, a multi-model ensemble of RCMs from the European ENSEMBLES project was used to quantify the uncertainty in RCM projections over Denmark. Three aspects of the RCMs relevant for the uncertainty quantification were first identified and investigated. These are: the interdependency of the RCMs; the performance in current climate; and the change in the performance

  8. Quantification of Uncertainty in Thermal Building Simulation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Haghighat, F.; Frier, Christian

    In order to quantify uncertainty in thermal building simulation, stochastic modelling is applied to a building model. An application of stochastic differential equations is presented in Part 1, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine

  9. The rules of information aggregation and emergence of collective intelligent behavior

    International Nuclear Information System (INIS)

    Bettencourt, Luis

    2008-01-01

    Information is a peculiar quantity. Unlike matter or energy, the aggregation of knowledge from many individuals can in fact produce more (or less) information than the sum of its parts. We use the formalism of information theory to derive general principles of information aggregation and collective organization, identifying the conditions under which information pooling is synergistic and those under which it is redundant. We then show how several problems of collective cognition and coordination can be understood in terms of the conditions that allow for the minimization of uncertainty (maximization of predictability) under information pooling over many individuals. We discuss in some detail how collective coordination in swarms, markets, language processing and collaborative filtering may be guided by the optimal aggregation of information over many sources and identify circumstances when these processes fail, leading e.g. to inefficient markets. The contrast with approaches that understand coordination and collaboration via traditional decision and game theory is discussed, as well as the incentives for individuals and groups to find optimal information aggregation mechanisms.

  10. Metrology and process control: dealing with measurement uncertainty

    Science.gov (United States)

    Potzick, James

    2010-03-01

    Metrology is often used in designing and controlling manufacturing processes. A product sample is processed, some relevant property is measured, and the process adjusted to bring the next processed sample closer to its specification. This feedback loop can be remarkably effective for the complex processes used in semiconductor manufacturing, but there is some risk involved because measurements have uncertainty and product specifications have tolerances. There is finite risk that good product will fail testing or that faulty product will pass. Standard methods for quantifying measurement uncertainty have been presented, but the question arises: how much measurement uncertainty is tolerable in a specific case? Or, How does measurement uncertainty relate to manufacturing risk? This paper looks at some of the components inside this process control feedback loop and describes methods to answer these questions.
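    One way to relate measurement uncertainty to manufacturing risk, as asked above, is to simulate false-accept and false-reject rates for a given process spread, specification tolerance and gauge uncertainty. All numbers in the sketch below are made up for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    target, process_sd = 100.0, 2.0      # hypothetical true feature-size distribution (nm)
    lsl, usl = 95.0, 105.0               # specification limits (nm)
    u_meas = 1.0                         # standard measurement uncertainty (nm)

    n = 1_000_000
    true = rng.normal(target, process_sd, n)
    measured = true + rng.normal(0.0, u_meas, n)

    in_spec_true = (true >= lsl) & (true <= usl)
    in_spec_meas = (measured >= lsl) & (measured <= usl)

    false_accept = np.mean(~in_spec_true & in_spec_meas)   # faulty product passes test
    false_reject = np.mean(in_spec_true & ~in_spec_meas)   # good product fails test
    print(f"false accept: {false_accept:.4f}, false reject: {false_reject:.4f}")
    ```

    Repeating the simulation for different values of u_meas shows directly how much measurement uncertainty a given tolerance and acceptable risk level can accommodate.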

  11. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with completed descriptions of test programs and theoretical studies that have been conducted on the ΔP, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement of measurement of differential pressure

  12. Preventing disulfide bond formation weakens non-covalent forces among lysozyme aggregates.

    Directory of Open Access Journals (Sweden)

    Vijay Kumar Ravi

    Full Text Available Nonnative disulfide bonds have been observed among protein aggregates in several diseases, such as amyotrophic lateral sclerosis and cataract. The molecular mechanism by which formation of such bonds promotes protein aggregation is poorly understood. In this work we employ the previously well-characterized aggregation of hen egg-white lysozyme (HEWL) at alkaline pH to dissect the molecular role of nonnative disulfide bonds in the growth of HEWL aggregates. We employed time-resolved fluorescence anisotropy, atomic force microscopy and single-molecule force spectroscopy to quantify the size, morphology and non-covalent interaction forces among the aggregates, respectively. These measurements were performed under conditions in which disulfide bond formation was allowed (control) and, alternatively, in which it was prevented by alkylation of free thiols using iodoacetamide. Blocking disulfide bond formation affected the growth but not the growth kinetics of the aggregates, which were reduced in volume by ∼50%, flatter in the vertical dimension and non-fibrillar in comparison to the control. Interestingly, single-molecule force spectroscopy data revealed that preventing disulfide bond formation weakened the non-covalent interaction forces among monomers in the aggregate by at least ten fold, thereby stalling their growth and yielding smaller aggregates in comparison to the control. We conclude that while constrained protein chain dynamics in correctly disulfide-bonded amyloidogenic proteins may protect them from venturing into partially folded conformations that can trigger entry into aggregation pathways, aberrant disulfide bonds in non-amyloidogenic proteins (like HEWL), on the other hand, may strengthen non-covalent intermolecular forces among monomers and promote their aggregation.

  13. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project

  14. Probability and uncertainty in nuclear safety decisions

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1986-01-01

    In this paper, we examine some problems posed by the use of probabilities in Nuclear Safety decisions. We discuss some of the theoretical difficulties due to the collective nature of regulatory decisions, and, in particular, the calibration and the aggregation of risk information (e.g., experts opinions). We argue that, if one chooses numerical safety goals as a regulatory basis, one can reduce the constraints to an individual safety goal and a cost-benefit criterion. We show the relevance of risk uncertainties in this kind of regulatory framework. We conclude that, whereas expected values of future failure frequencies are adequate to show compliance with economic constraints, the use of a fractile (e.g., 95%) to be specified by the regulatory agency is justified to treat hazard uncertainties for the individual safety goal. (orig.)

  15. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications

    Science.gov (United States)

    Wu, Stephen

    can capture the uncertainties in EEW information and the decision process is used. This approach is called the Performance-Based Earthquake Early Warning, which is based on the PEER Performance-Based Earthquake Engineering method. Use of surrogate models is suggested to improve computational efficiency. Also, new models are proposed to add the influence of lead time into the cost-benefit analysis. For example, a value of information model is used to quantify the potential value of delaying the activation of a mitigation action for a possible reduction of the uncertainty of EEW information in the next update. Two practical examples, evacuation alert and elevator control, are studied to illustrate the ePAD framework. Potential advanced EEW applications, such as the case of multiple-action decisions and the synergy of EEW and structural health monitoring systems, are also discussed.

  16. Incorporating uncertainty in predictive species distribution modelling.

    Science.gov (United States)

    Beale, Colin M; Lennon, Jack J

    2012-01-19

    Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.

  17. An index for quantifying flocking behavior.

    Science.gov (United States)

    Quera, Vicenç; Herrando, Salvador; Beltran, Francesc S; Salas, Laura; Miñano, Meritxell

    2007-12-01

    One of the classic research topics in adaptive behavior is the collective displacement of groups of organisms such as flocks of birds, schools of fish, herds of mammals, and crowds of people. However, most agent-based simulations of group behavior do not provide a quantitative index for determining the point at which the flock emerges. An index of the aggregation of moving individuals in a flock was developed, and an example was provided of how it can be used to quantify the degree to which a group of moving individuals actually forms a flock.
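    The paper's specific index is not reproduced here, but two simple, commonly used aggregation measures, group polarization and mean nearest-neighbour distance, can be computed from agent positions and velocities as in the sketch below (arena size, agent count and noise levels are arbitrary).

    ```python
    import numpy as np

    def polarization(velocities):
        """Norm of the mean heading; 1 means all agents move in the same direction."""
        headings = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
        return np.linalg.norm(headings.mean(axis=0))

    def mean_nearest_neighbour_distance(positions):
        """Average distance from each agent to its closest neighbour."""
        diff = positions[:, None, :] - positions[None, :, :]
        dists = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dists, np.inf)
        return dists.min(axis=1).mean()

    rng = np.random.default_rng(3)
    pos = rng.random((50, 2)) * 10.0                    # 50 agents in a 10 x 10 arena
    vel = rng.normal([1.0, 0.2], 0.3, size=(50, 2))     # loosely aligned headings
    print("polarization:", round(polarization(vel), 3))
    print("mean NN distance:", round(mean_nearest_neighbour_distance(pos), 3))
    ```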

  18. Nuclear-data uncertainty propagations in burnup calculation for the PWR assembly

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • DRAGON 5.0 and NECP-CACTI have been implemented in UNICORN. • The effects of different neutronics methods on S&U results were quantified. • Uncertainty analysis has been applied to the burnup calculation of a PWR assembly. • The uncertainties of the eigenvalue and few-group constants have been quantified. - Abstract: In this paper, our home-developed lattice code NECP-CACTI has been implemented into our UNICORN code to perform sensitivity and uncertainty analysis for lattice calculations. The verified multigroup cross-section perturbation model and the methods of sensitivity and uncertainty analysis are established and applied to the different lattice codes in UNICORN. As DRAGON 5.0 and NECP-CACTI are now both available for lattice calculations in UNICORN, the effects of different neutronics methods (including methods for the neutron-transport and resonance self-shielding calculations) on the results of the sensitivity and uncertainty analysis were studied. Based on NECP-CACTI, uncertainty analysis using the statistical sampling method has been applied to the burnup calculation for the fresh-fueled TMI-1 assembly, propagating the nuclear-data uncertainties to k_∞ and the two-group constants of the lattice calculation with depletion. The results show that different methods for the neutron-transport calculation introduce no differences into the results of the sensitivity and uncertainty analysis, while different methods for the resonance self-shielding calculation do affect the results. With depletion of the TMI-1 assembly, the relative uncertainty of k_∞ varies between 0.45% and 0.60%; for the two-group constants, the largest variation, between 0.35% and 2.56%, occurs for νΣ_f,2. Moreover, the most significant contributors to the uncertainty of k_∞ and the two-group constants, which vary with depletion, are determined.

  19. Flocculation kinetics and aggregate structure of kaolinite mixtures in laminar tube flow.

    Science.gov (United States)

    Vaezi G, Farid; Sanders, R Sean; Masliyah, Jacob H

    2011-03-01

    Flocculation is commonly used in various solid-liquid separation processes in chemical and mineral industries to separate desired products or to treat waste streams. This paper presents an experimental technique to study flocculation processes in laminar tube flow. This approach allows for more realistic estimation of the shear rate to which an aggregate is exposed, as compared to more complicated shear fields (e.g. stirred tanks). A direct sampling method is used to minimize the effect of sampling on the aggregate structure. A combination of aggregate settling velocity and image analysis was used to quantify the structure of the aggregate. Aggregate size, density, and fractal dimension were found to be the most important aggregate structural parameters. The two methods used to determine aggregate fractal dimension were in good agreement. The effects of advective flow through an aggregate's porous structure and transition-regime drag coefficient on the evaluation of aggregate density were considered. The technique was applied to investigate the flocculation kinetics and the evolution of the aggregate structure of kaolin particles with an anionic flocculant under conditions similar to those of oil sands fine tailings. Aggregates were formed using a well controlled two-stage aggregation process. Detailed statistical analysis was performed to investigate the establishment of dynamic equilibrium condition in terms of aggregate size and density evolution. An equilibrium steady state condition was obtained within 90 s of the start of flocculation; after which no further change in aggregate structure was observed. Although longer flocculation times inside the shear field could conceivably cause aggregate structure conformation, statistical analysis indicated that this did not occur for the studied conditions. The results show that the technique and experimental conditions employed here produce aggregates having a well-defined, reproducible structure. Copyright © 2011

  20. Uncertainties in historical pollution data from sedimentary records from an Australian urban floodplain lake

    Science.gov (United States)

    Lintern, A.; Leahy, P.; Deletic, A.; Heijnis, H.; Zawadzki, A.; Gadd, P.; McCarthy, D.

    2018-05-01

    Sediment cores from aquatic environments can provide valuable information about historical pollution levels and sources. However, there is little understanding of the uncertainties associated with these findings. The aim of this study is to fill this knowledge gap by proposing a framework for quantifying the uncertainties in historical heavy metal pollution records reconstructed from sediment cores. This uncertainty framework consists of six sources of uncertainty: uncertainties in (1) metals analysis methods, (2) spatial variability of sediment core heavy metal profiles, (3) sub-sampling intervals, (4) the sediment chronology, (5) the assumption that metal levels in bed sediments reflect the magnitude of metal inputs into the aquatic system, and (6) post-depositional transformation of metals. We apply this uncertainty framework to an urban floodplain lake in South-East Australia (Willsmere Billabong). We find that for this site, uncertainties in historical dated heavy metal profiles can be up to 176%, largely due to uncertainties in the sediment chronology, and in the assumption that the settled heavy metal mass is equivalent to the heavy metal mass entering the aquatic system. As such, we recommend that future studies reconstructing historical pollution records using sediment cores from aquatic systems undertake an investigation of the uncertainties in the reconstructed pollution record, using the uncertainty framework provided in this study. We envisage that quantifying and understanding the uncertainties associated with the reconstructed pollution records will facilitate the practical application of sediment core heavy metal profiles in environmental management projects.

  1. Bifurcation Analysis with Aerodynamic-Structure Uncertainties by the Nonintrusive PCE Method

    Directory of Open Access Journals (Sweden)

    Linpeng Wang

    2017-01-01

    Full Text Available An aeroelastic model for an airfoil with third-order stiffness in both the pitch and plunge degrees of freedom (DOF) and the modified Leishman–Beddoes (LB) model were built and validated. The nonintrusive polynomial chaos expansion (PCE) based on a tensor product is applied to quantify the effect of uncertainty in the aerodynamic and structural parameters on the aerodynamic force and aeroelastic behavior. The uncertain limit cycle oscillation (LCO) and bifurcation are simulated in the time domain with the stochastic PCE method. Bifurcation diagrams with uncertainties were quantified. The Monte Carlo simulation (MCS) is also applied for comparison. From the current work, it can be concluded that the nonintrusive polynomial chaos expansion gives acceptable accuracy with much higher computational efficiency than MCS. For the aerodynamic model, uncertainties in the aerodynamic parameters affect the aerodynamic force significantly at the stage from separation to stall on the upstroke and at the stage from stall to reattachment on the return. For the aeroelastic model, uncertainties in both the aerodynamic and structural parameters affect the bifurcation position, and the bifurcation is more sensitive to uncertainty in the structural parameters. When nonlinear stall flutter and bifurcation are of concern, more attention should be paid to the aerodynamic separation process and to the structural parameters of the pitch DOF.
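    A non-intrusive PCE of the kind used above can be built by projecting model outputs, evaluated at quadrature nodes, onto orthogonal polynomials. The sketch below does this for a single standard-normal input using probabilists' Hermite polynomials; the toy response stands in for the aeroelastic model, and the multi-dimensional tensor-product construction of the paper is omitted.

    ```python
    import numpy as np
    from numpy.polynomial import hermite_e as H
    from math import factorial, sqrt, pi

    def response(xi):
        """Toy response standing in for the aeroelastic model output."""
        return np.exp(0.3 * xi) + 0.1 * xi ** 2

    order = 6                                    # highest polynomial degree retained
    nodes, weights = H.hermegauss(order + 1)     # Gauss-Hermite(E) nodes, weight e^{-x^2/2}
    f_nodes = response(nodes)

    # PCE coefficients: c_n = E[f(X) He_n(X)] / n!  for X ~ N(0, 1).
    coeffs = np.empty(order + 1)
    for n in range(order + 1):
        basis = H.hermeval(nodes, np.eye(order + 1)[n])   # He_n at the quadrature nodes
        coeffs[n] = (weights * f_nodes * basis).sum() / (sqrt(2 * pi) * factorial(n))

    mean_pce = coeffs[0]
    var_pce = sum(coeffs[n] ** 2 * factorial(n) for n in range(1, order + 1))

    # Cross-check against plain Monte Carlo simulation (MCS).
    xi_mc = np.random.default_rng(0).standard_normal(200_000)
    print(f"PCE  mean {mean_pce:.4f}  var {var_pce:.4f}")
    print(f"MCS  mean {response(xi_mc).mean():.4f}  var {response(xi_mc).var():.4f}")
    ```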

  2. Estimating uncertainty and its temporal variation related to global climate models in quantifying climate change impacts on hydrology

    Science.gov (United States)

    Shen, Mingxi; Chen, Jie; Zhuan, Meijia; Chen, Hua; Xu, Chong-Yu; Xiong, Lihua

    2018-01-01

    Uncertainty estimation of climate change impacts on hydrology has received much attention in the research community. The choice of a global climate model (GCM) is usually considered the largest contributor to the uncertainty of climate change impacts. The temporal variation of GCM uncertainty needs to be investigated for making long-term decisions to deal with climate change. Accordingly, this study investigated the temporal variation (mainly long-term) of uncertainty related to the choice of a GCM in predicting climate change impacts on hydrology by using multiple GCMs over multiple continuous future periods. Specifically, twenty CMIP5 GCMs under RCP4.5 and RCP8.5 emission scenarios were adopted to adequately represent this uncertainty envelope, and fifty-one 30-year future periods moving from 2021 to 2100 at a 1-year interval were produced to express the temporal variation. Future climatic and hydrological regimes over all future periods were compared to those in the reference period (1971-2000) using a set of metrics, including mean and extremes. The periodicity of climatic and hydrological changes and their uncertainty were analyzed using wavelet analysis, while the trend was analyzed using the Mann-Kendall trend test and regression analysis. The results showed that both future climate change (precipitation and temperature) and hydrological response predicted by the twenty GCMs were highly uncertain, and the uncertainty increased significantly over time. For example, the change of mean annual precipitation increased from 1.4% in 2021-2050 to 6.5% in 2071-2100 for RCP4.5 in terms of the multi-model median value, but the projected uncertainty reached 21.7% in 2021-2050 and 25.1% in 2071-2100 for RCP4.5. The uncertainty under a high emission scenario (RCP8.5) was much larger than that under a relatively low emission scenario (RCP4.5). Almost all climatic and hydrological regimes and their uncertainty did not show significant periodicity at the P = .05 significance
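    The temporal variation of GCM-related uncertainty described above can be summarised by evaluating the multi-model spread in each moving 30-year window. The sketch below uses synthetic projections in place of the twenty CMIP5 GCM series; the trend and noise values are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    years = np.arange(2021, 2101)
    n_gcm = 20

    # Synthetic annual precipitation-change series (%) per pseudo-GCM:
    # a model-dependent trend plus interannual noise (placeholders for real projections).
    trends = rng.normal(0.10, 0.08, n_gcm)                 # % change per year
    change = trends[:, None] * (years - 2021) + rng.normal(0.0, 3.0, (n_gcm, years.size))

    window = 30
    starts = np.arange(2021, 2101 - window + 1)            # fifty-one 30-year windows
    for start in starts[::25]:                             # print a few windows only
        sel = (years >= start) & (years < start + window)
        per_gcm = change[:, sel].mean(axis=1)              # 30-year mean change per GCM
        spread = np.percentile(per_gcm, 97.5) - np.percentile(per_gcm, 2.5)
        print(f"{start}-{start + window - 1}: median {np.median(per_gcm):+.1f}%, "
              f"95% multi-model spread {spread:.1f}%")
    ```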

  3. Uncertainty analysis of 137Cs and 90Sr activity in borehole water from a waste disposal site

    International Nuclear Information System (INIS)

    Dafauti, Sunita; Pulhani, Vandana; Datta, D.; Hegde, A.G.

    2005-01-01

    Uncertainty quantification (UQ) is the quantitative characterization and use of uncertainty in experimental applications. There are two distinct types of uncertainty: variability, which can be quantified in principle using classical probability theory, and lack of knowledge, which requires more than classical probability theory for its quantification. Fuzzy set theory was applied to quantify the second type of uncertainty associated with the measurement of activity due to 137 Cs and 90 Sr present in bore-well water samples from a waste disposal site. The upper and lower limits of concentration were computed and it may be concluded from the analysis that the alpha cut technique of fuzzy set theory is a good nonprecise estimator of these types of bounds. (author)
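    The alpha-cut technique mentioned above propagates lack-of-knowledge uncertainty as nested intervals: each input is cut at a membership level alpha and the resulting intervals are pushed through the calculation. The sketch below uses triangular fuzzy inputs and a generic activity-concentration formula; both are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np
    from itertools import product

    def alpha_cut(tri, alpha):
        """Interval [lo, hi] of a triangular fuzzy number (low, mode, high) at level alpha."""
        low, mode, high = tri
        return low + alpha * (mode - low), high - alpha * (high - mode)

    def propagate(func, fuzzy_inputs, alphas):
        """Vertex method: evaluate func at every combination of interval endpoints."""
        cuts = []
        for a in alphas:
            intervals = [alpha_cut(t, a) for t in fuzzy_inputs]
            values = [func(*combo) for combo in product(*intervals)]
            cuts.append((a, min(values), max(values)))  # valid when func is monotone per input
        return cuts

    # Hypothetical fuzzy inputs: net count rate (cps), detection efficiency, sample volume (L).
    net_rate   = (0.80, 1.00, 1.25)
    efficiency = (0.28, 0.30, 0.32)
    volume     = (0.95, 1.00, 1.05)

    activity = lambda r, e, v: r / (e * v)               # illustrative activity formula (Bq/L)
    for a, lo, hi in propagate(activity, [net_rate, efficiency, volume], [0.0, 0.5, 1.0]):
        print(f"alpha = {a:.1f}: activity in [{lo:.2f}, {hi:.2f}] Bq/L")
    ```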

  4. Quantifying the uncertainty in discharge data using hydraulic knowledge and uncertain gaugings: a Bayesian method named BaRatin

    Science.gov (United States)

    Le Coz, Jérôme; Renard, Benjamin; Bonnifait, Laurent; Branger, Flora; Le Boursicaud, Raphaël; Horner, Ivan; Mansanarez, Valentin; Lang, Michel; Vigneau, Sylvain

    2015-04-01

    River discharge is a crucial variable for Hydrology: as the output variable of most hydrologic models, it is used for sensitivity analyses, model structure identification, parameter estimation, data assimilation, prediction, etc. A major difficulty stems from the fact that river discharge is not measured continuously. Instead, discharge time series used by hydrologists are usually based on simple stage-discharge relations (rating curves) calibrated using a set of direct stage-discharge measurements (gaugings). In this presentation, we present a Bayesian approach (cf. Le Coz et al., 2014) to build such hydrometric rating curves, to estimate the associated uncertainty and to propagate this uncertainty to discharge time series. The three main steps of this approach are described: (1) Hydraulic analysis: identification of the hydraulic controls that govern the stage-discharge relation, identification of the rating curve equation and specification of prior distributions for the rating curve parameters; (2) Rating curve estimation: Bayesian inference of the rating curve parameters, accounting for the individual uncertainties of available gaugings, which often differ according to the discharge measurement procedure and the flow conditions; (3) Uncertainty propagation: quantification of the uncertainty in discharge time series, accounting for both the rating curve uncertainties and the uncertainty of recorded stage values. The rating curve uncertainties combine the parametric uncertainties and the remnant uncertainties that reflect the limited accuracy of the mathematical model used to simulate the physical stage-discharge relation. In addition, we also discuss current research activities, including the treatment of non-univocal stage-discharge relationships (e.g. due to hydraulic hysteresis, vegetation growth, sudden change of the geometry of the section, etc.). An operational version of the BaRatin software and its graphical interface are made available free of charge on
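    Step (2), the Bayesian estimation of the rating curve from uncertain gaugings, can be sketched for a single power-law control Q = a(h - b)^c with a random-walk Metropolis sampler. The gaugings, priors and error model below are placeholders; the actual BaRatin method derives priors from the hydraulic analysis, handles multiple controls and includes a remnant-uncertainty term.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical gaugings: stage h (m), discharge Q (m3/s), gauging standard uncertainty.
    h_obs = np.array([0.6, 0.9, 1.3, 1.8, 2.4, 3.1])
    q_obs = np.array([2.1, 6.0, 14.5, 31.0, 58.0, 101.0])
    q_sd = 0.07 * q_obs                                  # ~7% relative uncertainty per gauging

    def log_post(theta):
        log_a, b, c = theta
        if b >= h_obs.min() or not (1.0 < c < 3.0):      # crude priors / support constraints
            return -np.inf
        q_mod = np.exp(log_a) * (h_obs - b) ** c
        return -0.5 * np.sum(((q_obs - q_mod) / q_sd) ** 2)

    theta = np.array([np.log(5.0), 0.2, 1.7])            # initial guess for (log a, b, c)
    step = np.array([0.05, 0.02, 0.05])                  # random-walk step sizes
    chain, lp = [], log_post(theta)
    for _ in range(20000):
        prop = theta + rng.normal(0.0, step)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    chain = np.array(chain[5000:])                       # discard burn-in

    # Parametric rating-curve uncertainty, propagated to discharge at a given stage.
    h_new = 2.0
    q_post = np.exp(chain[:, 0]) * (h_new - chain[:, 1]) ** chain[:, 2]
    print("Q(h = 2.0 m) percentiles [2.5, 50, 97.5]:",
          np.percentile(q_post, [2.5, 50, 97.5]).round(1))
    ```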

  5. Probing size-dependent electrokinetics of hematite aggregates

    Energy Technology Data Exchange (ETDEWEB)

    Kedra-Królik, Karolina; Rosso, Kevin M.; Zarzycki, Piotr

    2017-02-01

    Aqueous particle suspensions of many kinds are stabilized by the electrostatic potential developed at their surfaces from reaction with water and ions. An important and less well understood aspect of this stabilization is the dependence of the electrostatic surface potential on particle size. Surface electrostatics are typically probed by measuring particle electrophoretic mobilities and quantified in the electrokinetic potential (ζ), using commercially available Zeta Potential Analyzers (ZPA). Even though ZPAs provide frequency-spectra (histograms) of electrophoretic mobility and hydrodynamic diameter, typically only the maximal-intensity values are reported, despite the information in the remainder of the spectra. Here we propose a mapping procedure that inter-correlates these histograms to extract additional insight, in this case to probe particle size-dependent electrokinetics. Our method is illustrated for a suspension of prototypical iron(III) oxide (hematite, α-Fe2O3). We found that the electrophoretic mobility and ζ-potential are linear functions of the aggregate size. By analyzing the distribution of surface site types as a function of aggregate size we show that site coordination increases with increasing aggregate diameter. This observation explains why the acidity of the iron oxide particles decreases with increasing particle size.

  6. Aggregation of carbon dioxide sequestration storage assessment units

    Science.gov (United States)

    Blondes, Madalyn S.; Schuenemeyer, John H.; Olea, Ricardo A.; Drew, Lawrence J.

    2013-01-01

    The U.S. Geological Survey is currently conducting a national assessment of carbon dioxide (CO2) storage resources, mandated by the Energy Independence and Security Act of 2007. Pre-emission capture and storage of CO2 in subsurface saline formations is one potential method to reduce greenhouse gas emissions and the negative impact of global climate change. Like many large-scale resource assessments, the area under investigation is split into smaller, more manageable storage assessment units (SAUs), which must be aggregated with correctly propagated uncertainty to the basin, regional, and national scales. The aggregation methodology requires two types of data: marginal probability distributions of storage resource for each SAU, and a correlation matrix obtained by expert elicitation describing interdependencies between pairs of SAUs. Dependencies arise because geologic analogs, assessment methods, and assessors often overlap. The correlation matrix is used to induce rank correlation, using a Cholesky decomposition, among the empirical marginal distributions representing individually assessed SAUs. This manuscript presents a probabilistic aggregation method tailored to the correlations and dependencies inherent to a CO2 storage assessment. Aggregation results must be presented at the basin, regional, and national scales. A single stage approach, in which one large correlation matrix is defined and subsets are used for different scales, is compared to a multiple stage approach, in which new correlation matrices are created to aggregate intermediate results. Although the single-stage approach requires determination of significantly more correlation coefficients, it captures geologic dependencies among similar units in different basins and it is less sensitive to fluctuations in low correlation coefficients than the multiple stage approach. Thus, subsets of one single-stage correlation matrix are used to aggregate to basin, regional, and national scales.
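    The rank-correlation step described above can be sketched with the usual Iman-Conover-style construction: draw correlated normal scores using a Cholesky factor of the elicited correlation matrix, then reorder each SAU's marginal sample so its ranks match the scores. The marginal distributions and correlation coefficients below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(13)
    n = 100_000

    # Empirical marginal storage-resource samples for three hypothetical SAUs (arbitrary units).
    sau = np.column_stack([
        rng.lognormal(mean=3.0, sigma=0.5, size=n),
        rng.lognormal(mean=2.5, sigma=0.7, size=n),
        rng.lognormal(mean=3.4, sigma=0.4, size=n),
    ])

    # Expert-elicited correlation matrix among the SAUs (illustrative values).
    corr = np.array([[1.0, 0.6, 0.3],
                     [0.6, 1.0, 0.4],
                     [0.3, 0.4, 1.0]])

    # Induce rank correlation: correlated normal scores -> reorder each marginal by score rank.
    scores = rng.standard_normal((n, 3)) @ np.linalg.cholesky(corr).T
    reordered = np.empty_like(sau)
    for j in range(sau.shape[1]):
        ranks = scores[:, j].argsort().argsort()         # rank of each score
        reordered[:, j] = np.sort(sau[:, j])[ranks]      # marginal preserved, order changed

    basin_total = reordered.sum(axis=1)                  # aggregate to the basin scale
    print("P5 / P50 / P95 of aggregated storage:",
          np.percentile(basin_total, [5, 50, 95]).round(1))
    ```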

  7. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.

  8. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading which causes them to deform. Uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  9. Best Practices of Uncertainty Estimation for the National Solar Radiation Database (NSRDB 1998-2015): Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    It is essential to apply a traceable and standard approach to determine the uncertainty of solar resource data. Solar resource data are used for all phases of solar energy conversion projects, from the conceptual phase to routine solar power plant operation, and to determine performance guarantees of solar energy conversion systems. These guarantees are based on the available solar resource derived from a measurement station or a modeled data set such as the National Solar Radiation Database (NSRDB). Therefore, quantifying the uncertainty of these data sets provides confidence to financiers, developers, and site operators of solar energy conversion systems and ultimately reduces deployment costs. In this study, we implemented the Guide to the Expression of Uncertainty in Measurement (GUM) to quantify the overall uncertainty of the NSRDB data. First, we quantify the measurement uncertainty, then we determine each uncertainty statistic of the NSRDB data, and finally we combine them using the root-sum-of-squares method. The statistics were derived by comparing the NSRDB data to seven measurement stations from the National Oceanic and Atmospheric Administration's Surface Radiation Budget Network, the National Renewable Energy Laboratory's Solar Radiation Research Laboratory, and the Atmospheric Radiation Measurement program's Southern Great Plains Central Facility in Billings, Oklahoma. The evaluation was conducted for hourly values, daily totals, monthly mean daily totals, and annual mean monthly mean daily totals. Varying the averaging period helps capture the temporal uncertainty of the modeled solar resource data required for each phase of a solar energy project; some phases require higher temporal resolution than others. Overall, by including the uncertainty of measurements of solar radiation made at ground stations, bias, and root mean square error, the NSRDB data demonstrated an expanded uncertainty of 17 percent to 29 percent on hourly
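
    As a rough illustration of the combination step, the sketch below computes bias and root mean square error of a modeled series against station measurements, combines them with an assumed measurement uncertainty by root-sum-of-squares, and reports an expanded uncertainty with a coverage factor of 2. All values are hypothetical placeholders, not NSRDB results.

        import numpy as np

        # Hypothetical hourly irradiance values (W/m^2): modeled vs. ground station.
        modeled  = np.array([412.0, 655.0, 823.0, 590.0, 301.0])
        measured = np.array([430.0, 640.0, 810.0, 620.0, 310.0])

        u_meas = 0.05 * measured.mean()              # assumed station measurement uncertainty
        bias   = np.mean(modeled - measured)         # systematic difference
        rmse   = np.sqrt(np.mean((modeled - measured) ** 2))

        # Root-sum-of-squares combination and expanded uncertainty (k = 2).
        u_combined = np.sqrt(u_meas ** 2 + bias ** 2 + rmse ** 2)
        print(bias, rmse, u_combined, 2.0 * u_combined)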

  10. Multiple and Periodic Measurement of RBC Aggregation and ESR in Parallel Microfluidic Channels under On-Off Blood Flow Control

    Directory of Open Access Journals (Sweden)

    Yang Jun Kang

    2018-06-01

    Full Text Available Red blood cell (RBC) aggregation alters hemodynamic behavior in low flow-rate regions of post-capillary venules. Additionally, it is significantly elevated in inflammatory or pathophysiological conditions. In this study, multiple and periodic measurements of RBC aggregation and erythrocyte sedimentation rate (ESR) are suggested by sucking blood from a pipette tip into parallel microfluidic channels and quantifying image intensity, all within a single experiment. Here, a microfluidic device was prepared from a master mold using the xurography technique rather than micro-electro-mechanical-system fabrication techniques. In order to consider variations of RBC aggregation in microfluidic channels due to continuous ESR in the conical pipette tip, two indices, the aggregation index (AI) and the erythrocyte-sedimentation-rate aggregation index (EAI), are evaluated using temporal variations of microscopic, image-based intensity. The proposed method is employed to evaluate the effect of hematocrit and dextran solution on RBC aggregation under continuous ESR in the conical pipette tip. As a result, EAI displays a significantly linear relationship with the modified conventional ESR measurement obtained by quantifying time constants. In addition, EAI varies linearly over a specific range of dextran solution concentrations. In conclusion, the proposed method is able to measure RBC aggregation under continuous ESR in the conical pipette tip. Furthermore, the method provides multiple data on RBC aggregation and ESR from a single experiment. A future study will involve employing the proposed method to evaluate biophysical properties of blood samples collected from patients with cardiovascular diseases.

  11. A Bayesian approach for quantification of model uncertainty

    International Nuclear Information System (INIS)

    Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.

    2010-01-01

    In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into the prediction of a system response. A nonlinear vibration system is used to demonstrate the process of implementing the adjustment factor approach. Finally, the methodology is applied to the prediction of the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of the model prediction.
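
    One common additive form of the adjustment factor approach corrects the best model's prediction by an adjustment whose mean and variance follow from the spread of the competing models' predictions, weighted by their posterior model probabilities. The sketch below illustrates that form with hypothetical predictions and probabilities; it is not the paper's implementation.

        import numpy as np

        # Hypothetical predictions of the same response from three competing models
        # and their (already normalized) posterior model probabilities.
        predictions = np.array([102.0, 110.0, 95.0])
        model_prob  = np.array([0.5, 0.3, 0.2])

        y_best = predictions[np.argmax(model_prob)]   # prediction of the most probable model

        # Additive adjustment factor: mean and variance follow from the spread of the
        # competing predictions around the best model, weighted by model probability.
        mean_adj = np.sum(model_prob * (predictions - y_best))
        var_adj  = np.sum(model_prob * (predictions - y_best) ** 2) - mean_adj ** 2

        y_adjusted = y_best + mean_adj                 # adjusted system response
        print(y_adjusted, np.sqrt(var_adj))            # prediction and its standard deviation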

  12. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor allocates a much lower share of wealth to stocks compared to a standard investor.

  13. Potential of vehicle-to-grid ancillary services considering the uncertainties in plug-in electric vehicle availability and service/localization limitations in distribution grids

    International Nuclear Information System (INIS)

    Sarabi, Siyamak; Davigny, Arnaud; Courtecuisse, Vincent; Riffonneau, Yann; Robyns, Benoît

    2016-01-01

    Highlights: • The availability uncertainty of PEVs is modelled using a Gaussian mixture model. • Interdependency of stochastic variables is modelled using a copula function. • V2G bidding capacity is calculated using the Free Pattern Search optimization method. • Localization limitation is considered for V2G service potential assessment. • Competitive services for fleets of V2G-enabled PEVs are identified using fuzzy sets. - Abstract: The aim of the paper is to propose an approach for statistical assessment of the potential of plug-in electric vehicles (PEVs) for vehicle-to-grid (V2G) ancillary services, focusing on PEVs used for daily home-work commuting. In this approach, the possible ancillary services (A/S) for each PEV fleet are identified in terms of its available V2G power (AVP) and flexible intervals. The flexible interval is calculated using a stochastic global optimization technique called “Free Pattern Search” (FPS). A probabilistic method is also proposed to quantify the impact of PEV availability uncertainty on the AVP of each fleet using a Gaussian mixture model (GMM), and the interdependency of the stochastic variables is captured through multivariate modeling with a copula function. Each fleet is analyzed based on its aggregated PEV numbers at different levels of the distribution grid, in order to satisfy the ancillary services localization limitation. A case study using the proposed approach evaluates the real potential in Niort, a city in the west of France. In fact, by using the proposed approach an aggregator can analyze the V2G potential of the PEVs under its contract.
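
    As a hedged illustration of the GMM step only: the sketch below fits a two-component Gaussian mixture to hypothetical home-arrival times of commuting PEVs and estimates the probability that a vehicle is plugged in by a given hour. The data, component count, and threshold are illustrative assumptions, not the Niort case study.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)

        # Hypothetical home-arrival times (hours of day) for commuting PEVs,
        # standing in for the availability data an aggregator would collect.
        arrivals = np.concatenate([rng.normal(17.5, 0.8, 300), rng.normal(19.0, 0.5, 150)])

        # Fit a two-component Gaussian mixture to the arrival-time distribution.
        gmm = GaussianMixture(n_components=2, random_state=0).fit(arrivals.reshape(-1, 1))

        # Estimated probability that a vehicle is already plugged in by 18:00.
        samples = gmm.sample(10_000)[0].ravel()
        print(gmm.means_.ravel(), (samples <= 18.0).mean())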

  14. Uncertainty Detection for NIF Normal Pointing Images

    International Nuclear Information System (INIS)

    Awwal, A S; Law, C; Ferguson, S W

    2007-01-01

    The National Ignition Facility at the Lawrence Livermore National Laboratory, when completed in 2009, will deliver 192 beams aligned precisely at the center of the target chamber, producing extreme energy densities and pressures. Video images of laser beams along the beam path are used by automatic alignment algorithms to determine the position of the beams for alignment purposes. However, noise and other optical effects may affect the accuracy of the calculated beam location. Realistic estimation of the uncertainty is necessary to assure that the beam is monitored within the clear optical path. When the uncertainty is above a certain threshold, the automated alignment operation is suspended and control of the beam is transferred to a human operator. This work describes our effort to quantify the uncertainty of measurement of the most common alignment beam

  15. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    Science.gov (United States)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain, it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the Numpy, Scipy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates. We will discuss
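
    A minimal sketch of this kind of workflow, using the packages named above (Numpy, Scipy.stats, pyDOE): draw a Latin Hypercube sample of the uncertain budget terms, compute the SWGW exchange as the budget residual for each realization, and summarize the spread. The distributions, values, and budget form are illustrative assumptions, not those of the Floral City analysis.

        import numpy as np
        from scipy import stats
        from pyDOE import lhs

        n = 10_000  # number of Monte Carlo realizations

        # Latin Hypercube sample of the uncertain monthly water-budget terms (mm).
        # Distributions are illustrative placeholders, not the study's values.
        unit = lhs(3, samples=n)                                       # uniform [0, 1] design
        rainfall = stats.norm(loc=120.0, scale=15.0).ppf(unit[:, 0])
        et       = stats.norm(loc=90.0,  scale=20.0).ppf(unit[:, 1])
        canal_q  = stats.uniform(loc=5.0, scale=20.0).ppf(unit[:, 2])  # 5-25 mm outflow

        # SWGW exchange as the residual of the budget, with storage change assumed known:
        # dS = rainfall - et - canal_q + swgw  =>  swgw = dS - rainfall + et + canal_q
        storage_change = 10.0
        swgw = storage_change - rainfall + et + canal_q

        print(np.percentile(swgw, [5, 50, 95]))   # uncertainty band of the residual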

  16. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  17. Aggregation and fusion of modified low density lipoprotein.

    Science.gov (United States)

    Pentikäinen, M O; Lehtonen, E M; Kovanen, P T

    1996-12-01

    In atherogenesis, low density lipoprotein (LDL, diameter 22 nm) accumulates in the extracellular space of the arterial intima in the form of aggregates of lipid droplets (droplet diameter up to 400 nm). Here we studied the effects of various established in vitro LDL modifications on LDL aggregation and fusion. LDL was subjected to vortexing, oxidation by copper ions, proteolysis by alpha-chymotrypsin, lipolysis by sphingomyelinase, and nonenzymatic glycosylation, and was induced to form adducts with malondialdehyde or complexes with anti-apoB-100 antibodies. To assess the amount of enlarged LDL-derived structures formed (due to aggregation or fusion), we measured the turbidity of solutions containing modified LDL, and quantified the proportion of modified LDL that 1) sedimented at low-speed centrifugation (14,000 g), 2) floated at an increased rate at high-speed centrifugation (rate zonal flotation at 285,000 gmax), 3) was excluded in size-exclusion column chromatography (exclusion limit 40 MDa), or 4) failed to enter a 0.5% Fast Lane agarose gel during electrophoresis. To detect whether particle fusion had contributed to the formation of the enlarged LDL-derived structures, particle morphology was examined using negative staining and thin-section transmission electron microscopy. We found that 1) aggregation was induced by the formation of LDL-antibody complexes, malondialdehyde treatment, and glycosylation of LDL; 2) fusion of LDL was induced by proteolysis of LDL by alpha-chymotrypsin; and 3) aggregation and fusion of LDL were induced by vortexing, oxidation by copper ions, and lipolysis of LDL by sphingomyelinase. The various modifications of LDL differed in their ability to induce aggregation and fusion.

  18. Uncertainty Assessment of Space-Borne Passive Soil Moisture Retrievals

    Science.gov (United States)

    Quets, Jan; De Lannoy, Gabrielle; Reichle, Rolf; Cosh, Michael; van der Schalie, Robin; Wigneron, Jean-Pierre

    2017-01-01

    The uncertainty associated with passive soil moisture retrieval is hard to quantify, and known to be underlain by various, diverse, and complex causes. Factors affecting space-borne retrieved soil moisture estimation include: (i) the optimization or inversion method applied to the radiative transfer model (RTM), such as the Single Channel Algorithm (SCA) or the Land Parameter Retrieval Model (LPRM), (ii) the selection of the observed brightness temperatures (Tbs), e.g. polarization and incidence angle, (iii) the definition of the cost function and the impact of prior information in it, and (iv) the RTM parameterization (e.g. the parameterizations officially used by the SMOS L2 and SMAP L2 retrieval products, the ECMWF-based SMOS assimilation product, the SMAP L4 assimilation product, and perturbations from those configurations). This study aims at disentangling the relative importance of the above-mentioned sources of uncertainty by carrying out soil moisture retrieval experiments, using SMOS Tb observations in different settings, some of which are mentioned above. The ensemble uncertainties are evaluated at 11 reference CalVal sites over a time period of more than 5 years. These experimental retrievals were inter-compared, and further confronted with in situ soil moisture measurements and operational SMOS L2 retrievals, using commonly used skill metrics to quantify the temporal uncertainty in the retrievals.

  19. Application of Uncertainty and Sensitivity Analysis to a Kinetic Model for Enzymatic Biodiesel Production

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John

    2014-01-01

    This paper demonstrates the added benefits of using uncertainty and sensitivity analysis in the kinetics of enzymatic biodiesel production. For this study, a kinetic model by Fedosov and co-workers is used. For the uncertainty analysis the Monte Carlo procedure was used to statistically quantify...

  20. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Science.gov (United States)

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
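
    The bottom-up calculation outlined above is simple enough to carry out in a spreadsheet or a few lines of code: combine the relative standard uncertainties of the major components in quadrature, then report an expanded uncertainty with a coverage factor. The component names and values below are hypothetical, chosen only to illustrate the arithmetic.

        import math

        bac = 0.152   # measured blood alcohol concentration, g/100 mL (hypothetical)

        # Hypothetical relative standard uncertainties of the major components
        # of a headspace gas-chromatography blood alcohol measurement.
        components = {
            "calibrator": 0.008,
            "method_reproducibility": 0.015,
            "sampling_and_dilution": 0.010,
            "reference_traceability": 0.005,
        }

        # Combine relative components in quadrature, then report an expanded
        # uncertainty with coverage factor k = 2 (roughly 95 % coverage).
        u_rel = math.sqrt(sum(u ** 2 for u in components.values()))
        U = 2.0 * u_rel * bac
        print(f"{bac:.3f} +/- {U:.3f} g/100 mL (k = 2)")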

  1. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbation but demands computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because the administrative margin of subcriticality has a substantial impact on the economics and safety of nuclear fuel cycle operations, recent interest in reducing it has made uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.

  2. Uncertainty of soil erosion modelling using open source high resolution and aggregated DEMs

    Directory of Open Access Journals (Sweden)

    Arun Mondal

    2017-05-01

    Full Text Available Digital Elevation Model (DEM) is one of the important parameters for soil erosion assessment. Notable uncertainties are observed in this study while using three high resolution open source DEMs. The Revised Universal Soil Loss Equation (RUSLE) model has been applied to assess soil erosion uncertainty using open source DEMs (SRTM, ASTER and CARTOSAT) and their increasing grid spaces (pixel sizes) from the actual resolution. The study area is a part of the Narmada river basin in Madhya Pradesh state, located in the central part of India and covering 20,558 km2. The actual resolution of the DEMs is 30 m, and increased grid spaces of 90, 150, 210, 270 and 330 m are used for this study. Vertical accuracy of the DEMs has been assessed using the actual heights of sample points taken from a planimetric-survey-based map (toposheet). Elevations of the DEMs are converted to the same vertical datum, from WGS 84 to MSL (Mean Sea Level), before the accuracy assessment and modelling. Results indicate that the accuracy of the SRTM DEM, with RMSEs of 13.31, 14.51, and 18.19 m at 30, 150 and 330 m resolution respectively, is better than that of the ASTER and CARTOSAT DEMs. When the grid space of the DEMs increases, the accuracy of the elevation and of the calculated soil erosion decreases. This study presents the potential uncertainty introduced by open source high resolution DEMs in the accuracy of soil erosion assessment models. The research provides an analysis of errors in selecting DEMs using the original and increased grid spaces for soil erosion modelling.
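
    The vertical-accuracy check described above reduces to computing the RMSE of DEM elevations against surveyed heights at a set of check points, repeated for each DEM and each aggregated grid spacing. A small sketch with hypothetical check-point values:

        import numpy as np

        # Hypothetical check points: surveyed (toposheet-derived) elevations vs. DEM
        # elevations sampled at the same locations, in metres above MSL.
        survey_z = np.array([312.4, 298.1, 305.7, 321.0, 289.5])
        dem_z    = np.array([318.0, 290.6, 310.2, 335.4, 280.1])

        def rmse(dem, reference):
            """Root mean square error of DEM elevations against surveyed heights."""
            return float(np.sqrt(np.mean((dem - reference) ** 2)))

        # The same comparison would be repeated for each DEM (SRTM, ASTER, CARTOSAT)
        # and for each aggregated grid spacing (30, 90, ..., 330 m).
        print(rmse(dem_z, survey_z))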

  3. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  4. Addressing the Conflict of Interest between Aggregators and DSOs in Deregulated Energy Markets

    DEFF Research Database (Denmark)

    Heinrich, Carsten; Ziras, Charalampos; You, Shi

    2017-01-01

    This paper investigates potential conflicts of interest between distribution system operators (DSOs) and aggregators. We propose a method to quantify the allowed operating range of residential flexible loads in a local distribution network. The calculated bounds can be used to formulate DSO...

  5. Facing uncertainty in ecosystem services-based resource management.

    Science.gov (United States)

    Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter

    2013-09-01

    The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  7. Soil Aggregate Stability and Grassland Productivity Associations in a Northern Mixed-Grass Prairie.

    Directory of Open Access Journals (Sweden)

    Kurt O Reinhart

    Full Text Available Soil aggregate stability data are often predicted to be positively associated with measures of plant productivity, rangeland health, and ecosystem functioning. Here we revisit the hypothesis that soil aggregate stability is positively associated with plant productivity. We measured local (plot-to-plot) variation in grassland community composition, plant (aboveground) biomass, root biomass, % water-stable soil aggregates, and topography. After accounting for spatial autocorrelation, we observed a negative association between % water-stable soil aggregates (0.25-1 and 1-2 mm size classes of macroaggregates) and dominant graminoid biomass, and negative associations between the % water-stable aggregates and the root biomass of a dominant sedge (Carex filifolia). However, variation in total root biomass (0-10 or 0-30 cm depths) was either negatively or not appreciably associated with soil aggregate stabilities. Overall, regression slope coefficients were consistently negative, thereby indicating the general absence of a positive association between measures of plant productivity and soil aggregate stability for the study area. The predicted positive association between factors was likely confounded by variation in plant species composition. Specifically, sampling spanned a local gradient in plant community composition which was likely driven by niche partitioning along a subtle gradient in elevation. Our results suggest an apparent trade-off between some measures of plant biomass production and soil aggregate stability, both known to affect the land's capacity to resist erosion. These findings further highlight the uncertainty of plant biomass-soil stability associations.

  8. A statistical methodology for quantification of uncertainty in best estimate code physical models

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2007-01-01

    A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions whose accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to coded models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
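
    A rough sketch of the two ingredients named above, with stand-ins for the paper's own estimator: model-error samples from two system conditions are compared with the Kruskal-Wallis test to decide whether they belong to the same cluster, and a non-parametric density estimate (here an off-the-shelf Gaussian KDE rather than the newly developed estimator) summarizes the error distribution within a cluster. All data are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        # Synthetic model "error" samples (measured minus predicted void fraction)
        # from separate-effect tests run under two different system conditions.
        errors_low_flow  = rng.normal(0.02, 0.03, 80)
        errors_high_flow = rng.normal(-0.01, 0.05, 60)

        # Kruskal-Wallis test: do the error samples come from the same population?
        # If not, uncertainty should be quantified per condition (cluster).
        h_stat, p_value = stats.kruskal(errors_low_flow, errors_high_flow)

        # Stand-in non-parametric estimate of the error pdf within one cluster.
        pdf = stats.gaussian_kde(errors_low_flow)
        grid = np.linspace(-0.10, 0.15, 200)
        print(p_value, grid[np.argmax(pdf(grid))])   # test p-value and pdf mode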

  9. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  10. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling based uncertainty analysis was carried out to quantify uncertainty in predictions of the best estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both steady state and transient level by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. Importance analysis was also carried out, and standard rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure
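
    The importance-analysis step can be reproduced in outline by rank-transforming the sampled inputs and the code output, standardizing, and fitting a linear model; the resulting coefficients are rank regression coefficients whose magnitudes order the input parameters by influence. The sketch below uses synthetic inputs and a synthetic response, not the LSTF calculation.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        # Synthetic sampled inputs (3 uncertain parameters) and a code output.
        x = rng.uniform(size=(200, 3))
        y = 2.0 * x[:, 0] - 0.5 * x[:, 1] + 0.1 * rng.normal(size=200)

        # Rank-transform inputs and output, standardize, and fit a linear model:
        # the coefficients are the (standardized) rank regression coefficients.
        rx = np.apply_along_axis(stats.rankdata, 0, x)
        ry = stats.rankdata(y)
        rx = (rx - rx.mean(axis=0)) / rx.std(axis=0)
        ry = (ry - ry.mean()) / ry.std()
        srrc, *_ = np.linalg.lstsq(rx, ry, rcond=None)
        print(srrc)   # larger |coefficient| -> more influential input parameter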

  11. Effect of Uncertainty Parameters in Blowdown and Reflood Models for OPR1000 LBLOCA Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huh, Byung Gil; Jin, Chang Yong; Seul, Kwangwon; Hwang, Taesuk [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2014-05-15

    KINS (Korea Institute of Nuclear Safety) has also performed the audit calculation with the KINS Realistic Evaluation Methodology (KINS-REM) to confirm the validity of the licensee's calculation. In the BEPU method, it is very important to quantify the code and model uncertainty. This is reflected in the requirement for BE calculations in Regulatory Guide 1.157: 'the code and models used are acceptable and applicable to the specific facility over the intended operating range and must quantify the uncertainty in the specific application'. In general, the uncertainty of a model or code should be obtained through comparison with data from relevant integral- and separate-effect tests at different scales. However, it is not easy to determine these kinds of uncertainty because of the difficulty of accurately evaluating the various experiments. Therefore, expert judgment has been used in many cases, even with the limitation that the uncertainty range of important parameters can be wide and inaccurate. In the KINS-REM, six heat transfer parameters in the blowdown phase have been used to consider the uncertainty of models. Recently, the MARS-KS code was modified to consider the uncertainty of five heat transfer parameters in the reflood phase. Accordingly, the uncertainty ranges for the parameters of the reflood models must be determined and the effect of these ranges evaluated. In this study, a large break LOCA (LBLOCA) analysis for OPR1000 was performed to identify the effect of uncertainty parameters in the blowdown and reflood models.

  12. Assessing the strength of soil aggregates produced by two types of organic matter amendments using the ultrasonic energy

    Science.gov (United States)

    Zhu, Zhaolong; minasny, Budiman; Field, Damien; Angers, Denis

    2017-04-01

    The presence of organic matter (OM) is known to stimulate the formation of soil aggregates, but the aggregation strength may vary with the amount and type/quality of OM. Conventionally, the wet sieving method has been used to assess aggregate strength. In this study, we aim to gain insight into the effects of different types of C inputs on aggregate dynamics using quantifiable energy via ultrasonic agitation. A clay soil with an inherently low soil organic carbon (SOC) content was amended with two different sources of organic matter (alfalfa, C:N = 16.7, and barley straw, C:N = 95.6) at different input levels (0, 10, 20, and 30 g C kg-1 soil). The soil's inherent macroaggregates were first destroyed via puddling. The soils were incubated in pots at a moisture content of 70% of field capacity for a period of 3 months. The pots were housed in 1.2 L sealed opaque plastic containers. The CO2 generated during the incubation was captured by a vial of NaOH placed in each of the sealed containers and sampled weekly. At 14, 28, 56, and 84 days, soil samples were collected and the change in aggregation was assessed using a combination of wet sieving and ultrasonic agitation. The relative strength of aggregates exposed to ultrasonic agitation was modelled using the aggregate disruption characteristic curve (ADCC) and the soil dispersion characteristic curve (SDCC). Both the quality and the quantity of the organic matter input influenced the amount of aggregates formed and their relative strength. The MWD of soils amended with alfalfa residues was greater than that of barley straw at lower input rates and early in the incubation. In the longer term, the use of ultrasonic energy revealed that barley straw resulted in stronger aggregates, especially at higher input rates, despite showing an MWD similar to that of alfalfa. The use of ultrasonic agitation, where we quantify the energy required to liberate and disperse aggregates, allowed us to differentiate the effects of C inputs on the size of

  13. OR14-V-Uncertainty-PD2La Uncertainty Quantification for Nuclear Safeguards and Nondestructive Assay Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Nicholson, Andrew D. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Croft, Stephen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McElroy, Robert Dennis [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The various methods of nondestructive assay (NDA) of special nuclear material (SNM) have applications in nuclear nonproliferation, including detection and identification of illicit SNM at border crossings and quantifying SNM at nuclear facilities for safeguards. No assay method is complete without “error bars,” which provide one way of expressing confidence in the assay result. Consequently, NDA specialists typically provide error bars and also partition total uncertainty into “random” and “systematic” components so that, for example, an error bar can be developed for the total mass estimate in multiple items. Uncertainty Quantification (UQ) for NDA has always been important, but it is recognized that greater rigor is needed and achievable using modern statistical methods.

  14. Value assignment and uncertainty evaluation for single-element reference solutions

    Science.gov (United States)

    Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.

    2018-06-01

    A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.

  15. Synchronization as Aggregation: Cluster Kinetics of Pulse-Coupled Oscillators.

    Science.gov (United States)

    O'Keeffe, Kevin P; Krapivsky, P L; Strogatz, Steven H

    2015-08-07

    We consider models of identical pulse-coupled oscillators with global interactions. Previous work showed that under certain conditions such systems always end up in sync, but did not quantify how small clusters of synchronized oscillators progressively coalesce into larger ones. Using tools from the study of aggregation phenomena, we obtain exact results for the time-dependent distribution of cluster sizes as the system evolves from disorder to synchrony.

  16. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and their threshold values at which damage of the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling and numerical solution algorithms. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is towards best estimate codes with some conservative assumptions, realistic input data, and uncertainty analysis. BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper

  17. An introductory guide to uncertainty analysis in environmental and health risk assessment. Environmental Restoration Program

    International Nuclear Information System (INIS)

    Hammonds, J.S.; Hoffman, F.O.; Bartell, S.M.

    1994-12-01

    This report presents guidelines for evaluating uncertainty in mathematical equations and computer models applied to assess human health and environmental risk. Uncertainty analyses involve the propagation of uncertainty in model parameters and model structure to obtain confidence statements for the estimate of risk and to identify the model components of dominant importance. Uncertainty analyses are required when there is no a priori knowledge about the uncertainty in the risk estimate and when there is a chance that failure to assess uncertainty may lead to the selection of wrong options for risk reduction. Uncertainty analyses are effective when they are conducted in an iterative mode. When the uncertainty in the risk estimate is intolerable for decision-making, additional data are acquired for the dominant model components that contribute most to uncertainty. This process is repeated until the level of residual uncertainty can be tolerated. Analytical and numerical methods for error propagation are presented, along with methods for identifying the most important contributors to uncertainty. Monte Carlo simulation with either Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) is proposed as the most robust method for propagating uncertainty through either simple or complex models. A distinction is made between simulating a stochastically varying assessment endpoint (i.e., the distribution of individual risks in an exposed population) and quantifying uncertainty due to lack of knowledge about a fixed but unknown quantity (e.g., a specific individual, the maximally exposed individual, or the mean, median, or 95%-tile of the distribution of exposed individuals). Emphasis is placed on the need for subjective judgement to quantify uncertainty when relevant data are absent or incomplete

  18. Economic uncertainty and its impact on the Croatian economy

    Directory of Open Access Journals (Sweden)

    Petar Soric

    2017-12-01

    Full Text Available The aim of this paper is to quantify institutional (political and fiscal) and non-institutional uncertainty (economic policy uncertainty, the Economists’ recession index, natural-disaster-related uncertainty, and several disagreement measures). The stated indicators are based on articles from highly popular Croatian news portals, the repository of law amendments (Narodne novine), and Business and Consumer Surveys. We also introduce a composite uncertainty indicator, obtained by the principal components method. The analysis of a structural VAR model of the Croatian economy (both with fixed and time-varying parameters) has shown that a large share of the analysed indicators are significant predictors of economic activity. It is demonstrated that their impact on industrial production is strongest at the onset of a crisis. On the other hand, the influence of fiscal uncertainty exhibits just the opposite tendency. It strengthens with the intensification of economic activity, which partly vindicates the possible use of fiscal expansion as a counter-crisis tool.

  19. The Uncertainty Principle in the Presence of Quantum Memory

    Science.gov (United States)

    Renes, Joseph M.; Berta, Mario; Christandl, Matthias; Colbeck, Roger; Renner, Renato

    2010-03-01

    One consequence of Heisenberg's uncertainty principle is that no observer can predict the outcomes of two incompatible measurements performed on a system to arbitrary precision. However, this implication is invalid if the observer possesses a quantum memory, a distinct possibility in light of recent technological advances. Entanglement between the system and the memory is responsible for the breakdown of the uncertainty principle, as illustrated by the EPR paradox. In this work we present an improved uncertainty principle which takes this entanglement into account. By quantifying uncertainty using entropy, we show that the sum of the entropies associated with incompatible measurements must exceed a quantity which depends on the degree of incompatibility and the amount of entanglement between system and memory. Apart from its foundational significance, the uncertainty principle motivated the first proposals for quantum cryptography, though the possibility of an eavesdropper having a quantum memory rules out using the original version to argue that these proposals are secure. The uncertainty relation introduced here alleviates this problem and paves the way for its widespread use in quantum cryptography.
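
    The relation described above is commonly written in the following form (a paraphrase of the published result, with R and S the two incompatible measurements on system A, B the quantum memory, H(.|.) the conditional von Neumann entropy, and c the maximal overlap of the measurement bases):

        H(R|B) + H(S|B) \ge \log_2 \frac{1}{c} + H(A|B),
        \qquad c = \max_{j,k} \, \lvert \langle \psi_j \vert \phi_k \rangle \rvert^2

    The term \log_2(1/c) captures the incompatibility of the two measurements, while H(A|B), which can be negative for entangled states, captures the entanglement between system and memory.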

  20. Constructing carbon offsets: The obstacles to quantifying emission reductions

    International Nuclear Information System (INIS)

    Millard-Ball, Adam; Ortolano, Leonard

    2010-01-01

    The existing literature generally ascribes the virtual absence of the transport sector from the Clean Development Mechanism (CDM) to the inherent complexity of quantifying emission reductions from mobile sources. We use archival analysis and interviews with CDM decision-makers and experts to identify two additional groups of explanations. First, we show the significance of aspects of the CDM's historical evolution, such as the order in which methodologies were considered and the assignment of expert desk reviewers. Second, we highlight inconsistencies in the treatment of uncertainty across sectors. In contrast to transport methodologies, other sectors are characterized by a narrow focus on sources of measurement uncertainty and a neglect of economic effects ('market leakages'). We do not argue that the rejection of transport methodologies was unjustified, but rather that many of the same problems are inherent in other sectors. Thus, the case of transport sheds light on fundamental problems in quantifying emission reductions under the CDM. We argue that a key theoretical attraction of the CDM, the equalization of marginal abatement costs across all sectors, has been difficult to achieve in practice.

  1. Fly ash as a binder in aggregate base courses

    International Nuclear Information System (INIS)

    Zenieris, P.; Laguros, J.G.

    1988-01-01

    The benefit of adding up to 35 wt% Class C high calcium fly ash to various types of fine and coarse aggregate pavement mixes is described and quantified. The mixes, which were compacted to maximum dry density at optimum moisture content, had variable compressive strengths during the first 28 days of curing; after that they assumed a relatively uniform pattern of strength gain, reaching values as high as 11 MPa (1600 psi). Mixes containing 15% fly ash gave unacceptably low strengths. XRD measurements indicated massive formation of ettringite, transforming to monosulfoaluminate and the poorly crystallized hydrated phases C-A-H, C-A-S-H and C-S-H. This transformation helps explain the gain in strength of the mixes with extended curing. SEM observations depicted progressive packing and densification of the skeletal matrix as the hexagonal phases and C-S-H gained higher crystallinity and formed aggregate masses. Furthermore, these observations suggest that fly ash acts predominantly as a chemical binder and partly as a filler in the aggregate mixes tested

  2. Modelling and propagation of uncertainties in the German Risk Study

    International Nuclear Information System (INIS)

    Hofer, E.; Krzykacz, B.

    1982-01-01

    Risk assessments are generally subject to uncertainty considerations. This is because of the various estimates that are involved. The paper points out those estimates in the so-called phase A of the German Risk Study, for which uncertainties were quantified. It explains the probabilistic models applied in the assessment and their impact on the findings of the study. Finally, the resulting subjective confidence intervals of the study results are presented and their sensitivity to these probabilistic models is investigated

  3. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    Science.gov (United States)

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    The compressor performance characteristics evaluation process, based on the measurement of pressure, temperature and other quantities, is examined to find uncertainties for directly measured and derived quantities. CFD is used as a tool to quantify the influence of different sources of measurement uncertainty for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  4. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  5. Uncertainty visualization in HARDI based on ensembles of ODFs

    KAUST Repository

    Jiao, Fangxiang; Phillips, Jeff M.; Gur, Yaniv; Johnson, Chris R.

    2012-01-01

    In this paper, we propose a new and accurate technique for uncertainty analysis and uncertainty visualization based on fiber orientation distribution function (ODF) glyphs, associated with high angular resolution diffusion imaging (HARDI). Our visualization applies volume rendering techniques to an ensemble of 3D ODF glyphs, which we call SIP functions of diffusion shapes, to capture their variability due to underlying uncertainty. This rendering elucidates the complex heteroscedastic structural variation in these shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes, which is consistent across all noise levels, the certain volume ratio. Our uncertainty analysis and visualization framework is then applied to synthetic data, as well as to HARDI human-brain data, to study the impact of various image acquisition parameters and background noise levels on the diffusion shapes. © 2012 IEEE.

  7. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    Energy Technology Data Exchange (ETDEWEB)

    Han, Tae Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be directly applied. However, in the case of using a self-shielded multi-group cross section, the covariance data should be corrected considering the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of the implicit uncertainty analysis module into the code, and the numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for a VHTR or LWR where the resonance self-shielding effect should be significantly considered.
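
    The generalized-perturbation-theory machinery of MUSAD is not reproduced in the abstract; the explicit uncertainty it refers to is conventionally obtained with the "sandwich rule", combining a sensitivity vector with a multi-group covariance matrix. The sketch below shows only that generic rule, with invented three-group numbers, and does not model the implicit (self-shielding) correction that is the subject of the paper.

        import numpy as np

        def sandwich_uncertainty(sensitivities, covariance):
            """Relative k-eff uncertainty from the sandwich rule: var = S C S^T."""
            S = np.asarray(sensitivities)
            C = np.asarray(covariance)
            return np.sqrt(S @ C @ S.T)

        # Invented 3-group relative sensitivities and relative covariance matrix.
        S = np.array([0.10, 0.25, 0.40])
        C = np.array([[4.0e-4, 1.0e-4, 0.0],
                      [1.0e-4, 2.5e-4, 5.0e-5],
                      [0.0,    5.0e-5, 1.0e-4]])
        print(f"relative k-eff uncertainty: {sandwich_uncertainty(S, C):.4%}")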

  8. Uncertainty Analysis with Considering Resonance Self-shielding Effect

    International Nuclear Information System (INIS)

    Han, Tae Young

    2016-01-01

    If infinitely diluted multi-group cross sections were used for the sensitivity, the covariance data from the evaluated nuclear data library (ENDL) can be directly applied. However, in the case of using a self-shielded multi-group cross section, the covariance data should be corrected considering the self-shielding effect. Usually, implicit uncertainty can be defined as the uncertainty change caused by the resonance self-shielding effect as described above. MUSAD (Modules of Uncertainty and Sensitivity Analysis for DeCART) has been developed for multiplication factor and cross section uncertainty analysis based on the generalized perturbation theory; however, it can only quantify the explicit uncertainty from the self-shielded multi-group cross sections without considering the implicit effect. Thus, this paper addresses the implementation of the implicit uncertainty analysis module into the code, and the numerical results for the verification are provided. The implicit uncertainty analysis module has been implemented into MUSAD based on the infinitely diluted cross-section-based consistent method. The verification calculation was performed on MHTGR 350 Ex.I-1a, and the differences with the McCARD results decrease from 40% to 1% in the CZP case and 3% in the HFP case. From this study, it is expected that the MUSAD code can reasonably produce the complete uncertainty for a VHTR or LWR where the resonance self-shielding effect should be significantly considered.

  9. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    International Nuclear Information System (INIS)

    Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  10. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
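
    The propagation step described above, drawing samples from elicited distributions and running them through the Gaussian plume model, can be illustrated generically. The sketch below uses a textbook ground-level, centerline plume formula and lognormal stand-ins for the elicited dispersion parameters; it is not the MACCS/COSYMA implementation.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000

        # Stand-in "elicited" distributions for dispersion parameters (illustrative only).
        sigma_y = rng.lognormal(mean=np.log(200.0), sigma=0.4, size=n)   # m
        sigma_z = rng.lognormal(mean=np.log(80.0), sigma=0.5, size=n)    # m
        u = rng.normal(5.0, 1.0, size=n).clip(0.5)                       # wind speed, m/s

        Q, H = 1.0, 50.0  # unit release rate (Bq/s) and effective release height (m)

        # Ground-level, centerline concentration with ground reflection.
        chi = Q / (np.pi * u * sigma_y * sigma_z) * np.exp(-H**2 / (2.0 * sigma_z**2))

        print("median air concentration:", np.median(chi))
        print("5th-95th percentile range:", np.percentile(chi, [5, 95]))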

  11. Temporal dynamics for soil aggregates determined using X-ray CT scanning

    DEFF Research Database (Denmark)

    Garbout, Amin; Munkholm, Lars Juhl; Hansen, Søren Baarsgaard

    2013-01-01

    Soil structure plays a key role in the ability of soil to fulfil essential soil functions and services in relation to e.g. root growth, gas and water transport and organic matter turnover. However, soils are not a very easy object to study as they are highly complex and opaque to the human eye. Traditionally, they have been studied using invasive or destructive techniques. The advantage of using X-ray computed tomography (CT) in soil morphology is that it enables non-destructive quantification of soil structure in three dimensions (3D). The prime objective of the present study was to characterize soil aggregate properties such as volume, surface area and sphericity based on 3D images. We tested the methods on aggregates from different treatments and quantified changes over time. A total of 32 collections of aggregates, enclosed in mesocosms, were incubated in soil to follow the structural changes over...
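
    The sphericity mentioned above is commonly computed from CT-derived volume and surface area using the Wadell definition (surface area of an equal-volume sphere divided by the measured surface area); whether the study used exactly this definition is not stated in the abstract, so the snippet below is only an illustration.

        import numpy as np

        def sphericity(volume, surface_area):
            """Wadell sphericity: area of an equal-volume sphere / measured surface area."""
            return np.pi ** (1.0 / 3.0) * (6.0 * volume) ** (2.0 / 3.0) / surface_area

        # A sphere has sphericity 1; a unit cube about 0.806.
        print(sphericity(volume=1.0, surface_area=6.0))                     # cube
        r = 2.0
        print(sphericity(4.0 / 3.0 * np.pi * r**3, 4.0 * np.pi * r**2))     # sphere -> 1.0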

  12. Separating the contributions of variability and parameter uncertainty in probability distributions

    International Nuclear Information System (INIS)

    Sankararaman, S.; Mahadevan, S.

    2013-01-01

    This paper proposes a computational methodology to quantify the individual contributions of variability and distribution parameter uncertainty to the overall uncertainty in a random variable. Even if the distribution type is assumed to be known, sparse or imprecise data leads to uncertainty about the distribution parameters. If uncertain distribution parameters are represented using probability distributions, then the random variable can be represented using a family of probability distributions. The family of distributions concept has been used to obtain qualitative, graphical inference of the contributions of natural variability and distribution parameter uncertainty. The proposed methodology provides quantitative estimates of the contributions of the two types of uncertainty. Using variance-based global sensitivity analysis, the contributions of variability and distribution parameter uncertainty to the overall uncertainty are computed. The proposed method is developed at two different levels; first, at the level of a variable whose distribution parameters are uncertain, and second, at the level of a model output whose inputs have uncertain distribution parameters
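
    A bare-bones numerical illustration of separating the two contributions, using a nested Monte Carlo loop and the law of total variance rather than the paper's full variance-based sensitivity methodology; the distributions are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        n_outer, n_inner = 2_000, 2_000

        # Epistemic uncertainty about the distribution parameters (invented example):
        mu_samples = rng.normal(10.0, 0.5, size=n_outer)                 # uncertain mean
        sigma_samples = rng.gamma(shape=4.0, scale=0.5, size=n_outer)    # uncertain std dev

        cond_means = np.empty(n_outer)
        cond_vars = np.empty(n_outer)
        for i, (mu, sigma) in enumerate(zip(mu_samples, sigma_samples)):
            x = rng.normal(mu, sigma, size=n_inner)   # aleatory variability given parameters
            cond_means[i] = x.mean()
            cond_vars[i] = x.var()

        # Law of total variance: Var(X) = E[Var(X|theta)] + Var(E[X|theta])
        variability_part = cond_vars.mean()
        parameter_part = cond_means.var()
        total = variability_part + parameter_part
        print(f"share from natural variability:   {variability_part / total:.2%}")
        print(f"share from parameter uncertainty: {parameter_part / total:.2%}")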

  13. Wind energy: Overcoming inadequate wind and modeling uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kane, Vivek

    2010-09-15

    'Green Energy' is the call of the day, and the significance of Wind Energy can never be overemphasized. But the key question here is - What if the wind resources are inadequate? Studies reveal that the probability of finding favorable wind at a given place on land is only 15%. Moreover, there are inherent uncertainties associated with the wind business. Can we overcome inadequate wind resources? Can we scientifically quantify uncertainty and model it to make business sense? This paper proposes a solution, by way of breakthrough Wind Technologies, combined with advanced tools for Financial Modeling, enabling vital business decisions.

  14. Status of uncertainty assessment in k0-NAA measurement. Anything still missing?

    International Nuclear Information System (INIS)

    Borut Smodis; Tinkara Bucar

    2014-01-01

    Several approaches to quantifying measurement uncertainty in k0-based neutron activation analysis (k0-NAA) are reviewed, comprising the original approach, the spreadsheet approach, the dedicated computer program involving analytical calculations and the two k0-NAA programs available on the market. Two imperfections in the dedicated programs are identified, their impact assessed and possible improvements presented for a concrete experimental situation. The status of uncertainty assessment in k0-NAA is discussed and steps for improvement are recommended. It is concluded that the present magnitude of measurement uncertainty should further be improved by making additional efforts in reducing the uncertainties of the relevant nuclear constants used. (author)

  15. Quantifying uncertainties in radar forward models through a comparison between CloudSat and SPartICus reflectivity factors

    Science.gov (United States)

    Mascio, Jeana; Mace, Gerald G.

    2017-02-01

    Interpretations of remote sensing measurements collected in sample volumes containing ice-phase hydrometeors are very sensitive to assumptions regarding the distributions of mass with ice crystal dimension, otherwise known as mass-dimensional or m-D relationships. How these microphysical characteristics vary in nature is highly uncertain, resulting in significant uncertainty in algorithms that attempt to derive bulk microphysical properties from remote sensing measurements. This uncertainty extends to radar reflectivity factors forward calculated from model output because the statistics of the actual m-D relationships in nature are not known. To investigate the variability in m-D relationships in cirrus clouds, reflectivity factors measured by CloudSat are combined with particle size distributions (PSDs) collected by coincident in situ aircraft by using an optimal estimation-based (OE) retrieval of the m-D power law. The PSDs were collected by 12 flights of the Stratton Park Engineering Company Learjet during the Small Particles in Cirrus campaign. We find that no specific habit emerges as preferred; instead, the microphysical characteristics of ice crystal populations tend to be distributed over a continuum, defying simple categorization. With the uncertainties derived from the OE algorithm, the uncertainties in forward-modeled backscatter cross section and, in turn, radar reflectivity are calculated by using a bootstrapping technique, allowing us to infer the uncertainties in forward-modeled radar reflectivity that would be appropriately applied to remote sensing simulator algorithms.
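
    The bootstrapping step, resampling the data, refitting the m-D power law and propagating the spread into a forward-modeled reflectivity, can be caricatured as follows. The synthetic particle data and the Rayleigh-like reflectivity proxy are illustrative stand-ins, not the paper's forward model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic "observed" particles: diameters (cm) and masses (g) around m = a D^b.
        D = rng.uniform(0.01, 0.3, size=300)
        true_a, true_b = 0.005, 2.1
        m_obs = true_a * D**true_b * np.exp(rng.normal(0.0, 0.25, size=D.size))

        def fit_power_law(D, m):
            b, log_a = np.polyfit(np.log(D), np.log(m), 1)
            return np.exp(log_a), b

        def reflectivity_proxy(a, b, D):
            # Rayleigh-regime proxy: Z ~ sum of particle mass squared (arbitrary units).
            return np.sum((a * D**b) ** 2)

        n_boot = 1000
        z_samples = np.empty(n_boot)
        for k in range(n_boot):
            idx = rng.integers(0, D.size, size=D.size)      # bootstrap resample of particles
            a_k, b_k = fit_power_law(D[idx], m_obs[idx])
            z_samples[k] = reflectivity_proxy(a_k, b_k, D)

        print("relative reflectivity uncertainty (1-sigma):",
              z_samples.std() / z_samples.mean())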

  16. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    an important role. In all three cases, it has been shown that a more detailed, realistic and accurate representation of output uncertainty can be achieved with the proposed methodology than is possible based on an 'expert-opinion' approach. Moreover, the importance of state space partitioning has been clearly brought out, by comparing results with those obtained assuming a single pdf for the entire database. The analysis of the Omega integral test has demonstrated that the drift-flux model's uncertainty remains important even while introducing other representative uncertainties. The developed methodology well retains its advantageous features during consideration of different uncertainty sources. The Peach Bottom turbine trip study represents a valuable demonstration of the applicability of the developed methodology to NPP transient analysis. In this application, the novel density estimator was also employed for estimating the pdf that underlies the uncertainty of the maximum power during the transient. The results obtained have been found to provide more detailed insights than can be gained from the 'classical' approach. Another feature of the turbine trip analysis has been a qualitative study of the impact of possible neutronics cross-section uncertainties on the power calculation. Besides the important influence of the uncertainty in void fraction predictions on the accuracy of the coupled transient's simulation, the uncertainties in neutronics parameters and models can be crucial as well. This points at the need for quantifying uncertainties in neutronics calculations and for aggregating them with those assessed for the thermal-hydraulic phenomena for the simulation of such multi-physics transients.

  17. Quantifying soil carbon loss and uncertainty from a peatland wildfire using multi-temporal LiDAR

    Science.gov (United States)

    Reddy, Ashwan D.; Hawbaker, Todd J.; Wurster, F.; Zhu, Zhiliang; Ward, S.; Newcomb, Doug; Murray, R.

    2015-01-01

    Peatlands are a major reservoir of global soil carbon, yet account for just 3% of global land cover. Human impacts like draining can hinder the ability of peatlands to sequester carbon and expose their soils to fire under dry conditions. Estimating soil carbon loss from peat fires can be challenging due to uncertainty about pre-fire surface elevations. This study uses multi-temporal LiDAR to obtain pre- and post-fire elevations and estimate soil carbon loss caused by the 2011 Lateral West fire in the Great Dismal Swamp National Wildlife Refuge, VA, USA. We also determine how LiDAR elevation error affects uncertainty in our carbon loss estimate by randomly perturbing the LiDAR point elevations and recalculating elevation change and carbon loss, iterating this process 1000 times. We calculated a total loss using LiDAR of 1.10 Tg C across the 25 km2 burned area. The fire burned an average of 47 cm deep, equivalent to 44 kg C/m2, a value larger than the 1997 Indonesian peat fires (29 kg C/m2). Carbon loss via the First-Order Fire Effects Model (FOFEM) was estimated to be 0.06 Tg C. Propagating the LiDAR elevation error to the carbon loss estimates, we calculated a standard deviation of 0.00009 Tg C, equivalent to 0.008% of total carbon loss. We conclude that LiDAR elevation error is not a significant contributor to uncertainty in soil carbon loss under severe fire conditions with substantial peat consumption. However, uncertainties may be more substantial when soil elevation loss is of a similar or smaller magnitude than the reported LiDAR error.
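
    The error-propagation procedure described above (perturb the LiDAR elevations, recompute elevation change and carbon loss, repeat 1000 times) can be sketched as below. The grid size, bulk density, carbon fraction and assumed per-point LiDAR error are placeholders, not values from the study.

        import numpy as np

        rng = np.random.default_rng(7)

        # Placeholder pre- and post-fire elevation grids (m) on 1 m x 1 m cells.
        pre = rng.normal(5.0, 0.2, size=(200, 200))
        post = pre - np.abs(rng.normal(0.47, 0.1, size=pre.shape))   # ~47 cm mean burn depth

        cell_area = 1.0       # m^2
        bulk_density = 0.10   # g/cm^3 -> 100 kg/m^3 (placeholder peat value)
        carbon_frac = 0.5     # carbon mass fraction of peat (placeholder)
        lidar_sigma = 0.15    # assumed per-point LiDAR elevation error (m)

        def carbon_loss_kg(pre, post):
            depth = np.clip(pre - post, 0.0, None)                   # m of soil consumed
            return np.sum(depth * cell_area * bulk_density * 1000.0 * carbon_frac)

        losses = np.empty(1000)
        for i in range(losses.size):
            pre_p = pre + rng.normal(0.0, lidar_sigma, size=pre.shape)
            post_p = post + rng.normal(0.0, lidar_sigma, size=post.shape)
            losses[i] = carbon_loss_kg(pre_p, post_p)

        print("mean loss (kg C):", losses.mean())
        print("std dev from LiDAR error (kg C):", losses.std())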

  18. Uncertainties in criticality analysis which affect the storage and transportation of LWR fuel

    International Nuclear Information System (INIS)

    Napolitani, D.G.

    1989-01-01

    Satisfying the design criteria for subcriticality with uncertainties affects: the capacity of LWR storage arrays, maximum allowable enrichment, minimum allowable burnup and economics of various storage options. There are uncertainties due to: calculational method, data libraries, geometric limitations, modelling bias, the number and quality of benchmarks performed and mechanical uncertainties in the array. Yankee Atomic Electric Co. (YAEC) has developed and benchmarked methods to handle: high density storage rack designs, pin consolidation, low density moderation and burnup credit. The uncertainties associated with such criticality analysis are quantified on the basis of clean criticals, power reactor criticals and intercomparison of independent analysis methods

  19. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008-2011

    Science.gov (United States)

    Gomez-Pelaez, A. J.; Ramos, R.; Gomez-Trueba, V.; Novelli, P. C.; Campo-Hernandez, R.

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization - WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008-2011 CO nocturnal time series, and the mean diurnal cycle by months. We have developed a rigorous uncertainty analysis for carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty taking into consideration four contributing components: uncertainty of the WMO standard gases interpolated over the range of measurement, the uncertainty that takes into account the agreement between the standard gases and the response function used, the uncertainty due to the repeatability of the injections, and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol-1 to 1.66 nmol mol-1, due to improvements in the measurement system. A fifth type of uncertainty we call representation uncertainty is considered when some of the data necessary to compute the temporal mean are absent. Any computed mean has also a propagated uncertainty arising from the uncertainties of the data used to compute the mean. The law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine whether the differences are
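
    When the four contributing components listed above are treated as independent, the combined standard uncertainty follows the usual GUM root-sum-of-squares rule. A minimal sketch with invented component values in nmol/mol:

        import numpy as np

        def combined_standard_uncertainty(components):
            """Root-sum-of-squares of independent standard uncertainty components (GUM)."""
            return float(np.sqrt(np.sum(np.square(components))))

        # Invented values standing in for the four contributions: standard-gas interpolation,
        # response-function agreement, injection repeatability, and temporal consistency.
        u_components = [0.9, 0.8, 0.7, 0.6]
        print(f"combined standard uncertainty: "
              f"{combined_standard_uncertainty(u_components):.2f} nmol/mol")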

  20. Uncertainties in risk assessment and decision making

    International Nuclear Information System (INIS)

    Starzec, Peter; Purucker, Tom; Stewart, Robert

    2008-02-01

    The general concept for risk assessment in accordance with the Swedish model for contaminated soil implies that the toxicological reference value for a given receptor is first back-calculated to a corresponding concentration of a compound in soil and (if applicable) then modified with respect to e.g. background levels, acute toxicity, and a factor of safety. This results in a guideline value that is subsequently compared to the observed concentration levels. Many sources of uncertainty exist when assessing whether the risk for a receptor is significant or not. In this study, the uncertainty aspects have been addressed from three standpoints: 1. Uncertainty in the comparison between the level of contamination (source) and a given risk criterion (e.g. a guideline value) and possible implications on subsequent decisions. This type of uncertainty is considered to be most important in situations where a contaminant is expected to be spatially heterogeneous without any tendency to form isolated clusters (hotspots) that can be easily delineated, i.e. where mean values are appropriate to compare to the risk criterion. 2. Uncertainty in the spatial distribution of a contaminant. Spatial uncertainty should be accounted for when hotspots are to be delineated and the volume of soil contaminated with levels above a stated decision criterion has to be assessed (quantified). 3. Uncertainty in an ecological exposure model with regard to the moving pattern of a receptor in relation to the spatial distribution of the contaminant in question. The study points out that the choice of methodology to characterize the relation between contaminant concentration and a pre-defined risk criterion is governed by a conceptual perception of the contaminant's spatial distribution and also depends on the structure of collected data (observations). How uncertainty in the transition from contaminant concentration into risk criterion can be quantified was demonstrated by applying hypothesis tests and the concept of
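
    One standard form of the hypothesis test mentioned above is a one-sided, one-sample t-test of the site mean (here on log-transformed concentrations) against a guideline value. The data and guideline below are invented; this is a generic illustration, not the study's procedure.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)

        guideline = 50.0                                              # mg/kg, hypothetical criterion
        conc = rng.lognormal(mean=np.log(40.0), sigma=0.3, size=25)   # invented site samples

        x = np.log(conc)                                              # work on the log scale
        t_stat = (x.mean() - np.log(guideline)) / (x.std(ddof=1) / np.sqrt(x.size))
        p_value = stats.t.sf(t_stat, df=x.size - 1)                   # H1: site mean exceeds guideline

        print(f"t = {t_stat:.2f}, one-sided p = {p_value:.3f}")
        print("exceedance indicated" if p_value < 0.05 else "no significant exceedance")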

  1. Soil aggregation and slope stability related to soil density, root length, and mycorrhiza

    Science.gov (United States)

    Graf, Frank; Frei, Martin

    2013-04-01

    similar correlations, i.e. that φ' of low density soil material (~15.5 kN/m³) increased by the same amount whether by planting with White Alder or by compaction to ~19.0 kN/m³. Based on this coincidence, the method to quantify soil aggregate stability produced satisfying results, which indicate that soil aggregate stability is a potential proxy for φ' and that the joint impact of mycorrhizal fungi and plant roots increases the resistance against superficial soil failure. It is concluded that soil aggregate stability mirrors biological effects on soil stability reasonably well and may be used as an indicator to quantify the effectiveness of ecological restoration and stabilisation measures.

  2. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
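
    The bootstrap idea sketched above can be illustrated with a toy example: resample a concentration-response curve with replacement, refit a Hill model each time, and read a confidence interval off the resampled parameters. The Hill function, synthetic data and bounds below are generic illustrations, not the actual ToxCast fitting pipeline.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(11)

        def hill(conc, top, ac50, n):
            return top / (1.0 + (ac50 / conc) ** n)

        # Synthetic concentration-response data with noise (illustrative only).
        conc = np.logspace(-2, 2, 9)                       # uM
        resp = hill(conc, top=90.0, ac50=3.0, n=1.2) + rng.normal(0, 8, size=conc.size)

        boot_ac50 = []
        for _ in range(1000):
            idx = rng.integers(0, conc.size, size=conc.size)
            try:
                popt, _ = curve_fit(hill, conc[idx], resp[idx], p0=[100.0, 1.0, 1.0],
                                    bounds=([1.0, 1e-3, 0.1], [1000.0, 1e3, 10.0]))
                boot_ac50.append(popt[1])
            except RuntimeError:
                continue                                    # skip non-converging resamples

        lo, hi = np.percentile(boot_ac50, [2.5, 97.5])
        print(f"AC50 95% bootstrap interval: [{lo:.2f}, {hi:.2f}] uM")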

  3. Uncertainty quantification of CO2 emission reduction for maritime shipping

    International Nuclear Information System (INIS)

    Yuan, Jun; Ng, Szu Hui; Sou, Weng Sut

    2016-01-01

    The International Maritime Organization (IMO) has recently proposed several operational and technical measures to improve shipping efficiency and reduce the greenhouse gases (GHG) emissions. The abatement potentials estimated for these measures have been further used by many organizations to project future GHG emission reductions and plot Marginal Abatement Cost Curves (MACC). However, the abatement potentials estimated for many of these measures can be highly uncertain as many of these measures are new, with limited sea trial information. Furthermore, the abatements obtained are highly dependent on ocean conditions, trading routes and sailing patterns. When the estimated abatement potentials are used for projections, these ‘input’ uncertainties are often not clearly displayed or accounted for, which can lead to overly optimistic or pessimistic outlooks. In this paper, we propose a methodology to systematically quantify and account for these input uncertainties on the overall abatement potential forecasts. We further propose improvements to MACCs to better reflect the uncertainties in marginal abatement costs and total emissions. This approach provides a fuller and more accurate picture of abatement forecasts and potential reductions achievable, and will be useful to policy makers and decision makers in the shipping industry to better assess the cost effective measures for CO 2 emission reduction. - Highlights: • We propose a systematic method to quantify uncertainty in emission reduction. • Marginal abatement cost curves are improved to better reflect the uncertainties. • Percentage reduction probability is given to determine emission reduction target. • The methodology is applied to a case study on maritime shipping.
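
    A bare-bones Monte Carlo version of the aggregation described above, propagating invented per-measure abatement distributions into a distribution for the total reduction and an exceedance probability for a hypothetical target:

        import numpy as np

        rng = np.random.default_rng(9)
        n = 20_000

        # Invented abatement potentials for three measures, % reduction of baseline emissions.
        speed_reduction = rng.triangular(5, 15, 25, size=n)
        hull_coating = rng.triangular(1, 3, 6, size=n)
        weather_routing = rng.triangular(0.5, 2, 5, size=n)

        total_pct = speed_reduction + hull_coating + weather_routing   # assumes additivity
        baseline_mt = 900.0                                            # placeholder baseline, Mt CO2

        reduction_mt = baseline_mt * total_pct / 100.0
        print("median reduction (Mt CO2):", np.round(np.median(reduction_mt), 1))
        print("90% interval:", np.round(np.percentile(reduction_mt, [5, 95]), 1))
        print("P(reduction >= 150 Mt):", np.mean(reduction_mt >= 150.0).round(2))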

  4. Label-free DNA quantification via a 'pipette, aggregate and blot' (PAB) approach with magnetic silica particles on filter paper.

    Science.gov (United States)

    Li, Jingyi; Liu, Qian; Alsamarri, Hussein; Lounsbury, Jenny A; Haversitick, Doris M; Landers, James P

    2013-03-07

    Reliable measurement of DNA concentration is essential for a broad range of applications in biology and molecular biology, and for many of these, quantifying the nucleic acid content is inextricably linked to obtaining optimal results. In its most simplistic form, quantitative analysis of nucleic acids can be accomplished by UV-Vis absorbance and, in a more sophisticated format, by fluorimetry. A recently reported new concept, the 'pinwheel assay', involves a label-free approach for quantifying DNA through aggregation of paramagnetic beads in a rotating magnetic field. Here, we describe a simplified version of that assay adapted for execution using only a pipette and filter paper. The 'pipette, aggregate, and blot' (PAB) approach allows DNA to induce bead aggregation in a pipette tip through exposure to a magnetic field, followed by dispensing (blotting) onto filter paper. The filter paper immortalizes the extent of aggregation, and digital images of the immortalized bead conformation, acquired with either a document scanner or a cell phone camera, allow for DNA quantification using a noncomplex algorithm. Human genomic DNA samples extracted from blood are quantified with the PAB approach and the results utilized to define the volume of sample used in a PCR reaction that is sensitive to input mass of template DNA. Integrating the PAB assay with paper-based DNA extraction and detection modalities has the potential to yield 'DNA quant-on-paper' devices that may be useful for point-of-care testing.

  5. Automation of aggregate characterization using laser profiling and digital image analysis

    Science.gov (United States)

    Kim, Hyoungkwan

    2002-08-01

    Particle morphological properties such as size, shape, angularity, and texture are key properties that are frequently used to characterize aggregates. The characteristics of aggregates are crucial to the strength, durability, and serviceability of the structure in which they are used. Thus, it is important to select aggregates that have proper characteristics for each specific application. Use of improper aggregate can cause rapid deterioration or even failure of the structure. The current standard aggregate test methods are generally labor-intensive, time-consuming, and subject to human errors. Moreover, important properties of aggregates may not be captured by the standard methods due to a lack of an objective way of quantifying critical aggregate properties. Increased quality expectations of products along with recent technological advances in information technology are motivating new developments to provide fast and accurate aggregate characterization. The resulting information can enable a real time quality control of aggregate production as well as lead to better design and construction methods of portland cement concrete and hot mix asphalt. This dissertation presents a system to measure various morphological characteristics of construction aggregates effectively. Automatic measurement of various particle properties is of great interest because it has the potential to solve such problems in manual measurements as subjectivity, labor intensity, and slow speed. The main efforts of this research are placed on three-dimensional (3D) laser profiling, particle segmentation algorithms, particle measurement algorithms, and generalized particle descriptors. First, true 3D data of aggregate particles obtained by laser profiling are transformed into digital images. Second, a segmentation algorithm and a particle measurement algorithm are developed to separate particles and process each particle data individually with the aid of various kinds of digital image

  6. The Method of Manufactured Universes for validating uncertainty quantification methods

    KAUST Repository

    Stripling, H.F.

    2011-09-01

    The Method of Manufactured Universes is presented as a validation framework for uncertainty quantification (UQ) methodologies and as a tool for exploring the effects of statistical and modeling assumptions embedded in these methods. The framework calls for a manufactured reality from which experimental data are created (possibly with experimental error), an imperfect model (with uncertain inputs) from which simulation results are created (possibly with numerical error), the application of a system for quantifying uncertainties in model predictions, and an assessment of how accurately those uncertainties are quantified. The application presented in this paper manufactures a particle-transport universe, models it using diffusion theory with uncertain material parameters, and applies both Gaussian process and Bayesian MARS algorithms to make quantitative predictions about new experiments within the manufactured reality. The results of this preliminary study indicate that, even in a simple problem, the improper application of a specific UQ method or unrealized effects of a modeling assumption may produce inaccurate predictions. We conclude that the validation framework presented in this paper is a powerful and flexible tool for the investigation and understanding of UQ methodologies. © 2011 Elsevier Ltd. All rights reserved.

  7. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    Science.gov (United States)

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decisionmakers can use these results to better evaluate environmental risk as future metal concentrations with a limited range of possibilities, based on a scientific evaluation of uncertainty.

  8. A method for minimum risk portfolio optimization under hybrid uncertainty

    Science.gov (United States)

    Egorova, Yu E.; Yazenin, A. V.

    2018-03-01

    In this paper, we investigate a minimum risk portfolio model under hybrid uncertainty when the profitability of financial assets is described by fuzzy random variables. According to Feng, the variance of a portfolio is defined as a crisp value. To aggregate fuzzy information the weakest (drastic) t-norm is used. We construct an equivalent stochastic problem of the minimum risk portfolio model and specify the stochastic penalty method for solving it.
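
    The weakest (drastic) t-norm referred to above has a simple closed form: it returns one argument when the other equals 1, and 0 otherwise. A direct implementation, independent of the paper's portfolio model:

        def drastic_t_norm(a: float, b: float) -> float:
            """Drastic product: the weakest t-norm on [0, 1]."""
            if a == 1.0:
                return b
            if b == 1.0:
                return a
            return 0.0

        # The drastic t-norm is bounded above by every other t-norm, e.g. the minimum:
        for a, b in [(1.0, 0.4), (0.7, 1.0), (0.7, 0.4)]:
            print(a, b, "->", drastic_t_norm(a, b), "(min would give", min(a, b), ")")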

  9. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping global average temperature increase to below 2°C has emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known, but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), which technologies will be available (technological uncertainty and choices), when we choose to start acting globally on climate change (political choices), and how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  10. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  11. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2009-01-01

    The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately in resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
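
    Entropy measures for non-specificity and conflict are usually defined on Dempster-Shafer basic probability assignments, e.g. the generalized Hartley measure and a Shannon-like discord term. The sketch below implements those two textbook quantities; they may not be the exact formulations extended in the paper.

        import math

        def nonspecificity(m):
            """Generalized Hartley measure: sum of m(A) * log2|A| over focal sets."""
            return sum(mass * math.log2(len(A)) for A, mass in m.items() if mass > 0)

        def discord(m):
            """Shannon-like conflict (discord) measure for a basic probability assignment."""
            total = 0.0
            for A, mA in m.items():
                if mA == 0:
                    continue
                inner = sum(mB * len(A & B) / len(B) for B, mB in m.items() if mB > 0)
                total -= mA * math.log2(inner)
            return total

        # Example BPA over the frame {low, medium, high} risk (illustrative values).
        m = {
            frozenset({"low"}): 0.3,
            frozenset({"medium", "high"}): 0.5,
            frozenset({"low", "medium", "high"}): 0.2,
        }
        print("non-specificity:", nonspecificity(m))
        print("conflict (discord):", discord(m))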

  12. Calculating Remote Sensing Reflectance Uncertainties Using an Instrument Model Propagated Through Atmospheric Correction via Monte Carlo Simulations

    Science.gov (United States)

    Karakoylu, E.; Franz, B.

    2016-01-01

    This is a first attempt at quantifying uncertainties in ocean remote sensing reflectance satellite measurements. The analysis is based on 1000 Monte Carlo iterations; the data source is a SeaWiFS 4-day composite from 2003, and the uncertainty is computed for remote sensing reflectance (Rrs) at 443 nm.

  13. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    Science.gov (United States)

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the role of the presence of turbulence fluctuations on uncertainties (random errors) of acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the role of the presence of flow turbulence fluctuations on uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to the flow turbulence are significant when: (a) a low number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).

  14. The effect of uncertainty and aggregate investments on crude oil price dynamics

    International Nuclear Information System (INIS)

    Tvedt, Jostein

    2002-01-01

    This paper is a study of the dynamics of the oil industry, and we derive a mean-reverting process for the crude oil price. Oil is supplied by a market leader, OPEC, and by an aggregate that represents non-OPEC producers. The non-OPEC producers take the oil price as given. The cost of non-OPEC producers depends on past investments. Shifts in these investments are influenced by costs of structural change in the construction industry. A drop in the oil price to below a given level triggers lower investments, but if the oil price reverts back to a high level, investments may not immediately expand. In an uncertain oil demand environment, the cost of structural change creates a value of waiting to invest. This investment behaviour influences the oil price process.
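
    The mean-reverting price process derived in the paper is not specified in the abstract. A common discrete-time stand-in is an Ornstein-Uhlenbeck process for the log price, simulated below purely to illustrate what mean reversion implies for the price path; all parameter values are invented.

        import numpy as np

        rng = np.random.default_rng(5)

        # Ornstein-Uhlenbeck dynamics for the log oil price (illustrative parameters).
        kappa, theta, sigma = 0.5, np.log(60.0), 0.25   # reversion speed, long-run level, volatility
        dt, n_steps = 1.0 / 252.0, 252 * 5              # daily steps over five years

        log_p = np.empty(n_steps)
        log_p[0] = np.log(30.0)                         # start below the long-run level
        for t in range(1, n_steps):
            shock = sigma * np.sqrt(dt) * rng.standard_normal()
            log_p[t] = log_p[t - 1] + kappa * (theta - log_p[t - 1]) * dt + shock

        price = np.exp(log_p)
        print("start:", price[0].round(2), "end:", price[-1].round(2),
              "long-run level:", round(np.exp(theta), 2))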

  15. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Laboratory; Sisterson, DL [Argonne National Laboratory

    2015-10-01

    The Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily-accessible, well-articulated estimate of ARM measurement uncertainty is needed.

  16. Investigation of laboratory test procedures for assessing the structural capacity of geogrid-reinforced aggregate base materials.

    Science.gov (United States)

    2015-04-01

    The objective of this research was to identify a laboratory test method that can be used to quantify improvements in structural capacity of aggregate base materials reinforced with geogrid. For this research, National Cooperative Highway Research Pro...

  17. Characterisation of a reference site for quantifying uncertainties related to soil sampling

    International Nuclear Information System (INIS)

    Barbizzi, Sabrina; Zorzi, Paolo de; Belli, Maria; Pati, Alessandra; Sansone, Umberto; Stellato, Luisa; Barbina, Maria; Deluisa, Andrea; Menegon, Sandro; Coletti, Valter

    2004-01-01

    An integrated approach to quality assurance in soil sampling remains to be accomplished. - The paper reports a methodology adopted to face problems related to quality assurance in soil sampling. The SOILSAMP project, funded by the Environmental Protection Agency of Italy (APAT), is aimed at (i) establishing protocols for soil sampling in different environments; (ii) assessing uncertainties associated with different soil sampling methods in order to select the 'fit-for-purpose' method; (iii) qualifying, in terms of trace element spatial variability, a reference site for national and international inter-comparison exercises. Preliminary results and considerations are illustrated.

  18. Estimation of uncertainty in pKa values determined by potentiometric titration.

    Science.gov (United States)

    Koort, Eve; Herodes, Koit; Pihl, Viljar; Leito, Ivo

    2004-06-01

    A procedure is presented for estimation of uncertainty in measurement of the pKa of a weak acid by potentiometric titration. The procedure is based on the ISO GUM. The core of the procedure is a mathematical model that involves 40 input parameters. A novel approach is used for taking into account the purity of the acid: the impurities are not treated as inert compounds only; their possible acidic dissociation is also taken into account. Application to an example of practical pKa determination is presented. Altogether 67 different sources of uncertainty are identified and quantified within the example. The relative importance of different uncertainty sources is discussed. The most important source of uncertainty (with the experimental set-up of the example) is the uncertainty of the pH measurement, followed by the accuracy of the burette and the uncertainty of weighing. The procedure gives the uncertainty separately for each point of the titration curve. The uncertainty depends on the amount of titrant added, being lowest in the central part of the titration curve. The possibilities of reducing the uncertainty and interpreting the drift of the pKa values obtained from the same curve are discussed.
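
    The ISO GUM machinery behind such a procedure, combining standard uncertainties of the inputs through first-order sensitivity coefficients, can be illustrated with a generic numerical propagation routine. The toy measurement function below (a Henderson-Hasselbalch-style pKa from pH and a concentration ratio) is only a stand-in for the paper's 40-parameter model.

        import numpy as np

        def gum_combined_uncertainty(f, x, u, rel_step=1e-6):
            """First-order GUM propagation for independent inputs:
            u_c^2 = sum_i (df/dx_i)^2 * u_i^2, with numerical sensitivity coefficients."""
            x = np.asarray(x, dtype=float)
            u = np.asarray(u, dtype=float)
            grads = np.empty_like(x)
            for i in range(x.size):
                h = rel_step * max(abs(x[i]), 1.0)
                xp, xm = x.copy(), x.copy()
                xp[i] += h
                xm[i] -= h
                grads[i] = (f(xp) - f(xm)) / (2.0 * h)
            return float(np.sqrt(np.sum((grads * u) ** 2)))

        # Toy measurement model: pKa = pH + log10([HA]/[A-]).
        f = lambda v: v[0] + np.log10(v[1] / v[2])
        x = [4.75, 0.010, 0.012]       # pH, acid and conjugate-base concentrations (mol/L)
        u = [0.02, 0.0002, 0.0002]     # assumed standard uncertainties of the inputs
        print(f"u(pKa) = {gum_combined_uncertainty(f, x, u):.3f}")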

  19. Optimal Decision Making Framework of an Electric Vehicle Aggregator in Future and Pool markets

    DEFF Research Database (Denmark)

    Rashidizadeh-Kermani, Homa; Najafi, Hamid Reza; Anvari-Moghaddam, Amjad

    2018-01-01

    An electric vehicle (EV) aggregator, as an agent between power producers and EV owners, participates in the future and pool markets to supply the EVs' requirements. Because of the uncertain nature of pool prices and EVs' behavior, this paper proposes a two-stage scenario-based model to obtain optimal decision making of an EV aggregator. To deal with the mentioned uncertainties, the aggregator's risk aversion is modeled using the conditional value at risk (CVaR) method. The proposed two-stage risk-constrained decision making problem is applied to maximize the EV aggregator's expected profit in an uncertain environment. The aggregator can participate in the future and pool markets to buy the required energy of EVs and offer optimal charge/discharge prices to the EV owners. In this model, in order to assess the effects of EV owners' reaction to the aggregator's offered prices on the purchases from electricity markets, a sensitivity analysis over the risk factor is performed. The numerical results demonstrate that with the application of the proposed model, the aggregator can supply EVs with lower purchases from markets.

  20. Dealing with uncertainties in the context of post mining hazard evaluation

    OpenAIRE

    Cauvin , Maxime; Salmon , Romuald; Verdel , Thierry

    2008-01-01

    International audience; Risk analyses related to a past mining activity are generally performed in a context of strong uncertainties. A PhD Thesis was undertaken in 2004 in order to draw up solutions for taking these uncertainties into account in practice. The possibility of elaborating a more quantified evaluation of risk has also been discussed, in particular the contribution that probabilistic methods may bring to an analysis. This paper summarizes the main results of the Thesi...

  1. Large contribution of natural aerosols to uncertainty in indirect forcing

    Science.gov (United States)

    Carslaw, K. S.; Lee, L. A.; Reddington, C. L.; Pringle, K. J.; Rap, A.; Forster, P. M.; Mann, G. W.; Spracklen, D. V.; Woodhouse, M. T.; Regayre, L. A.; Pierce, J. R.

    2013-11-01

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  2. Large contribution of natural aerosols to uncertainty in indirect forcing.

    Science.gov (United States)

    Carslaw, K S; Lee, L A; Reddington, C L; Pringle, K J; Rap, A; Forster, P M; Mann, G W; Spracklen, D V; Woodhouse, M T; Regayre, L A; Pierce, J R

    2013-11-07

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  3. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.

  4. Economic impacts of climate change: Methods of estimating impacts at an aggregate level using Hordaland as an illustration

    International Nuclear Information System (INIS)

    Aaheim, Asbjoern

    2003-01-01

    This report discusses methods for calculating economic impacts of climate change, and uses Hordaland county in Norway as an illustrative example. The calculations are based on estimated climate changes from the RegClim project. This study draws from knowledge of the relationship between economic activity and climate at a disaggregate level and calculates changes in production of and demand for goods and services within aggregate sectors, which are specified in the county budget for Hordaland. Total impacts for the county thus are expressed through known values from the national budget, such as the county's 'national product', total consumption, and investments. The estimates of impacts of climate changes at a disaggregate level in Hordaland are quantified only to a small degree. The calculations made in this report can thus only be considered appropriate for illustrating methods and interpretations. In terms of relative economic significance for the county, however, it is likely that the hydropower sector will be the most affected. Increased precipitation will result in greater production potential, but profitability will largely depend on projected energy prices and investment costs associated with expansion. Agriculture and forestry will increase their production potential, but they are relatively small sectors in the county. Compared with the uncertainty about how climate change will affect production, however, the uncertainty about changes in demand is far greater. The demand for personal transportation and construction in particular can have significant consequences for the county's economy. (author)

  5. Visualizing the uncertainty in the relationship between seasonal average climate and malaria risk.

    Science.gov (United States)

    MacLeod, D A; Morse, A P

    2014-12-02

    Around $1.6 billion per year is spent financing anti-malaria initiatives, and though malaria morbidity is falling, the impact of annual epidemics remains significant. Whilst malaria risk may increase with climate change, projections are highly uncertain, and to sidestep this intractable uncertainty, adaptation efforts should improve societal ability to anticipate and mitigate individual events. Anticipation of climate-related events is made possible by seasonal climate forecasting, from which warnings of anomalous seasonal average temperature and rainfall are possible months in advance. Seasonal climate hindcasts have been used to drive climate-based models for malaria, showing significant skill for observed malaria incidence. However, the relationship between seasonal average climate and malaria risk remains unquantified. Here we explore this relationship, using a dynamic weather-driven malaria model. We also quantify key uncertainty in the malaria model, by introducing variability in one of the first order uncertainties in model formulation. Results are visualized as location-specific impact surfaces: easily integrated with ensemble seasonal climate forecasts, and intuitively communicating quantified uncertainty. Methods are demonstrated for two epidemic regions, and are not limited to malaria modeling; the visualization method could be applied to any climate impact.

  6. Model structures amplify uncertainty in predicted soil carbon responses to climate change.

    Science.gov (United States)

    Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien

    2018-06-04

    Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbially explicit model project much greater uncertainty in response to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts a positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projections. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.

  7. A probabilistic approach to quantify the uncertainties in internal dose assessment using response surface and neural network

    International Nuclear Information System (INIS)

    Baek, M.; Lee, S.K.; Lee, U.C.; Kang, C.S.

    1996-01-01

    A probabilistic approach is formulated to assess the internal radiation exposure following the intake of radioisotopes. This approach consists of four steps: (1) screening, (2) quantification of uncertainties, (3) propagation of uncertainties, and (4) analysis of output. The approach has been applied to Pu-induced internal dose assessment, with a multi-compartment dosimetric model used for internal transport. Surrogate models of the original system are constructed using response surfaces and neural networks, and their results are compared with those of the original model; each surrogate model approximates the original model well. Uncertainty and sensitivity analyses of the model parameters are carried out in this process. Dominant contributors to the dose in each organ are identified, and the results show that this approach could serve as a good tool for assessing internal radiation exposure

  8. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    International Nuclear Information System (INIS)

    Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G

  10. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology; the application process and the main results are briefly described. The application of CIRCE provided satisfactory results, and this research is expected to be useful for improving the present audit calculation methodology, KINS-REM

  11. Axial power monitoring uncertainty in the Savannah River Reactors

    International Nuclear Information System (INIS)

    Losey, D.C.; Revolinski, S.M.

    1990-01-01

    The results of this analysis quantified the uncertainty associated with monitoring the Axial Power Shape (APS) in the Savannah River Reactors. Thermocouples at each assembly flow exit map the radial power distribution and are the primary means of monitoring power in these reactors. The remaining uncertainty in power monitoring is associated with the relative axial power distribution. The APS is monitored by seven sensors that respond to power on each of nine vertical Axial Power Monitor (APM) rods. Computation of the APS uncertainty, for the reactor power limits analysis, started with a large database of APM rod measurements spanning several years of reactor operation. A computer algorithm was used to randomly select a sample of APSs, which were input to a code that modeled the thermal-hydraulic performance of a single fuel assembly during a design-basis Loss-of-Coolant Accident. The assembly power limit at Onset of Significant Voiding was computed for each APS. The output was a distribution of expected assembly power limits that was adjusted to account for the biases caused by instrumentation error and by measuring seven points rather than a continuous APS. Statistical analysis of the final assembly power limit distribution showed that reducing reactor power by approximately 3% was sufficient to account for APS variation. These data confirmed expectations that the assembly exit thermocouples provide all the information needed for monitoring core power. The computational analysis results also quantified the contribution to power limits of the various uncertainties, such as instrumentation error

  12. Groundwater fluxes in a shallow seasonal wetland pond: The effect of bathymetric uncertainty on predicted water and solute balances

    Science.gov (United States)

    Trigg, Mark A.; Cook, Peter G.; Brunner, Philip

    2014-09-01

    The successful management of groundwater-dependent shallow seasonal wetlands requires a sound understanding of groundwater fluxes. However, such fluxes are hard to quantify. Water volume and solute mass balance models can be used to derive an estimate of groundwater fluxes within such systems. This approach is particularly attractive, as it can be undertaken using measurable environmental variables such as rainfall, evaporation, pond level and salinity. Groundwater fluxes estimated from such an approach are subject to uncertainty in the measured variables as well as in the process representation and in parameters within the model. The shallow nature of seasonal wetland ponds means water volume and surface area can change rapidly and non-linearly with depth, requiring an accurate representation of the wetland pond bathymetry. Unfortunately, detailed bathymetry is rarely available and simplifying assumptions regarding the bathymetry have to be made, yet the implications of these assumptions are typically not quantified. We systematically quantify the uncertainty implications of eight different representations of wetland bathymetry for a shallow seasonal wetland pond in South Australia. The predictive uncertainty estimation methods provided in the Model-Independent Parameter Estimation and Uncertainty Analysis software (PEST) are used to quantify the effect of bathymetric uncertainty on the modelled fluxes. We demonstrate that bathymetry can be successfully represented within the model in a simple parametric form using a cubic Bézier curve, allowing an assessment of bathymetric uncertainty due to measurement error and survey detail on the derived groundwater fluxes, compared with the fixed-bathymetry models. Findings show that different bathymetry conceptualisations can result in very different mass balance components and hence process conceptualisations, despite equally good fits to observed data, potentially leading to poor management
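
    The cubic Bézier idea mentioned above can be made concrete with a short sketch. The Python snippet below is illustrative only: the four control points defining a depth-area relationship are hypothetical values, not the pond's surveyed bathymetry or the authors' PEST parameterisation.

      import numpy as np

      def bezier(t, p0, p1, p2, p3):
          """Cubic Bezier curve evaluated at parameter values t in [0, 1]."""
          t = np.asarray(t)[:, None]
          return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
                  + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

      # Hypothetical control points: (depth in m, inundated area in m^2)
      p0, p3 = np.array([0.0, 0.0]), np.array([1.5, 8000.0])    # pond bottom, full pond
      p1, p2 = np.array([0.3, 500.0]), np.array([1.0, 6000.0])  # shape controls (uncertain)

      t = np.linspace(0.0, 1.0, 201)
      depth, area = bezier(t, p0, p1, p2, p3).T

      # Stage-volume relationship obtained by integrating area over depth
      volume = np.concatenate([[0.0],
                               np.cumsum(0.5 * (area[1:] + area[:-1]) * np.diff(depth))])
      print(f"volume at full depth: {volume[-1]:.0f} m^3")

      # Perturbing p1 and p2 within plausible survey error and repeating the calculation
      # is one way to explore how bathymetric uncertainty maps onto storage and hence
      # onto groundwater fluxes inferred from a water balance.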

  13. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    Science.gov (United States)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods and development of visualization tools, and will also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  14. Rydberg aggregates

    Science.gov (United States)

    Wüster, S.; Rost, J.-M.

    2018-02-01

    We review Rydberg aggregates, assemblies of a few Rydberg atoms exhibiting energy transport through collective eigenstates, considering isolated atoms or assemblies embedded within clouds of cold ground-state atoms. We classify Rydberg aggregates, and provide an overview of their possible applications as quantum simulators for phenomena from chemical or biological physics. Our main focus is on flexible Rydberg aggregates, in which atomic motion is an essential feature. In these, simultaneous control over Rydberg-Rydberg interactions, external trapping and electronic energies, allows Born-Oppenheimer surfaces for the motion of the entire aggregate to be tailored as desired. This is illustrated with theory proposals towards the demonstration of joint motion and excitation transport, conical intersections and non-adiabatic effects. Additional flexibility for quantum simulations is enabled by the use of dressed dipole-dipole interactions or the embedding of the aggregate in a cold gas or Bose-Einstein condensate environment. Finally we provide some guidance regarding the parameter regimes that are most suitable for the realization of either static or flexible Rydberg aggregates based on Li or Rb atoms. The current status of experimental progress towards enabling Rydberg aggregates is also reviewed.

  15. CHARACTERIZATION OF BIOGENIC, INTERMEDIATE AND PHYSICOGENIC SOIL AGGREGATES OF AREAS IN THE BRAZILIAN ATLANTIC FOREST

    Directory of Open Access Journals (Sweden)

    JÚLIO CÉSAR FEITOSA FERNANDES

    2017-01-01

    Aggregate formation and stability are related to soil quality, contributing significantly to the carbon storage and nutrient maintenance capacities of the soil. Soil aggregates are formed by two different processes: physicogenic, related to moistening and drying cycles and the input of organic matter; and biogenic, related to the action of macrofauna organisms and roots. The objective of this work was to classify aggregates according to their formation process, quantify and compare organic carbon contents in humic substances, and assess the stability of aggregates formed by different processes, in areas with different coverage in the Mid Paraiba Valley, Pinheiral, State of Rio de Janeiro, Brazil. Aggregated soil samples were collected at a depth of 0-10 cm in a Cambisol (Cambissolo Háplico Tb Distrófico) under four plant covers: secondary forest in advanced (SFAS), medium (SFMS) and initial (SFIS) successional stages, and managed mixed pasture (MMP). Aggregates were classified and identified into three morphological classes (physicogenic, biogenic and intermediate). The variables evaluated were the mean weight diameter (MWD) and geometric mean diameter (GMD) of aggregates, chemical fractions of organic matter, total organic carbon (TOC) and humic substances: humin (C-HUM), humic acid (C-FAH) and fulvic acid (C-FAF). Biogenic aggregates were found in smaller quantities and showed higher TOC, C-HUM and C-FAH compared to intermediate and physicogenic aggregates. Thus, biogenic aggregates have potential to be used as soil quality indicators for structured environments that are able to maintain their intrinsic formation processes.

  16. Uncertainty quantification in lattice QCD calculations for nuclear physics

    Energy Technology Data Exchange (ETDEWEB)

    Beane, Silas R. [Univ. of Washington, Seattle, WA (United States)]; Detmold, William [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)]; Orginos, Kostas [College of William and Mary, Williamsburg, VA (United States); Thomas Jefferson National Accelerator Facility (TJNAF), Newport News, VA (United States)]; Savage, Martin J. [Institute for Nuclear Theory, Seattle, WA (United States)]

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. As a result, we review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics, and discuss how each is quantified in current efforts.

  17. Bayesian Hierarchical Structure for Quantifying Population Variability to Inform Probabilistic Health Risk Assessments.

    Science.gov (United States)

    Shao, Kan; Allen, Bruce C; Wheeler, Matthew W

    2017-10-01

    Human variability is a very important factor considered in human health risk assessment for protecting sensitive populations from chemical exposure. Traditionally, to account for this variability, an interhuman uncertainty factor is applied to lower the exposure limit. However, using a fixed uncertainty factor rather than probabilistically accounting for human variability can hardly support probabilistic risk assessment advocated by a number of researchers; new methods are needed to probabilistically quantify human population variability. We propose a Bayesian hierarchical model to quantify variability among different populations. This approach jointly characterizes the distribution of risk at background exposure and the sensitivity of response to exposure, which are commonly represented by model parameters. We demonstrate, through both an application to real data and a simulation study, that using the proposed hierarchical structure adequately characterizes variability across different populations. © 2016 Society for Risk Analysis.

  18. Information theoretic quantification of diagnostic uncertainty.

    Science.gov (United States)

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
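
    As a toy illustration of the information-theoretic framing (not taken from the essay itself), the Python snippet below applies Bayes' rule to a dichotomous test result and compares the binary entropy of the disease probability before and after the test; the sensitivity, specificity and pre-test probabilities are hypothetical.

      import numpy as np

      def binary_entropy(p):
          """Shannon entropy (bits) of a binary outcome with probability p."""
          p = np.clip(p, 1e-12, 1 - 1e-12)
          return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

      def post_test_probability(pretest, sensitivity, specificity, positive=True):
          """Bayes' rule for a dichotomous test result."""
          if positive:
              num = sensitivity * pretest
              den = num + (1 - specificity) * (1 - pretest)
          else:
              num = (1 - sensitivity) * pretest
              den = num + specificity * (1 - pretest)
          return num / den

      pretest, sens, spec = 0.10, 0.90, 0.80   # hypothetical values
      post = post_test_probability(pretest, sens, spec, positive=True)
      print(f"post-test P(disease) = {post:.3f}")
      print(f"uncertainty before: {binary_entropy(pretest):.3f} bits, "
            f"after: {binary_entropy(post):.3f} bits")

      # A range of pre-test probabilities (rather than a point estimate) can be
      # propagated the same way to see how diagnostic uncertainty responds.
      pretest_range = np.linspace(0.05, 0.30, 6)
      print(binary_entropy(post_test_probability(pretest_range, sens, spec)))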

  19. Quantifying uncertainties in the estimation of safety parameters by using bootstrapped artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Secchi, Piercesare [MOX, Department of Mathematics, Polytechnic of Milan (Italy); Zio, Enrico [Department of Energy, Polytechnic of Milan, Via Ponzio 34/3, 20133 Milano (Italy)], E-mail: enrico.zio@polimi.it; Di Maio, Francesco [Department of Energy, Polytechnic of Milan, Via Ponzio 34/3, 20133 Milano (Italy)

    2008-12-15

    For licensing purposes, safety cases of Nuclear Power Plants (NPPs) must be presented to the Regulatory Authority with the necessary confidence in the models used to describe the plant safety behavior. In principle, this requires the repetition of a large number of model runs to account for the uncertainties inherent in the model description of the true plant behavior. The present paper propounds the use of bootstrapped Artificial Neural Networks (ANNs) for performing the numerous model output calculations needed for estimating safety margins with appropriate confidence intervals. Account is given both to the uncertainties inherent in the plant model and to those introduced by the ANN regression models used for performing the repeated safety parameter evaluations. The proposed framework of analysis is first illustrated with reference to a simple analytical model and then applied to the estimation of the safety margin on the maximum fuel cladding temperature reached during a complete group distribution header blockage scenario in an RBMK-1500 nuclear reactor. The results are compared with those obtained by a traditional parametric approach.
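
    A minimal sketch of the bootstrapped-ANN idea, assuming a scikit-learn multilayer perceptron as the regression model and a synthetic stand-in for the plant code output; the network architecture, data and safety parameter are placeholders rather than those used in the paper.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      # Synthetic "plant code" runs: input x -> safety parameter y (with noise)
      x = rng.uniform(0.0, 1.0, size=(200, 1))
      y = 600.0 + 150.0 * np.sin(3.0 * x[:, 0]) + rng.normal(0.0, 10.0, size=200)

      # Bootstrap ensemble of ANN regressors, each trained on a resampled dataset
      n_boot, x_new = 50, np.array([[0.7]])
      predictions = []
      for _ in range(n_boot):
          idx = rng.integers(0, len(x), size=len(x))   # resample with replacement
          net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=3000, random_state=0)
          net.fit(x[idx], y[idx])
          predictions.append(net.predict(x_new)[0])

      predictions = np.array(predictions)
      lo, hi = np.percentile(predictions, [2.5, 97.5])
      print(f"estimate {predictions.mean():.1f}, 95% bootstrap interval [{lo:.1f}, {hi:.1f}]")

    The spread of the ensemble predictions reflects both the regression (ANN) uncertainty and the variability of the underlying model runs, which is the role the bootstrapped networks play in the safety-margin estimation described above.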

  20. Uncertainty in the availability of natural resources: Fossil fuels, critical metals and biomass

    International Nuclear Information System (INIS)

    Speirs, Jamie; McGlade, Christophe; Slade, Raphael

    2015-01-01

    Energy policies are strongly influenced by resource availability and recoverability estimates. Yet these estimates are often highly uncertain, frequently incommensurable, and regularly contested. This paper explores how the uncertainties surrounding estimates of the availability of fossil fuels, biomass and critical metals are conceptualised and communicated. The contention is that a better understanding of the uncertainties surrounding resource estimates for both conventional and renewable energy resources can contribute to more effective policy decision making in the long term. Two complementary approaches for framing uncertainty are considered in detail: a descriptive typology of uncertainties and a framework that conceptualises uncertainty as alternative states of incomplete knowledge. Both have the potential to be useful analytical and communication tools. For the three resource types considered here we find that data limitations, inconsistent definitions and the use of incommensurable methodologies present a pervasive problem that impedes comparison. Many aspects of resource uncertainty are also not commonly captured in the conventional resource classification schemes. This highlights the need for considerable care when developing and comparing aggregate resource estimates and when using these to inform strategic energy policy decisions. - Highlights: • Resource estimates are highly uncertain, frequently incommensurable, and regularly contested. • Data limitations need to be overcome, and methodologies harmonised and improved. • Sustainability and socio-political uncertainties are frequently neglected. • Uncertainties are dynamic, but reducing uncertainties inevitably involves trade-offs.

  1. Uncertainty in river discharge observations: a quantitative analysis

    Directory of Open Access Journals (Sweden)

    G. Di Baldassarre

    2009-06-01

    This study proposes a framework for analysing and quantifying the uncertainty of river flow data. Such uncertainty is often considered to be negligible with respect to other approximations affecting hydrological studies. In fact, given that river discharge data are usually obtained by means of the so-called rating curve method, a number of different sources of error affect the derived observations. These include: errors in the measurements of river stage and discharge utilised to parameterise the rating curve, interpolation and extrapolation error of the rating curve, the presence of unsteady flow conditions, and seasonal variations of the state of the vegetation (i.e. roughness). This study aims at analysing these sources of uncertainty using an original methodology. The novelty of the proposed framework lies in the estimation of rating curve uncertainty, which is based on hydraulic simulations. The latter are carried out on a reach of the Po River (Italy) by means of a one-dimensional (1-D) hydraulic model code (HEC-RAS). The results of the study show that errors in river flow data are indeed far from negligible.
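
    One way to see how rating-curve uncertainty propagates to discharge is to fit the usual power-law form Q = a(h - h0)^b to stage-discharge gaugings and sample the fitted parameter covariance. The sketch below does this with invented gaugings and is only schematic; it is not the hydraulically based (HEC-RAS) procedure of the study.

      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(1)

      def rating_curve(h, a, h0, b):
          """Power-law rating curve Q = a * (h - h0)**b."""
          return a * np.clip(h - h0, 1e-6, None) ** b

      # Hypothetical stage-discharge gaugings (stage in m, discharge in m^3/s)
      stage = np.array([1.2, 1.6, 2.1, 2.8, 3.5, 4.4, 5.0])
      discharge = rating_curve(stage, 35.0, 0.8, 1.6) * rng.normal(1.0, 0.05, stage.size)

      popt, pcov = curve_fit(rating_curve, stage, discharge, p0=[30.0, 0.5, 1.5])

      # Propagate parameter uncertainty by sampling from the fitted covariance
      samples = rng.multivariate_normal(popt, pcov, size=2000)
      h_query = 6.0   # extrapolation beyond the highest gauging
      q_samples = np.array([rating_curve(h_query, *p) for p in samples])
      print(f"Q({h_query} m): median {np.median(q_samples):.0f} m^3/s, "
            f"90% interval [{np.percentile(q_samples, 5):.0f}, "
            f"{np.percentile(q_samples, 95):.0f}]")

    The interval widens rapidly in the extrapolation range, which is one of the error sources the record lists for derived discharge observations.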

  2. Uncertainty Propagation in Monte Carlo Depletion Analysis

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Kim, Yeong-il; Park, Ho Jin; Joo, Han Gyu; Kim, Chang Hyo

    2008-01-01

    A new formulation is presented for quantifying the uncertainties of Monte Carlo (MC) tallies, such as keff and the microscopic reaction rates of nuclides and nuclide number densities in MC depletion analysis, and for examining their propagation behaviour as a function of depletion time step (DTS). It is shown that the variance of a given MC tally, used as a measure of its uncertainty in this formulation, arises from four sources: the statistical uncertainty of the MC tally, uncertainties of microscopic cross sections, uncertainties of nuclide number densities, and the cross correlations between them; the contribution of the latter three sources can be determined by computing the correlation coefficients between the uncertain variables. It is also shown that the variance of any given nuclide number density at the end of each DTS stems from the uncertainties of the nuclide number densities (NND) and microscopic reaction rates (MRR) of nuclides at the beginning of each DTS, and these contributions are determined by computing correlation coefficients between the two uncertain variables. To test the viability of the formulation, we conducted MC depletion analyses for two sample depletion problems involving a simplified 7x7 fuel assembly (FA) and a 17x17 PWR FA, determined the number densities of uranium and plutonium isotopes and their variances as well as k∞ and its variance as a function of DTS, and demonstrated the applicability of the new formulation for the uncertainty propagation analysis that needs to be followed in MC depletion computations. (authors)
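
    The correlation-coefficient bookkeeping described above can be mimicked with a toy first-order example: a nuclide number density at the end of a time step depends on an uncertain initial density and an uncertain reaction rate, and the covariance terms are estimated from sampled values. All numbers below are hypothetical and the decomposition is only loosely in the spirit of the paper's formulation.

      import numpy as np

      rng = np.random.default_rng(2)
      n_samples = 100_000

      # Uncertain inputs at the start of a depletion time step (hypothetical values)
      N0 = rng.normal(1.0e21, 2.0e19, n_samples)         # nuclide number density [1/cm^3]
      R = rng.normal(1.0e-9, 5.0e-11, n_samples)         # microscopic reaction rate [1/s]
      R += 0.5 * (N0 - N0.mean()) / N0.std() * R.std()   # induce correlation with N0

      dt = 3.0e6                                         # depletion time step [s]
      N1 = N0 * np.exp(-R * dt)                          # density at the end of the step

      # First-order variance decomposition:
      # Var(N1) ~ s0^2*Var(N0) + sR^2*Var(R) + 2*s0*sR*Cov(N0, R)
      s0 = np.exp(-R.mean() * dt)                        # dN1/dN0 at the mean point
      sR = -N0.mean() * dt * np.exp(-R.mean() * dt)      # dN1/dR at the mean point
      cov = np.cov(N0, R)
      var_lin = s0**2 * cov[0, 0] + sR**2 * cov[1, 1] + 2 * s0 * sR * cov[0, 1]
      print(f"sampled Var(N1) = {N1.var():.3e}, first-order estimate = {var_lin:.3e}")
      print(f"correlation(N0, R) = {np.corrcoef(N0, R)[0, 1]:.2f}")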

  3. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models

  4. Uncertainty in Analyzed Water and Energy Budgets at Continental Scales

    Science.gov (United States)

    Bosilovich, Michael G.; Robertson, F. R.; Mocko, D.; Chen, J.

    2011-01-01

    Operational analyses and retrospective analyses provide all the physical terms of water and energy budgets, guided by the assimilation of atmospheric observations. However, there is significant reliance on the numerical models, and so uncertainty in the budget terms is always present. Here, we use a recently developed data set consisting of a mix of 10 analyses (both operational and retrospective) to quantify the uncertainty of analyzed water and energy budget terms for GEWEX continental-scale regions, following the evaluation of Dr. John Roads using individual reanalysis data sets.

  5. Climate induced changes on the hydrology of Mediterranean basins - assessing uncertainties and quantifying risks

    Science.gov (United States)

    Ludwig, Ralf

    2014-05-01

    According to current climate projections, the Mediterranean area is at high risk of severe changes in the hydrological budget and extremes. With innovative scientific measures, integrated hydrological modeling and novel geophysical field monitoring techniques, the FP7 project CLIMB (Climate Induced Changes on the Hydrology of Mediterranean Basins; GA: 244151) assessed the impacts of climate change on the hydrology of seven basins in the Mediterranean area, in Italy, France, Turkey, Tunisia, Egypt and the Gaza Strip, and quantified uncertainties and risks for the main stakeholders of each test site. Intensive climate model auditing selected four regional climate models, whose data were bias-corrected and downscaled to serve as climate forcing for a set of hydrological models at each site. The results of the multi-model hydro-climatic ensemble and socio-economic factor analysis were applied to develop a risk model building upon spatial vulnerability and risk assessment. Findings generally reveal an increasing risk for water resources management in the test sites, yet at different rates and severity in the investigated sectors, with the highest impacts likely to occur in the transition months. The most important elements of this research include the following aspects: • Climate change contributes, yet with strong regional variation, to water scarcity in the Mediterranean; other factors, e.g. pollution or poor management practices, are regionally still the dominant pressures on water resources. • Rain-fed agriculture needs to adapt to seasonal changes; stable or increasing productivity likely depends on additional irrigation. • Tourism could benefit in shoulder seasons, but may expect income losses in the summer peak season due to increasing heat stress. • Local and regional water managers and water users lack, as yet, awareness of climate change induced risks; emerging focus areas are supplies of domestic drinking water, irrigation, hydropower and livestock. • Data

  6. Forecasting Uncertainty in Electricity Smart Meter Data by Boosting Additive Quantile Regression

    KAUST Repository

    Taieb, Souhaib Ben; Huser, Raphaël; Hyndman, Rob J.; Genton, Marc G.

    2016-01-01

    volatile and less predictable. There is a need within the energy industry for probabilistic forecasts of household electricity consumption to quantify the uncertainty of future electricity demand in order to undertake appropriate planning of generation
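
    The technique named in the title can be sketched with scikit-learn's gradient boosting under a quantile loss; this is only a stand-in for the boosted additive quantile regression of the paper, applied here to synthetic demand data rather than smart meter observations.

      import numpy as np
      from sklearn.ensemble import GradientBoostingRegressor

      rng = np.random.default_rng(3)

      # Synthetic household demand: evening peak plus heteroscedastic noise
      hour = rng.uniform(0.0, 24.0, 2000)
      demand = (0.4 + 0.3 * np.exp(-0.5 * ((hour - 18.5) / 2.0) ** 2)
                + rng.gamma(2.0, 0.05 + 0.05 * (hour > 16), size=hour.size))
      X = hour.reshape(-1, 1)

      # One boosted model per quantile gives a predictive interval, not just a point forecast
      quantile_models = {
          q: GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200,
                                       max_depth=3, learning_rate=0.05).fit(X, demand)
          for q in (0.1, 0.5, 0.9)
      }

      X_eval = np.array([[7.0], [18.0], [23.0]])   # morning, evening peak, late night
      for q, model in quantile_models.items():
          print(f"q={q:0.1f}:", np.round(model.predict(X_eval), 3))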

  7. Remarks on the assessment, representation, aggregation and utilization of expert opinion

    International Nuclear Information System (INIS)

    Fine, T.L.

    1980-04-01

    This report considers the relevance of recent ideas in the foundations of probability to the rational use of expert opinion in the design of a nuclear waste repository and the assessment of its performance. The main probability concepts introduced are those of modal ("probably A"), comparative ("A is at least as probable as B") and interval-valued (each event A is assigned a lower and an upper probability) probabilities. We then outline an approach that first uses comparative probability to model the results of binary elicitation of an expert's opinions concerning repository uncertainties, and then employs interval-valued probability to represent comparative probability in a computationally convenient form. We further consider the issue of aggregating or amalgamating the responses of several experts, and we emphasize the need to preserve some measure of the disagreements among the experts. The resulting aggregated interval-valued representation of the responses concerning the uncertainties surrounding the performance of a nuclear waste repository design can then be used to numerically assess this performance in a manner parallel to that of utility theory. Utility theory is the basis for statistical decision theory. Our recommendations can only be tentative, and research is recommended to gain some working experience with the results of the proposed decision-making process in the repository design context

  8. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyse them separately, item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model and represents the combined impact of all the uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.

  9. Uncertainty propagation analysis of an N2O emission model at the plot and landscape scale

    NARCIS (Netherlands)

    Nol, L.; Heuvelink, G.B.M.; Veldkamp, A.; Vries, de W.; Kros, J.

    2010-01-01

    Nitrous oxide (N2O) emission from agricultural land is an important component of the total annual greenhouse gas (GHG) budget. In addition, uncertainties associated with agricultural N2O emissions are large. The goals of this work were (i) to quantify the uncertainties of modelled N2O emissions

  10. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    Science.gov (United States)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach, which evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from
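
    A compressed sketch of the multi-model Monte Carlo idea: for each conceptual model, sample parameters and compute mass discharge as concentration x Darcy flux x source area, then combine the model-specific distributions. The two conceptual models, parameter distributions and plausibility weights below are invented for illustration and are not those of the study.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 20_000

      def mass_discharge(conc, darcy_flux, area):
          """Mass discharge J = C * q * A (here g/m^3 * m/yr * m^2 = g/yr)."""
          return conc * darcy_flux * area

      # Conceptual model 1: wide, dilute source plane (hypothetical priors)
      j1 = mass_discharge(rng.lognormal(np.log(0.5), 0.4, n),   # concentration, g/m^3
                          rng.lognormal(np.log(20.0), 0.3, n),  # Darcy flux, m/yr
                          rng.uniform(40.0, 80.0, n)) / 1000.0  # area, m^2 -> kg/yr

      # Conceptual model 2: narrow, concentrated source zone
      j2 = mass_discharge(rng.lognormal(np.log(5.0), 0.6, n),
                          rng.lognormal(np.log(15.0), 0.3, n),
                          rng.uniform(5.0, 15.0, n)) / 1000.0

      weights = {"model 1": 0.6, "model 2": 0.4}   # subjective plausibility weights
      combined = np.concatenate([rng.choice(j1, int(weights["model 1"] * n)),
                                 rng.choice(j2, int(weights["model 2"] * n))])

      for name, j in [("model 1", j1), ("model 2", j2), ("combined", combined)]:
          print(f"{name}: median {np.median(j):.2f} kg/yr, 90% interval "
                f"[{np.percentile(j, 5):.2f}, {np.percentile(j, 95):.2f}]")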

  11. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used a formal Bayesian uncertainty assessment method with two different likelihood functions. One was a time-series error model able to deal with the complicated statistical properties of hydrological model residuals. The second was a likelihood function for the flow quantiles directly. Due to the better data coverage and smaller hydrological complexity in one of our test catchments, we had better performance from the hydrological model and thus could observe that the relative importance of different uncertainty sources varied between sites, boundary conditions and flow indicators. The uncertainty of future climate was important, but not dominant. The deficiencies of the hydrological model were on the same scale, especially for the sites and flow components where model performance for the past observations was further from optimal (Nash-Sutcliffe index = 0.5 - 0.7). The overall uncertainty of predictions was well beyond the expected change signal even for the best performing site and flow indicator.

  12. The role of spatial aggregation in forensic entomology.

    Science.gov (United States)

    Fiene, Justin G; Sword, Gregory A; Van Laerhoven, Sherah L; Tarone, Aaron M

    2014-01-01

    A central concept in forensic entomology is that arthropod succession on carrion is predictable and can be used to estimate the postmortem interval (PMI) of human remains. However, most studies have reported significant variation in successional patterns, particularly among replicate carcasses, which has complicated estimates of PMIs. Several forensic entomology researchers have proposed that further integration of ecological and evolutionary theory in forensic entomology could help advance the application of succession data for producing PMI estimates. The purpose of this essay is to draw attention to the role of spatial aggregation of arthropods among carrion resources as a potentially important aspect to consider for understanding and predicting the assembly of arthropods on carrion over time. We review ecological literature related to spatial aggregation of arthropods among patchy and ephemeral resources, such as carrion, and when possible integrate these results with published forensic literature. We show that spatial aggregation of arthropods across resources is commonly reported and has been used to provide fundamental insight for understanding regional and local patterns of arthropod diversity and coexistence. Moreover, two suggestions are made for conducting future research. First, because intraspecific aggregation affects species frequency distributions across carcasses, data from replicate carcasses should not be combined, but rather statistically quantified to generate occurrence probabilities. Second, we identify a need for studies that tease apart the degree to which community assembly on carrion is spatially versus temporally structured, which will aid in developing mechanistic hypotheses on the ecological factors shaping community assembly on carcasses.

  13. [Quantitative studies on reversible thrombocyte aggregation during exertion].

    Science.gov (United States)

    Haber, P; Silberbauer, K; Sinzinger, H

    1980-10-11

    In 8 oarsmen aged 19 to 31 years, a symptom-limited rectangular-progressive bicycle stress test was conducted. Venous blood was taken before and at the end of the test, and 30 and 60 minutes afterwards. pH, base excess, pCO2, platelet count and platelet count ratio (Wu and Hoak) were measured or calculated, the last in order to quantify the tendency of the platelets to form reversible aggregates. At the point of exhaustion there is a highly significant decrease in the platelet count ratio (= increase in reversible platelet aggregates). A highly significant correlation exists between base excess and the platelet count ratio. The regression line does not fall below the normal value of the platelet count ratio until the delta base excess is -4 mval/l. This means that an increase in the tendency to form reversible platelet aggregates is not typical of the range of aerobic metabolism, but of muscular work in the anaerobic range with high exercise-induced metabolic acidosis. The basis for sudden death in sport due to internal causes is not uncommonly an unknown, asymptomatic coronary disease combined with platelet aggregates; persons aged over 30 years and sports in which competition is inherent (soccer, tennis) are often involved. Acute cardiac death in sport is not very frequent. Nevertheless, the following recommendation seems warranted: persons aged over 30 years in poor condition should not start competitive sports or other intensive muscular exercise; before they do so, low-intensity, controlled, aerobic endurance training is necessary.

  14. Robustness for slope stability modelling under deep uncertainty

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high as is often the case of when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  15. The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2013-09-01

    Aerosol–cloud interaction effects are a major source of uncertainty in climate models, so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN). Twenty-eight model parameters covering essentially all important aerosol processes, emissions and the representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally from about ±30% over some marine regions to ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud processing. The results lead to several recommendations for research that would result in improved modelling of cloud-active aerosol on a global scale.
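
    A toy version of the variance-based attribution, assuming a cheap emulator (here simply an explicit function standing in for one grid cell's CCN response) and estimating first-order sensitivity indices by Monte Carlo conditioning; the parameter names, ranges and response function are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 200_000

      # Stand-in "emulator" for CCN in one grid cell as a function of three
      # uncertain parameters (all names, ranges and the response are invented)
      def ccn_emulator(size, flux, nucl):
          return 300.0 / size + 0.8 * flux + 20.0 * np.sqrt(nucl) + 0.05 * flux * nucl

      size = rng.uniform(30.0, 120.0, n)   # emitted primary particle size, nm
      flux = rng.uniform(50.0, 150.0, n)   # emission flux, arbitrary units
      nucl = rng.uniform(0.1, 10.0, n)     # nucleation rate scaling, dimensionless
      y = ccn_emulator(size, flux, nucl)

      # First-order sensitivity index Var(E[Y | X_i]) / Var(Y), estimated by binning X_i
      def first_order_index(x, y, bins=50):
          edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
          which = np.digitize(x, edges[1:-1])
          cond_means = np.array([y[which == b].mean() for b in range(bins)])
          return cond_means.var() / y.var()

      for name, x in [("emitted size", size), ("emission flux", flux), ("nucleation", nucl)]:
          print(f"S1[{name}] = {first_order_index(x, y):.2f}")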

  16. Reducing the uncertainty in the fidelity of seismic imaging results

    Science.gov (United States)

    Zhou, H. W.; Zou, Z.

    2017-12-01

    A key aspect in geoscientific inversion is quantifying the quality of the results. In seismic imaging, we must quantify the uncertainty of every imaging result based on field data, because data noise and methodology limitations may produce artifacts. Detection of artifacts is therefore an important aspect in uncertainty quantification in geoscientific inversion. Quantifying the uncertainty of seismic imaging solutions means assessing their fidelity, which defines the truthfulness of the imaged targets in terms of their resolution, position error and artifact. Key challenges to achieving the fidelity of seismic imaging include: (1) Difficulty to tell signal from artifact and noise; (2) Limitations in signal-to-noise ratio and seismic illumination; and (3) The multi-scale nature of the data space and model space. Most seismic imaging studies of the Earth's crust and mantle have employed inversion or modeling approaches. Though they are in opposite directions of mapping between the data space and model space, both inversion and modeling seek the best model to minimize the misfit in the data space, which unfortunately is not the output space. The fact that the selection and uncertainty of the output model are not judged in the output space has exacerbated the nonuniqueness problem for inversion and modeling. In contrast, the practice in exploration seismology has long established a two-fold approach of seismic imaging: Using velocity modeling building to establish the long-wavelength reference velocity models, and using seismic migration to map the short-wavelength reflectivity structures. Most interestingly, seismic migration maps the data into an output space called imaging space, where the output reflection images of the subsurface are formed based on an imaging condition. A good example is the reverse time migration, which seeks the reflectivity image as the best fit in the image space between the extrapolation of time-reversed waveform data and the prediction

  17. Dissertation Defense Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics
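
    The "three separate grids and solutions" step is commonly handled with Richardson extrapolation and a grid convergence index (GCI); the sketch below uses that standard recipe with made-up solution values and a refinement ratio of 2, and should not be read as the dissertation's exact procedure.

      import numpy as np

      # Hypothetical peak airflow speeds (m/s) from fine, medium and coarse grids
      f1, f2, f3 = 9.82, 10.10, 10.71   # fine -> coarse
      r = 2.0                           # constant grid refinement ratio
      Fs = 1.25                         # safety factor for a three-grid study

      # Observed order of accuracy from the three solutions (assumes monotone convergence)
      p = np.log((f3 - f2) / (f2 - f1)) / np.log(r)

      # Richardson-extrapolated value and grid convergence index on the fine grid
      f_exact = f1 + (f1 - f2) / (r ** p - 1.0)
      gci_fine = Fs * abs((f2 - f1) / f1) / (r ** p - 1.0)

      print(f"observed order p = {p:.2f}")
      print(f"extrapolated value = {f_exact:.2f} m/s")
      print(f"GCI (fine grid) = {100 * gci_fine:.1f}% -> error bar of about "
            f"±{gci_fine * f1:.2f} m/s")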

  18. Dissertation Defense: Computational Fluid Dynamics Uncertainty Analysis for Payload Fairing Spacecraft Environmental Control Systems

    Science.gov (United States)

    Groves, Curtis Edward

    2014-01-01

    Spacecraft thermal protection systems are at risk of being damaged due to airflow produced from Environmental Control Systems. There are inherent uncertainties and errors associated with using Computational Fluid Dynamics to predict the airflow field around a spacecraft from the Environmental Control System. This paper describes an approach to quantify the uncertainty in using Computational Fluid Dynamics to predict airflow speeds around an encapsulated spacecraft without the use of test data. Quantifying the uncertainty in analytical predictions is imperative to the success of any simulation-based product. The method could provide an alternative to the traditional "validation by test only" mentality. This method could be extended to other disciplines and has potential to provide uncertainty for any numerical simulation, thus lowering the cost of performing these verifications while increasing the confidence in those predictions. Spacecraft requirements can include a maximum airflow speed to protect delicate instruments during ground processing. Computational Fluid Dynamics can be used to verify these requirements; however, the model must be validated by test data. This research includes the following three objectives and methods. Objective one is to develop, model, and perform a Computational Fluid Dynamics analysis of three (3) generic, non-proprietary, environmental control systems and spacecraft configurations. Several commercially available and open source solvers have the capability to model the turbulent, highly three-dimensional, incompressible flow regime. The proposed method uses FLUENT, STARCCM+, and OPENFOAM. Objective two is to perform an uncertainty analysis of the Computational Fluid Dynamics model using the methodology found in "Comprehensive Approach to Verification and Validation of Computational Fluid Dynamics Simulations". This method requires three separate grids and solutions, which quantify the error bars around Computational Fluid Dynamics predictions

  19. The evaluation of meta-analysis techniques for quantifying prescribed fire effects on fuel loadings.

    Science.gov (United States)

    Karen E. Kopper; Donald McKenzie; David L. Peterson

    2009-01-01

    Models and effect-size metrics for meta-analysis were compared in four separate meta-analyses quantifying surface fuels after prescribed fires in ponderosa pine (Pinus ponderosa Dougl. ex Laws.) forests of the Western United States. An aggregated data set was compiled from eight published reports that contained data from 65 fire treatment units....

  20. A statistical approach to quantify uncertainty in carbon monoxide measurements at the Izaña global GAW station: 2008–2011

    Directory of Open Access Journals (Sweden)

    A. J. Gomez-Pelaez

    2013-03-01

    Atmospheric CO in situ measurements are carried out at the Izaña (Tenerife) global GAW (Global Atmosphere Watch Programme of the World Meteorological Organization, WMO) mountain station using a Reduction Gas Analyser (RGA). In situ measurements at Izaña are representative of the subtropical Northeast Atlantic free troposphere, especially during nighttime. We present the measurement system configuration, the response function, the calibration scheme, the data processing, the Izaña 2008–2011 CO nocturnal time series, and the mean diurnal cycle by month. We have developed a rigorous uncertainty analysis for the carbon monoxide measurements carried out at the Izaña station, which could be applied to other GAW stations. We determine the combined standard measurement uncertainty taking into consideration four contributing components: the uncertainty of the WMO standard gases interpolated over the range of measurement, the uncertainty that takes into account the agreement between the standard gases and the response function used, the uncertainty due to the repeatability of the injections, and the propagated uncertainty related to the temporal consistency of the response function parameters (which also takes into account the covariance between the parameters). The mean value of the combined standard uncertainty decreased significantly after March 2009, from 2.37 nmol mol−1 to 1.66 nmol mol−1, due to improvements in the measurement system. A fifth type of uncertainty, which we call representation uncertainty, is considered when some of the data necessary to compute the temporal mean are absent. Any computed mean also has a propagated uncertainty arising from the uncertainties of the data used to compute the mean; the law of propagation depends on the type of uncertainty component (random or systematic). In situ hourly means are compared with simultaneous and collocated NOAA flask samples. The uncertainty of the differences is computed and used to determine
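
    The quadrature combination of the four uncertainty components, and the different behaviour of random and systematic contributions when averaging, can be sketched as follows; the component magnitudes are placeholders, not the station's actual uncertainty budget.

      import numpy as np

      # Hypothetical standard-uncertainty components for one injection (nmol/mol)
      u_scale = 1.0        # WMO standard gases interpolated over the measurement range
      u_fit = 0.8          # agreement between the standards and the response function
      u_repeat = 0.9       # repeatability of the injections (random)
      u_consistency = 0.6  # temporal consistency of the response-function parameters

      # Combined standard uncertainty of a single measurement (components in quadrature)
      u_single = np.sqrt(u_scale**2 + u_fit**2 + u_repeat**2 + u_consistency**2)
      print(f"combined standard uncertainty: {u_single:.2f} nmol/mol")

      # Propagation to an hourly mean of n injections: the random component averages
      # down with sqrt(n), the systematic components do not (a simplifying assumption)
      n = 4
      u_random = u_repeat / np.sqrt(n)
      u_systematic = np.sqrt(u_scale**2 + u_fit**2 + u_consistency**2)
      u_mean = np.sqrt(u_random**2 + u_systematic**2)
      print(f"uncertainty of the hourly mean: {u_mean:.2f} nmol/mol")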

  1. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    ) plant transient in which the void feedback mechanism plays an important role. In all three cases, it has been shown that a more detailed, realistic and accurate representation of output uncertainty can be achieved with the proposed methodology than is possible based on an 'expert-opinion' approach. Moreover, the importance of state space partitioning has been clearly brought out by comparing results with those obtained assuming a single pdf for the entire database. The analysis of the Omega integral test has demonstrated that the drift-flux model's uncertainty remains important even while introducing other representative uncertainties. The developed methodology retains its advantageous features well during consideration of different uncertainty sources. The Peach Bottom turbine trip study represents a valuable demonstration of the applicability of the developed methodology to NPP transient analysis. In this application, the novel density estimator was also employed for estimating the pdf that underlies the uncertainty of the maximum power during the transient. The results obtained have been found to provide more detailed insights than can be gained from the 'classical' approach. Another feature of the turbine trip analysis has been a qualitative study of the impact of possible neutronics cross-section uncertainties on the power calculation. Besides the important influence of the uncertainty in void fraction predictions on the accuracy of the coupled transient's simulation, the uncertainties in neutronics parameters and models can be crucial as well. This points to the need to quantify uncertainties in neutronics calculations and to aggregate them with those assessed for the thermal-hydraulic phenomena for the simulation of such multi-physics transients.

  2. Efficiency of analytical and sampling-based uncertainty propagation in intensity-modulated proton therapy

    Science.gov (United States)

    Wahl, N.; Hennig, P.; Wieser, H. P.; Bangert, M.

    2017-07-01

    The sensitivity of intensity-modulated proton therapy (IMPT) treatment plans to uncertainties can be quantified and mitigated with robust/min-max and stochastic/probabilistic treatment analysis and optimization techniques. Those methods usually rely on sparse random, importance, or worst-case sampling. Inevitably, this imposes a trade-off between computational speed and accuracy of the uncertainty propagation. Here, we investigate analytical probabilistic modeling (APM) as an alternative for uncertainty propagation and minimization in IMPT that does not rely on scenario sampling. APM propagates probability distributions over range and setup uncertainties via a Gaussian pencil-beam approximation into moments of the probability distributions over the resulting dose in closed form. It supports arbitrary correlation models and allows for efficient incorporation of fractionation effects regarding random and systematic errors. We evaluate the trade-off between run-time and accuracy of APM uncertainty computations on three patient datasets. Results are compared against reference computations facilitating importance and random sampling. Two approximation techniques to accelerate uncertainty propagation and minimization based on probabilistic treatment plan optimization are presented. Runtimes are measured on CPU and GPU platforms, dosimetric accuracy is quantified in comparison to a sampling-based benchmark (5000 random samples). APM accurately propagates range and setup uncertainties into dose uncertainties at competitive run-times (GPU ≤ 5 min). The resulting standard deviation (expectation value) of the dose shows average global γ (3%/3 mm) pass rates between 94.2% and 99.9% (98.4% and 100.0%). All investigated importance sampling strategies provided less accuracy at higher run-times considering only a single fraction. Considering fractionation, APM uncertainty propagation and treatment plan optimization was proven to be possible at constant time complexity
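
    The central trade-off described above, closed-form moment propagation versus scenario sampling, can be illustrated with a toy linear(ised) dose model; the dose-influence matrix, error magnitudes, and tolerances below are arbitrary stand-ins, not APM's actual pencil-beam formulation.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical dose-influence matrix (voxels x error modes) and Gaussian moments
      # of the setup/range errors.
      D = rng.normal(size=(50, 4))
      mu, cov = np.zeros(4), np.diag([1.0, 1.0, 2.0, 2.0]) ** 2

      # Analytical propagation: mean and covariance of the dose in closed form.
      mean_analytical = D @ mu
      cov_analytical = D @ cov @ D.T

      # Sampling benchmark: draw error scenarios and recompute the dose for each.
      samples = rng.multivariate_normal(mu, cov, size=5000) @ D.T
      print(np.allclose(mean_analytical, samples.mean(axis=0), atol=0.2),
            np.allclose(np.sqrt(np.diag(cov_analytical)), samples.std(axis=0), rtol=0.1))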

  3. Uncertainty related to Environmental Data and Estimated Extreme Events

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertainties...... including those corresponding to extreme estimates typically used for design purposes. Basically a design condition is made up of a set of parameter values stemming from several environmental parameters. To be able to evaluate the uncertainty related to design states one must know the corresponding joint....... Consequently this report deals mainly with each parameter separately. Multi parameter problems are briefly discussed in section 9. It is important to notice that the quantified uncertainties reported in section 7.7 represent what might be regarded as typical figures to be used only when no more qualified...

  4. In pursuit of a fit-for-purpose uncertainty guide

    Science.gov (United States)

    White, D. R.

    2016-08-01

    Measurement uncertainty is a measure of the quality of a measurement; it enables users of measurements to manage the risks and costs associated with decisions influenced by measurements, and it supports metrological traceability by quantifying the proximity of measurement results to true SI values. The Guide to the Expression of Uncertainty in Measurement (GUM) ensures uncertainty statements meet these purposes and encourages the world-wide harmony of measurement uncertainty practice. Although the GUM is an extraordinarily successful document, it has flaws, and a revision has been proposed. Like the already-published supplements to the GUM, the proposed revision employs objective Bayesian statistics instead of frequentist statistics. This paper argues that the move away from a frequentist treatment of measurement error to a Bayesian treatment of states of knowledge is misguided. The move entails changes in measurement philosophy, a change in the meaning of probability, and a change in the object of uncertainty analysis, all leading to different numerical results, increased costs, increased confusion, a loss of trust, and, most significantly, a loss of harmony with current practice. Recommendations are given for a revision in harmony with the current GUM and allowing all forms of statistical inference.

  5. Risk Assessment Uncertainties in Cybersecurity Investments

    Directory of Open Access Journals (Sweden)

    Andrew Fielder

    2018-06-01

    Full Text Available When undertaking cybersecurity risk assessments, it is important to be able to assign numeric values to metrics to compute the final expected loss that represents the risk that an organization is exposed to due to cyber threats. Even if risk assessment is motivated by real-world observations and data, there is always a high chance of assigning inaccurate values due to different uncertainties involved (e.g., evolving threat landscape, human errors and the natural difficulty of quantifying risk). Existing models empower organizations to compute optimal cybersecurity strategies given their financial constraints, i.e., available cybersecurity budget. Further, a general game-theoretic model with uncertain payoffs (probability-distribution-valued payoffs) shows that such uncertainty can be incorporated in the game-theoretic model by allowing payoffs to be random. This paper extends previous work in the field to tackle uncertainties in risk assessment that affect cybersecurity investments. The findings from simulated examples indicate that although uncertainties in cybersecurity risk assessment lead, on average, to different cybersecurity strategies, they do not play a significant role in the final expected loss of the organization when utilising a game-theoretic model and methodology to derive these strategies. The model determines robust defending strategies even when knowledge regarding risk assessment values is not accurate. As a result, it is possible to show that the cybersecurity investments’ tool is capable of providing effective decision support.
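
    One way to picture the "uncertain payoffs" idea above is to treat each threat's loss as a random draw rather than a fixed number and to compare candidate defence strategies by expected residual loss; every figure and name in this sketch is hypothetical, and it is far simpler than the game-theoretic model in the paper.

      import numpy as np

      rng = np.random.default_rng(42)

      threat_probability = np.array([0.30, 0.10, 0.05])             # chance each threat materialises
      # Distribution-valued (lognormal) losses per threat instead of single point values.
      loss_draws = rng.lognormal(mean=[10.0, 11.5, 12.5], sigma=0.6, size=(10_000, 3))
      mitigation = {
          "strategy_A": np.array([0.8, 0.2, 0.1]),                  # fraction of each loss removed
          "strategy_B": np.array([0.4, 0.6, 0.5]),
      }

      for name, m in mitigation.items():
          expected_loss = (loss_draws * (1 - m) * threat_probability).mean(axis=0).sum()
          print(name, f"expected residual loss ~ {expected_loss:,.0f}")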

  6. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict

  7. Non-parametric data-based approach for the quantification and communication of uncertainties in river flood forecasts

    Science.gov (United States)

    Van Steenbergen, N.; Willems, P.

    2012-04-01

    Reliable flood forecasts are the most important non-structural measures to reduce the impact of floods. However flood forecasting systems are subject to uncertainty originating from the input data, model structure and model parameters of the different hydraulic and hydrological submodels. To quantify this uncertainty a non-parametric data-based approach has been developed. This approach analyses the historical forecast residuals (differences between the predictions and the observations at river gauging stations) without using a predefined statistical error distribution. Because the residuals are correlated with the value of the forecasted water level and the lead time, the residuals are split up into discrete classes of simulated water levels and lead times. For each class, percentile values of the model residuals are calculated and stored in a 'three-dimensional error' matrix. By 3D interpolation in this error matrix, the uncertainty in new forecasted water levels can be quantified. In addition to the quantification of the uncertainty, the communication of this uncertainty is equally important. The communication has to be done in a consistent way, reducing the chance of misinterpretation. Also, the communication needs to be adapted to the audience; the majority of the general public is not interested in in-depth information on the uncertainty of the predicted water levels, but only in the likelihood of exceedance of certain alarm levels. Water managers need more information, e.g. time-dependent uncertainty information, because they rely on this information to undertake the appropriate flood mitigation action. There are various ways of presenting uncertainty information (numerical, linguistic, graphical, time (in)dependent, etc.), each with advantages and disadvantages for a specific audience. A useful method to communicate uncertainty of flood forecasts is by probabilistic flood mapping. These maps give a representation of the
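
    A compact sketch of the residual-classing step described above (bin edges, percentile levels, and array shapes are illustrative choices, not those of the operational system):

      import numpy as np

      def build_error_matrix(forecasts, observations, lead_times,
                             level_bins, lead_bins, percentiles=(5, 25, 50, 75, 95)):
          # Percentiles of historical forecast residuals, classed by forecasted level and lead time.
          residuals = forecasts - observations
          level_idx = np.digitize(forecasts, level_bins)
          lead_idx = np.digitize(lead_times, lead_bins)
          matrix = np.full((len(level_bins) + 1, len(lead_bins) + 1, len(percentiles)), np.nan)
          for i in range(len(level_bins) + 1):
              for j in range(len(lead_bins) + 1):
                  cls = residuals[(level_idx == i) & (lead_idx == j)]
                  if cls.size:
                      matrix[i, j, :] = np.percentile(cls, percentiles)
          return matrix   # interpolate in this matrix to attach uncertainty bands to new forecasts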

  8. Stakeholder attitudes towards cumulative and aggregate exposure assessment of pesticides.

    Science.gov (United States)

    Verbeke, Wim; Van Loo, Ellen J; Vanhonacker, Filiep; Delcour, Ilse; Spanoghe, Pieter; van Klaveren, Jacob D

    2015-05-01

    This study evaluates the attitudes and perspectives of different stakeholder groups (agricultural producers, pesticide manufacturers, trading companies, retailers, regulators, food safety authorities, scientists and NGOs) towards the concepts of cumulative and aggregate exposure assessment of pesticides by means of qualitative in-depth interviews (n = 15) and a quantitative stakeholder survey (n = 65). The stakeholders involved generally agreed that the use of chemical pesticides is needed, primarily for meeting the need of feeding the growing world population, while clearly acknowledging the problematic nature of human exposure to pesticide residues. Current monitoring was generally perceived to be adequate, but the timeliness and consistency of monitoring practices across countries were questioned. The concept of cumulative exposure assessment was better understood by stakeholders than the concept of aggregate exposure assessment. Identified pitfalls were data availability, data limitations, sources and ways of dealing with uncertainties, as well as information and training needs. Regulators and food safety authorities were perceived as the stakeholder groups for whom cumulative and aggregate pesticide exposure assessment methods and tools would be most useful and acceptable. Insights obtained from this exploratory study have been integrated in the development of targeted and stakeholder-tailored dissemination and training programmes that were implemented within the EU-FP7 project ACROPOLIS. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    Full Text Available This article presents original probabilistic price forecasting meta-models (PPFMCP models), by aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be directly obtained from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results from PPFMCP models showed that PPFMCP model 2, which uses aggregation by weight values according to daily ranks of predictors, was the best probabilistic meta-model from a point of view of mean absolute errors, as well as of RI and LI. PPFMCP model 1, which uses the averaging of predictor forecasts, was the second best meta-model. PPFMCP models allow evaluations of risk decisions based on the price to be made.
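
    The Beta-from-moments step described above is essentially a method-of-moments fit; a minimal sketch (assuming the hourly price has already been rescaled to the open unit interval, and using made-up ensemble moments):

      def beta_parameters(mean, variance):
          # Method-of-moments Beta(a, b) fit; only valid when variance < mean * (1 - mean).
          if not 0.0 < mean < 1.0 or variance >= mean * (1.0 - mean):
              raise ValueError("ensemble moments incompatible with a Beta distribution")
          common = mean * (1.0 - mean) / variance - 1.0
          return mean * common, (1.0 - mean) * common

      # Hypothetical ensemble mean and variance of a rescaled hourly price.
      a, b = beta_parameters(mean=0.42, variance=0.01)
      print(f"Beta({a:.2f}, {b:.2f})")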

  10. Parameters for assessing recycled aggregate and their correlation.

    Science.gov (United States)

    Tam, Vivian W Y; Tam, C M

    2009-02-01

    Construction and demolition (C&D) waste has consumed a large portion of the landfill areas in Hong Kong. Among them, concrete occupies more than 70% of the total C&D waste by volume. Thus it is necessary to recycle concrete waste to preserve landfill areas. Various governmental departments of the Hong Kong Special Administrative Region (HKSAR) are encouraging the use of recycled aggregate (RA) in the Hong Kong construction industry by issuing various guidelines and specifications. Owing to uncertainty in their properties, however, practitioners are sceptical in using it as a substitute. In this study, an attempt has been made to look at relations among six main parameters that describe the behaviour of RA: (1) particle size distribution; (2) particle density; (3) porosity and absorption; (4) particle shape; (5) strength and toughness; and (6) chloride and sulphate contents. RA samples were obtained from nine demolition sites with service lives ranging from 10 to 40 years and another set of samples was collected from the Tuen Mun Area 38 recycling plant. The behaviour of these samples was compared with that of normal aggregate samples. This study revealed that there is a strong correlation among various parameters, and by measuring three of them: either 'particle density' or 'porosity and absorption' or 'particle shape', and 'strength and toughness', and 'chloride and sulphate contents', it is possible to assess the behaviour of RA. This can significantly help by reducing RA testing time and cost before using it as recycled aggregate concrete.

  11. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the model state and parameter distributions, providing a general framework to quantify, and eventually reduce with the data, the uncertainty in the estimated reservoir state and parameters.

  12. Using Statistical Downscaling to Quantify the GCM-Related Uncertainty in Regional Climate Change Scenarios: A Case Study of Swedish Precipitation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    There are a number of sources of uncertainty in regional climate change scenarios. When statistical downscaling is used to obtain regional climate change scenarios, the uncertainty may originate from the uncertainties in the global climate models used, the skill of the statistical model, and the forcing scenarios applied to the global climate model. The uncertainty associated with global climate models can be evaluated by examining the differences in the predictors and in the downscaled climate change scenarios based on a set of different global climate models. When standardized global climate model simulations such as the second phase of the Coupled Model Intercomparison Project (CMIP2) are used, the difference in the downscaled variables mainly reflects differences in the climate models and the natural variability in the simulated climates. It is proposed that the spread of the estimates can be taken as a measure of the uncertainty associated with global climate models. The proposed method is applied to the estimation of global-climate-model-related uncertainty in regional precipitation change scenarios in Sweden. Results from statistical downscaling based on 17 global climate models show that there is an overall increase in annual precipitation all over Sweden although a considerable spread of the changes in the precipitation exists. The general increase can be attributed to the increased large-scale precipitation and the enhanced westerly wind. The estimated uncertainty is nearly independent of region. However, there is a seasonal dependence. The estimates for winter show the highest level of confidence, while the estimates for summer show the least.

  13. Quantifying the uncertainties of advection and boundary layer dynamics on the diurnal carbon dioxide budget

    NARCIS (Netherlands)

    Pino, D.; Kaikkonen, J.P.; Vilà-Guerau de Arellano, J.

    2013-01-01

    [1] We investigate the uncertainties in the carbon dioxide (CO2) mixing ratio and inferred surface flux associated with boundary layer processes and advection by using mixed-layer theory. By extending the previous analysis presented by Pino et al. (2012), new analytical expressions are derived to

  14. A new system to quantify uncertainties in LEO satellite position determination due to space weather events

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a new system for quantitative assessment of uncertainties in LEO satellite position caused by storm time changes in space environmental...

  15. Emulation of a complex global aerosol model to quantify sensitivity to uncertain parameters

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2011-12-01

    Full Text Available Sensitivity analysis of atmospheric models is necessary to identify the processes that lead to uncertainty in model predictions, to help understand model diversity through comparison of driving processes, and to prioritise research. Assessing the effect of parameter uncertainty in complex models is challenging and often limited by CPU constraints. Here we present a cost-effective application of variance-based sensitivity analysis to quantify the sensitivity of a 3-D global aerosol model to uncertain parameters. A Gaussian process emulator is used to estimate the model output across multi-dimensional parameter space, using information from a small number of model runs at points chosen using a Latin hypercube space-filling design. Gaussian process emulation is a Bayesian approach that uses information from the model runs along with some prior assumptions about the model behaviour to predict model output everywhere in the uncertainty space. We use the Gaussian process emulator to calculate the percentage of expected output variance explained by uncertainty in global aerosol model parameters and their interactions. To demonstrate the technique, we show examples of cloud condensation nuclei (CCN) sensitivity to 8 model parameters in polluted and remote marine environments as a function of altitude. In the polluted environment 95 % of the variance of CCN concentration is described by uncertainty in the 8 parameters (excluding their interaction effects) and is dominated by the uncertainty in the sulphur emissions, which explains 80 % of the variance. However, in the remote region parameter interaction effects become important, accounting for up to 40 % of the total variance. Some parameters are shown to have a negligible individual effect but a substantial interaction effect. Such sensitivities would not be detected in the commonly used single parameter perturbation experiments, which would therefore underpredict total uncertainty. Gaussian process
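
    As an illustrative sketch of the emulator-plus-variance-decomposition idea (a toy three-parameter function stands in for the aerosol model, and first-order contributions are estimated by a crude binned conditional-mean calculation rather than a full Sobol analysis):

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, ConstantKernel

      def toy_model(x):
          # Cheap nonlinear stand-in for the expensive aerosol model.
          return np.sin(x[:, 0]) + 0.5 * x[:, 1] ** 2 + 0.1 * x[:, 0] * x[:, 2]

      rng = np.random.default_rng(0)
      X_train = rng.uniform(-1, 1, size=(60, 3))        # small space-filling-style design
      gp = GaussianProcessRegressor(ConstantKernel() * RBF([1.0] * 3), normalize_y=True)
      gp.fit(X_train, toy_model(X_train))

      # Emulate densely, then approximate Var(E[Y | x_i]) / Var(Y) for each parameter.
      X = rng.uniform(-1, 1, size=(20_000, 3))
      y = gp.predict(X)
      for i in range(3):
          bins = np.digitize(X[:, i], np.linspace(-1, 1, 21))
          cond_means = np.array([y[bins == b].mean() for b in np.unique(bins)])
          print(f"parameter {i}: ~{cond_means.var() / y.var():.0%} of output variance")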

  16. A Unified Approach for Reporting ARM Measurement Uncertainties Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Campos, E [Argonne National Lab. (ANL), Argonne, IL (United States); Sisterson, Douglas [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-12-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility is observationally based, and quantifying the uncertainty of its measurements is critically important. With over 300 widely differing instruments providing over 2,500 datastreams, concise expression of measurement uncertainty is quite challenging. The ARM Facility currently provides data and supporting metadata (information about the data or data quality) to its users through a number of sources. Because the continued success of the ARM Facility depends on the known quality of its measurements, the Facility relies on instrument mentors and the ARM Data Quality Office (DQO) to ensure, assess, and report measurement quality. Therefore, an easily accessible, well-articulated estimate of ARM measurement uncertainty is needed. Note that some of the instrument observations require mathematical algorithms (retrievals) to convert a measured engineering variable into a useful geophysical measurement. While those types of retrieval measurements are identified, this study does not address particular methods for retrieval uncertainty. As well, the ARM Facility also provides engineered data products, or value-added products (VAPs), based on multiple instrument measurements. This study does not include uncertainty estimates for those data products. We propose here that a total measurement uncertainty should be calculated as a function of the instrument uncertainty (calibration factors), the field uncertainty (environmental factors), and the retrieval uncertainty (algorithm factors). The study will not expand on methods for computing these uncertainties. Instead, it will focus on the practical identification, characterization, and inventory of the measurement uncertainties already available in the ARM community through the ARM instrument mentors and their ARM instrument handbooks. As a result, this study will address the first steps towards reporting ARM measurement uncertainty

  17. Uncertainty and inference in the world of paleoecological data

    Science.gov (United States)

    McLachlan, J. S.; Dawson, A.; Dietze, M.; Finley, M.; Hooten, M.; Itter, M.; Jackson, S. T.; Marlon, J. R.; Raiho, A.; Tipton, J.; Williams, J.

    2017-12-01

    Proxy data in paleoecology and paleoclimatology share a common set of biases and uncertainties: spatiotemporal error associated with the taphonomic processes of deposition, preservation, and dating; calibration error between proxy data and the ecosystem states of interest; and error in the interpolation of calibrated estimates across space and time. Researchers often account for this daunting suite of challenges by applying qualitative expert judgment: inferring the past states of ecosystems and assessing the level of uncertainty in those states subjectively. The effectiveness of this approach can be seen by the extent to which future observations confirm previous assertions. Hierarchical Bayesian (HB) statistical approaches allow an alternative approach to accounting for multiple uncertainties in paleo data. HB estimates of ecosystem state formally account for each of the common uncertainties listed above. HB approaches can readily incorporate additional data, and data of different types, into estimates of ecosystem state. And HB estimates of ecosystem state, with associated uncertainty, can be used to constrain forecasts of ecosystem dynamics based on mechanistic ecosystem models using data assimilation. Decisions about how to structure an HB model are also subjective, which creates a parallel framework for deciding how to interpret data from the deep past. Our group, the Paleoecological Observatory Network (PalEON), has applied hierarchical Bayesian statistics to formally account for uncertainties in proxy based estimates of past climate, fire, primary productivity, biomass, and vegetation composition. Our estimates often reveal new patterns of past ecosystem change, which is an unambiguously good thing, but we also often estimate a level of uncertainty that is uncomfortably high for many researchers. High levels of uncertainty are due to several features of the HB approach: spatiotemporal smoothing, the formal aggregation of multiple types of uncertainty, and a

  18. Male reproductive competition in spawning aggregations of cod ( Gadus morhua , L.)

    DEFF Research Database (Denmark)

    Bekkevold, Dorte; Hansen, Michael Møller; Loeschcke, V.

    2002-01-01

    Reproductive competition may lead to a large skew in reproductive success among individuals. Very few studies have analysed the paternity contribution of individual males in spawning aggregations of fish species with huge census population sizes. We quantified the variance in male reproductive...... success in spawning aggregations of cod under experimental conditions over an entire spawning season. Male reproductive success was estimated by microsatellite-based parentage analysis of offspring produced in six separate groups of spawning cod. In total, 1340 offspring and 102 spawnings distributed...... across a spawning season were analysed. Our results show that multiple males contributed sperm to most spawnings but that paternity frequencies were highly skewed among males, with larger males on average siring higher proportions of offspring. It was further indicated that male reproductive success...

  19. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

    Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed the Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of selective-to-total extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
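
    A minimal sketch of the Monte Carlo propagation idea described above (the abundance function, flux values, and clipping of unphysical draws are placeholders, not NEAT's actual prescription):

      import numpy as np

      def monte_carlo_abundance(flux, flux_err, derive_abundance, n_draws=10_000, seed=1):
          # Resample the measured line flux within its uncertainty, push each draw through
          # the abundance derivation, then summarise the resulting distribution.
          rng = np.random.default_rng(seed)
          draws = rng.normal(flux, flux_err, size=n_draws)
          draws = draws[draws > 0]                      # discard unphysical negative fluxes
          samples = derive_abundance(draws)
          return np.median(samples), np.percentile(samples, [16, 84])

      # Placeholder abundance relation for a hypothetical weak line.
      median, (lo, hi) = monte_carlo_abundance(
          flux=2.0, flux_err=0.6, derive_abundance=lambda f: 7.5 + np.log10(f))
      print(f"abundance = {median:.2f} (+{hi - median:.2f} / -{median - lo:.2f})")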

  20. Quantifying scaling effects on satellite-derived forest area estimates for the conterminous USA

    Science.gov (United States)

    Daolan Zheng; L.S. Heath; M.J. Ducey; J.E. Smith

    2009-01-01

    We quantified the scaling effects on forest area estimates for the conterminous USA using regression analysis and the National Land Cover Dataset 30m satellite-derived maps in 2001 and 1992. The original data were aggregated to: (1) broad cover types (forest vs. non-forest); and (2) coarser resolutions (1km and 10 km). Standard errors of the model estimates were 2.3%...

  1. The known unknowns: neural representation of second-order uncertainty, and ambiguity

    Science.gov (United States)

    Bach, Dominik R.; Hulme, Oliver; Penny, William D.; Dolan, Raymond J.

    2011-01-01

    Predictions provided by action-outcome probabilities entail a degree of (first-order) uncertainty. However, these probabilities themselves can be imprecise and embody second-order uncertainty. Tracking second-order uncertainty is important for optimal decision making and reinforcement learning. Previous functional magnetic resonance imaging investigations of second-order uncertainty in humans have drawn on an economic concept of ambiguity, where action-outcome associations in a gamble are either known (unambiguous) or completely unknown (ambiguous). Here, we relaxed the constraints associated with a purely categorical concept of ambiguity and varied the second-order uncertainty of gambles continuously, quantified as entropy over second-order probabilities. We show that second-order uncertainty influences decisions in a pessimistic way by biasing second-order probabilities, and that second-order uncertainty is negatively correlated with posterior cingulate cortex activity. The category of ambiguous (compared to non-ambiguous) gambles also biased choice in a similar direction, but was associated with distinct activation of a posterior parietal cortical area; an activation that we show reflects a different computational mechanism. Our findings indicate that behavioural and neural responses to second-order uncertainty are distinct from those associated with ambiguity and may call for a reappraisal of previous data. PMID:21451019
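
    For concreteness, second-order uncertainty "quantified as entropy over second-order probabilities" can be sketched as the differential entropy of a density over the possible outcome probabilities; the two example beliefs below are made up.

      import numpy as np

      def second_order_entropy(density, grid):
          # Differential entropy of a (normalised) density over first-order probabilities p.
          w = density / np.trapz(density, grid)
          integrand = np.where(w > 0, -w * np.log(w), 0.0)
          return np.trapz(integrand, grid)

      grid = np.linspace(0.01, 0.99, 99)
      sharp = np.exp(-0.5 * ((grid - 0.5) / 0.05) ** 2)   # confident belief about p
      vague = np.ones_like(grid)                          # ambiguous: any p equally plausible
      print(second_order_entropy(sharp, grid) < second_order_entropy(vague, grid))  # True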

  2. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based approaches and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as future evolution of the river basin, hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis with the aid of a water system model designed to integrate component models for hydrology processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin located at the border of the United States and Canada.

  3. A Bayesian analysis of sensible heat flux estimation: Quantifying uncertainty in meteorological forcing to improve model prediction

    KAUST Repository

    Ershadi, Ali

    2013-05-01

    The influence of uncertainty in land surface temperature, air temperature, and wind speed on the estimation of sensible heat flux is analyzed using a Bayesian inference technique applied to the Surface Energy Balance System (SEBS) model. The Bayesian approach allows for an explicit quantification of the uncertainties in input variables: a source of error generally ignored in surface heat flux estimation. An application using field measurements from the Soil Moisture Experiment 2002 is presented. The spatial variability of selected input meteorological variables in a multitower site is used to formulate the prior estimates for the sampling uncertainties, and the likelihood function is formulated assuming Gaussian errors in the SEBS model. Land surface temperature, air temperature, and wind speed were estimated by sampling their posterior distribution using a Markov chain Monte Carlo algorithm. Results verify that Bayesian-inferred air temperature and wind speed were generally consistent with those observed at the towers, suggesting that local observations of these variables were spatially representative. Uncertainties in the land surface temperature appear to have the strongest effect on the estimated sensible heat flux, with Bayesian-inferred values differing by up to ±5°C from the observed data. These differences suggest that the footprint of the in situ measured land surface temperature is not representative of the larger-scale variability. As such, these measurements should be used with caution in the calculation of surface heat fluxes and highlight the importance of capturing the spatial variability in the land surface temperature: particularly, for remote sensing retrieval algorithms that use this variable for flux estimation.

  4. Economic risk-based analysis: Effect of technical and market price uncertainties on the production of glycerol-based isobutanol

    DEFF Research Database (Denmark)

    Loureiro da Costa Lira Gargalo, Carina; Gernaey, Krist; Sin, Gürkan

    2016-01-01

    to propagate the market price and technical uncertainties to the economic indicator calculations and to quantify the respective economic risk. The results clearly indicated that under the given market price uncertainties, the probability of obtaining a negative NPV is 0.95. This is a very high probability...

  5. Nuclear Data Uncertainty Propagation to Reactivity Coefficients of a Sodium Fast Reactor

    Science.gov (United States)

    Herrero, J. J.; Ochoa, R.; Martínez, J. S.; Díez, C. J.; García-Herranz, N.; Cabellos, O.

    2014-04-01

    The assessment of the uncertainty levels on the design and safety parameters for the innovative European Sodium Fast Reactor (ESFR) is mandatory. Some of these relevant safety quantities are the Doppler and void reactivity coefficients, whose uncertainties are quantified. In addition, the nuclear reaction data for which an improvement would most benefit the design accuracy are identified. This work has been performed with the SCALE 6.1 code suite and its multigroup cross-section library based on the ENDF/B-VII.0 evaluation.

  6. Soil aggregate stability as an indicator for eco-engineering effectiveness?

    Science.gov (United States)

    Graf, Frank

    2015-04-01

    Eco-engineering aims at stabilising soil and slopes by applying technical and biological measures. Engineering structures are commonly well defined, immediately usable and operative, and their stability effects quantifiable and verifiable. Differently, the use of plants requires more restrictive boundary conditions and the protection potential is rarely easily calculable and developing as a function of growth rate. Although the use of vegetation is widely appreciated and their stabilising effect recognised, there is an increasing demand on sound facts on its efficiency, in particular, in relation to time. Conclusively, a certain necessity has been recognised to monitor, assess and quantify the effectiveness of ecological restoration measures in order to facilitate the transfer of technology and knowledge. Recent theoretical models emphasize the importance of taking an integrated monitoring approach that considers multiple variables. However, limited financial and time resources often prevent such comprehensive assessments. A solution to this problem may be to use integrated indicators that reflect multiple aspects and, therefore, allow extensive information on ecosystem status to be gathered in a relatively short time. Among various other indicators, such as fractal dimension of soil particle size distribution or microbiological parameters, soil aggregate stability seems the most appropriate indicator with regard to protecting slopes from superficial soil failure as it is critical to both plant growth and soil structure. Soil aggregation processes play a crucial role in re-establishing soil structure and function and, conclusively, for successful and sustainable re-colonisation. Whereas the key role of soil aggregate stability in ecosystem functioning is well known concerning water, gas, and nutrient fluxes, only limited information is available with regard to soil mechanical and geotechnical aspects. Correspondingly, in the last couple of years several studies

  7. Optimal investment and scheduling of distributed energy resources with uncertainty in electric vehicle driving schedules

    International Nuclear Information System (INIS)

    Cardoso, G.; Stadler, M.; Bozchalui, M.C.; Sharma, R.; Marnay, C.; Barbosa-Póvoa, A.; Ferrão, P.

    2014-01-01

    The large scale penetration of electric vehicles (EVs) will introduce technical challenges to the distribution grid, but also carries the potential for vehicle-to-grid services. Namely, if available in large enough numbers, EVs can be used as a distributed energy resource (DER) and their presence can influence optimal DER investment and scheduling decisions in microgrids. In this work, a novel EV fleet aggregator model is introduced in a stochastic formulation of DER-CAM [1], an optimization tool used to address DER investment and scheduling problems. This is used to assess the impact of EV interconnections on optimal DER solutions considering uncertainty in EV driving schedules. Optimization results indicate that EVs can have a significant impact on DER investments, particularly if considering short payback periods. Furthermore, results suggest that uncertainty in driving schedules carries little significance to total energy costs, which is corroborated by results obtained using the stochastic formulation of the problem. - Highlights: • This paper introduces a new EV aggregator model in the DER-CAM model and expands it with a stochastic formulation. • The model is used to analyze the impact of EVs in DER investment decisions in a large office building. • The uncertainty in EV driving patterns is considered through scenarios based on data from a daily commute driving survey. • Results indicate that EVs have a significant impact in optimal DER decisions, particularly when looking at short payback periods. • Furthermore, results indicate that uncertainty in EV driving schedules has little impact on DER investment decisions

  8. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    International Nuclear Information System (INIS)

    BABA, T.; ISHIGURO, K.; ISHIHARA, Y.; SAWADA, A.; UMEKI, H.; WAKASUGI, K.; WEBB, ERIK K.

    1999-01-01

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment

  9. Error correction in multi-fidelity molecular dynamics simulations using functional uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu

    2017-04-01

    We use functional, Fréchet, derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions as opposed to its parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard–Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
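
    The first-order functional correction underlying the approach above can be sketched as a discretised Fréchet-derivative update; the grids and functions here are placeholders, with the functional derivative assumed to have been extracted from the reference MD run.

      import numpy as np

      def first_order_correction(q_reference, dq_dv, v_reference, v_new, r_grid):
          # Q[V_new] ~ Q[V_ref] + integral of (dQ/dV)(r) * (V_new(r) - V_ref(r)) dr,
          # evaluated by trapezoidal quadrature on a radial grid.
          return q_reference + np.trapz(dq_dv * (v_new - v_reference), r_grid)

      r = np.linspace(0.9, 3.0, 200)                      # placeholder radial grid (sigma units)
      lj = lambda r, eps: 4 * eps * (r ** -12 - r ** -6)  # reference and perturbed Lennard-Jones potentials
      dq_dv = np.exp(-r)                                  # placeholder functional derivative from the reference run
      print(first_order_correction(q_reference=-5.0, dq_dv=dq_dv,
                                   v_reference=lj(r, 1.00), v_new=lj(r, 1.05), r_grid=r))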

  10. Uncertainty of chromatic dispersion estimation from transmitted waveforms in direct detection systems

    Science.gov (United States)

    Lach, Zbigniew T.

    2017-08-01

    A possibility is shown of a non-disruptive estimation of chromatic dispersion in a fiber of an intensity modulation communication line under work conditions. Uncertainty of the chromatic dispersion estimates is analyzed and quantified with the use of confidence intervals.

  11. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    International Nuclear Information System (INIS)

    Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd

    2002-01-01

    The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis is performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities such that large changes in the uncertain input parameters causes small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation

  12. Working fluid selection for organic Rankine cycles - Impact of uncertainty of fluid properties

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Andreasen, Jesper Graa; Liu, Wei

    2016-01-01

    of processmodels and constraints 2) selection of property models, i.e. Penge Robinson equation of state 3)screening of 1965 possible working fluid candidates including identification of optimal process parametersbased on Monte Carlo sampling 4) propagating uncertainty of fluid parameters to the ORC netpower output......This study presents a generic methodology to select working fluids for ORC (Organic Rankine Cycles)taking into account property uncertainties of the working fluids. A Monte Carlo procedure is described as a tool to propagate the influence of the input uncertainty of the fluid parameters on the ORC....... The net power outputs of all the feasible working fluids were ranked including their uncertainties. The method could propagate and quantify the input property uncertainty of the fluidproperty parameters to the ORC model, giving an additional dimension to the fluid selection process. In the given analysis...

  13. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.

  14. Molecular nonlinear dynamics and protein thermal uncertainty quantification

    Science.gov (United States)

    Xia, Kelin; Wei, Guo-Wei

    2014-01-01

    This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365

  15. Effect of monthly areal rainfall uncertainty on streamflow simulation

    Science.gov (United States)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

    Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged to 20% of the mean monthly areal rainfall and rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged to 29% of the mean naturalised flow in calibration and validation and the respective average ranges using stochastic
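
    The "coefficient of efficiency" used to score the simulations above is commonly the Nash–Sutcliffe efficiency; a minimal sketch of that scoring, assuming that is indeed the metric intended:

      import numpy as np

      def nash_sutcliffe(simulated, observed):
          # Coefficient of efficiency: 1 is a perfect fit, 0 is no better than the observed mean.
          simulated = np.asarray(simulated, dtype=float)
          observed = np.asarray(observed, dtype=float)
          return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

      # Hypothetical monthly flows (million cubic metres).
      print(nash_sutcliffe([12.0, 30.5, 22.1, 8.0], [11.2, 33.0, 20.4, 9.1]))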

  16. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

    Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within soil-vegetation-atmosphere continuum with a particular emphasis on how crop phenology and agricultural management practice influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS on a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis leading to the selection of a subset of most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used. First, we quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
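
    The screening-and-PRCC step described above can be sketched as a rank transform followed by partial correlation on regression residuals; this is a generic construction with made-up sample data, not the study's exact implementation.

      import numpy as np
      from scipy import stats

      def prcc(X, y):
          # Partial Ranked Correlation Coefficient of each column of X with y,
          # controlling for the other (rank-transformed) parameters via linear regression.
          Xr = np.apply_along_axis(stats.rankdata, 0, X)
          yr = stats.rankdata(y)
          n, k = Xr.shape
          coeffs = np.empty(k)
          for i in range(k):
              others = np.column_stack([np.ones(n), np.delete(Xr, i, axis=1)])
              res_x = Xr[:, i] - others @ np.linalg.lstsq(others, Xr[:, i], rcond=None)[0]
              res_y = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
              coeffs[i] = np.corrcoef(res_x, res_y)[0, 1]
          return coeffs

      rng = np.random.default_rng(7)
      params = rng.uniform(size=(200, 3))                                       # Monte Carlo parameter sample
      npp = 2.0 * params[:, 0] - 0.5 * params[:, 2] + rng.normal(0, 0.1, 200)   # toy model output
      print(prcc(params, npp))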

  17. Sea Level Forecasts Aggregated from Established Operational Systems

    Directory of Open Access Journals (Sweden)

    Andy Taylor

    2017-08-01

    Full Text Available A system for providing routine seven-day forecasts of sea level observable at tide gauge locations is described and evaluated. Forecast time series are aggregated from well-established operational systems of the Australian Bureau of Meteorology; although following some adjustments these systems are only quasi-complementary. Target applications are routine coastal decision processes under non-extreme conditions. The configuration aims to be relatively robust to operational realities such as version upgrades, data gaps and metadata ambiguities. Forecast skill is evaluated against hourly tide gauge observations. Characteristics of the bias correction term are demonstrated to be primarily static in time, with time varying signals showing regional coherence. This simple approach to exploiting existing complex systems can offer valuable levels of skill at a range of Australian locations. The prospect of interpolation between observation sites and exploitation of lagged-ensemble uncertainty estimates could be meaningfully pursued. Skill characteristics define a benchmark against which new operational sea level forecasting systems can be measured. More generally, an aggregation approach may prove to be optimal for routine sea level forecast services given the physically inhomogeneous processes involved and ability to incorporate ongoing improvements and extensions of source systems.

  18. The Effect of Morphological Characteristic of Coarse Aggregates Measured with Fractal Dimension on Asphalt Mixture’s High-Temperature Performance

    Directory of Open Access Journals (Sweden)

    Hainian Wang

    2016-01-01

    Full Text Available The morphological properties of coarse aggregates, such as shape, angularity, and surface texture, have a great influence on the mechanical performance of asphalt mixtures. This study aims to investigate the effect of coarse aggregate morphological properties on the high-temperature performance of asphalt mixtures. A modified Los Angeles (LA) abrasion test was employed to produce aggregates with various morphological properties by applying abrasion cycles of 0, 200, 400, 600, 800, 1000, and 1200 to crushed angular aggregates. Based on a laboratory-developed Morphology Analysis System for Coarse Aggregates (MASCA), the morphological properties of the coarse aggregate particles were quantified using the index of fractal dimension. The high-temperature performance of dense-graded asphalt mixture (AC-16), gap-graded stone asphalt mixture (SAC-16), and stone mastic asphalt (SMA-16) mixtures containing aggregates with different fractal dimensions was evaluated through the dynamic stability (DS) test and the penetration shear test in the laboratory. Good linear correlations between the fractal dimension and high-temperature indexes were obtained for all three types of mixtures. Moreover, the results also indicated that higher coarse aggregate angularity leads to stronger high-temperature shear resistance of asphalt mixtures.
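
    The MASCA index itself is not described in enough detail here to reproduce; as a generic illustration of how a fractal dimension can be estimated from a binary particle image, the following box-counting sketch (with a toy shape standing in for an aggregate outline) may help. All names and values are assumptions.

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32, 64)):
            """Estimate the box-counting (fractal) dimension of a binary image.
            mask: 2-D boolean array where True marks the particle pixels."""
            counts = []
            for s in sizes:
                h = mask.shape[0] // s * s
                w = mask.shape[1] // s * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                # number of s-by-s boxes containing at least one particle pixel
                counts.append(blocks.any(axis=(1, 3)).sum())
            # slope of log(count) versus log(1/size) estimates the dimension
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        # toy example: a filled disc, whose area mask has a dimension close to 2
        yy, xx = np.mgrid[0:256, 0:256]
        disc = (xx - 128) ** 2 + (yy - 128) ** 2 < 100 ** 2
        print(box_counting_dimension(disc))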

  19. Uncertainty and sensitivity analysis applied to coupled code calculations for a VVER plant transient

    International Nuclear Information System (INIS)

    Langenbuch, S.; Krzykacz-Hausmann, B.; Schmidt, K. D.

    2004-01-01

    The development of coupled codes, combining thermal-hydraulic system codes and 3D neutron kinetics, is an important step toward performing best-estimate plant transient calculations. It is generally agreed that the application of best-estimate methods should be supplemented by an uncertainty and sensitivity analysis to quantify the uncertainty of the results. The paper presents results from the application of the GRS uncertainty and sensitivity method to a VVER-440 plant transient, which was already studied earlier for the validation of coupled codes. For this application, the main steps of the uncertainty method are described. Typical results of the method applied to the analysis of the plant transient by several working groups using different coupled codes are presented and discussed. The results demonstrate the capability of an uncertainty and sensitivity analysis. (authors)

  20. Can Bayesian Belief Networks help tackling conceptual model uncertainties in contaminated site risk assessment?

    DEFF Research Database (Denmark)

    Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. A conceptual model should ... However, different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources of uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here a Bayesian Belief Network (BBN) approach for evaluating the uncertainty in risk assessment of groundwater contamination from contaminated sites. The approach accounts for conceptual model uncertainty by considering multiple conceptual models, each of which represents an alternative interpretation of the site...

  1. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Troldborg, Mads; McKnight, Ursula S.

    2012-01-01

    ... the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site, based on the current level...
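
    A minimal sketch of the multi-model idea, assuming hypothetical per-model mass discharge distributions and subjective model probabilities (all names and numbers are illustrative, not taken from the study):

        import numpy as np

        rng = np.random.default_rng(1)

        # hypothetical lognormal mass-discharge estimates (kg/yr) from three
        # alternative conceptual models: (log-median, log-sd, model probability)
        models = {
            "single_source_homogeneous": (np.log(5.0), 0.4, 0.2),
            "single_source_layered":     (np.log(8.0), 0.6, 0.5),
            "dual_source_layered":       (np.log(12.0), 0.8, 0.3),
        }

        draws = []
        for mu, sigma, p_model in models.values():
            n = int(10000 * p_model)            # sample in proportion to model weight
            draws.append(rng.lognormal(mu, sigma, n))
        combined = np.concatenate(draws)        # multi-model (mixture) distribution

        print("median mass discharge:", np.median(combined))
        print("90% interval:", np.percentile(combined, [5, 95]))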

  2. Uncertainty analysis with a view towards applications in accident consequence assessments

    International Nuclear Information System (INIS)

    Fischer, F.; Erhardt, J.

    1985-09-01

    Since the publication of the US Reactor Safety Study WASH-1400 there has been increasing interest in developing and applying methods that quantify the uncertainty inherent in probabilistic risk assessments (PRAs) and accident consequence assessments (ACAs) for installations of the nuclear fuel cycle. Research and development in this area is driven by the fact that PRA and ACA are increasingly used for comparative, decision-oriented and fact-finding studies initiated by industry and regulatory commissions. This report summarizes and reviews some of the main methods and gives some guidance on performing sensitivity and uncertainty analyses. Some first investigations aimed at applying the methods mentioned above to a submodel of the ACA code UFOMOD (KfK) are presented. Sensitivity analyses and some uncertainty studies of an important submodel of UFOMOD are carried out to identify the relevant parameters for subsequent uncertainty calculations. (orig./HP) [de

  3. Uncertainty Evaluation of the Thermal Expansion of Gd2O3-ZrO2 with a System Calibration Factor

    International Nuclear Information System (INIS)

    Park, Chang Je; Kang, Kweon Ho; Na, Sang Ho; Song, Kee Chan

    2007-01-01

    Both gadolinia (Gd2O3) and zirconia (ZrO2) are widely used in the nuclear industry, including as a burnable absorber and as additives in the fabrication of a simulated fuel. Thermal expansions of a mixture of gadolinia (Gd2O3) 20 mol% and zirconia (ZrO2) 80 mol% were measured using a dilatometer (DIL402C) from room temperature to 1500 °C. Uncertainties in the measurement should be quantified based on statistics. Referring to the ISO (International Organization for Standardization) guide, the uncertainties of the thermal expansion were quantified for three parts - the initial length, the length variation, and the system calibration factor. The whole system, the dilatometer, is composed of many complex sub-systems, and in practice it is difficult to consider all the uncertainties of the sub-systems. Thus, the system calibration factor was introduced with a standard material for the uncertainty evaluation. In this study, a new system calibration factor was formulated in a multiplicative way. Further, the effect of the calibration factor with random deviation was investigated for the uncertainty evaluation of the thermal expansion.
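
    A minimal sketch of the kind of uncertainty budget described above, assuming the thermal expansion is evaluated as eps = k * dL / L0 with a multiplicative system calibration factor k, and that the three contributions are uncorrelated (GUM-style); the numerical values are placeholders, not the measured ones.

        import numpy as np

        # placeholder estimates and standard uncertainties
        L0, u_L0 = 25.000e-3, 5e-6     # initial length (m)
        dL, u_dL = 0.310e-3, 2e-6      # length variation at temperature (m)
        k,  u_k  = 1.000, 0.004        # multiplicative system calibration factor

        eps = k * dL / L0              # thermal expansion (dimensionless)

        # combined relative standard uncertainty for a product/quotient model
        u_rel = np.sqrt((u_k / k) ** 2 + (u_dL / dL) ** 2 + (u_L0 / L0) ** 2)
        u_eps = eps * u_rel

        print(f"expansion = {eps:.5f} +/- {2 * u_eps:.5f} (k=2 expanded uncertainty)")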

  4. Marine Synechococcus Aggregation

    Science.gov (United States)

    Neuer, S.; Deng, W.; Cruz, B. N.; Monks, L.

    2016-02-01

    Cyanobacteria are considered to play an important role in the oceanic biological carbon pump, especially in oligotrophic regions. However, as single cells are too small to sink, their carbon export has to be mediated by aggregate formation and possible consumption by zooplankton producing sinking fecal pellets. Here we report results on the aggregation of the ubiquitous marine pico-cyanobacterium Synechococcus as a model organism. We first investigated the mechanism behind such aggregation by studying the potential role of transparent exopolymeric particles (TEP) and the effects of nutrient (nitrogen or phosphorus) limitation on the TEP production and aggregate formation of these pico-cyanobacteria. We further studied the aggregation and subsequent settling in roller tanks and investigated the effects of the clays kaolinite and bentonite in a series of concentrations. Our results show that despite the lowered growth rates, Synechococcus in nutrient-limited cultures had larger cell-normalized TEP production, formed a greater volume of aggregates, and exhibited higher settling velocities compared to replete cultures. In addition, we found that despite their small size and lack of natural ballasting minerals, Synechococcus cells could still form aggregates and sink at measurable velocities in seawater. Clay minerals increased the number and reduced the size of aggregates, and their ballasting effects increased the sinking velocity and carbon export potential of aggregates. In comparison with Synechococcus, we will also present results on the aggregation of the pico-cyanobacterium Prochlorococcus in roller tanks. These results contribute to our understanding of the physiology of marine Synechococcus as well as their role in the ecology and biogeochemistry of oligotrophic oceans.

  5. Can reduction of uncertainties in cervix cancer brachytherapy potentially improve clinical outcome?

    DEFF Research Database (Denmark)

    Nesvacil, Nicole; Tanderup, Kari; Lindegaard, Jacob C

    2016-01-01

    AIM: The aim of this study was to quantify the impact of different types and magnitudes of dosimetric uncertainties in cervix cancer brachytherapy (BT) on tumour control probability (TCP) and normal tissue complication probability (NTCP) curves. MATERIALS AND METHODS: A dose-response simulation...

  6. Quantifying geological uncertainty in metamorphic phase equilibria modelling; a Monte Carlo assessment and implications for tectonic interpretations

    Directory of Open Access Journals (Sweden)

    Richard M. Palin

    2016-07-01

    Full Text Available Pseudosection modelling is rapidly becoming an essential part of a petrologist's toolkit and often forms the basis of interpreting the tectonothermal evolution of a rock sample, outcrop, or geological region. Of the several factors that can affect the accuracy and precision of such calculated phase diagrams, “geological” uncertainty related to natural petrographic variation at the hand sample- and/or thin section-scale is rarely considered. Such uncertainty influences the sample's bulk composition, which is the primary control on its equilibrium phase relationships and thus the interpreted pressure–temperature (P–T) conditions of formation. Two case study examples—a garnet–cordierite granofels and a garnet–staurolite–kyanite schist—are used to compare the relative importance that geological uncertainty has on bulk compositions determined via (1) X-ray fluorescence (XRF) or (2) point counting techniques. We show that only minor mineralogical variation at the thin-section scale propagates through the phase equilibria modelling procedure and affects the absolute P–T conditions at which key assemblages are stable. Absolute displacements of equilibria can approach ±1 kbar for only a moderate degree of modal proportion uncertainty, thus being essentially similar to the magnitudes reported for analytical uncertainties in conventional thermobarometry. Bulk compositions determined from multiple thin sections of a heterogeneous garnet–staurolite–kyanite schist show a wide range in major-element oxides, owing to notable variation in mineral proportions. Pseudosections constructed for individual point count-derived bulks accurately reproduce this variability on a case-by-case basis, though averaged proportions do not correlate with those calculated at equivalent peak P–T conditions for a whole-rock XRF-derived bulk composition. The main discrepancies relate to varying proportions of matrix phases (primarily mica relative to

  7. Quantifying confidence in density functional theory predictions of magnetic ground states

    Science.gov (United States)

    Houchins, Gregory; Viswanathan, Venkatasubramanian

    2017-10-01

    Density functional theory (DFT) simulations, at the generalized gradient approximation (GGA) level, are being routinely used for material discovery based on high-throughput descriptor-based searches. The success of descriptor-based material design relies on eliminating bad candidates and keeping good candidates for further investigation. While DFT has been widely successful for the former, oftentimes good candidates are lost due to the uncertainty associated with the DFT-predicted material properties. Uncertainty associated with DFT predictions has gained prominence and has led to the development of exchange correlation functionals that have built-in error estimation capability. In this work, we demonstrate the use of the built-in error estimation capabilities within the BEEF-vdW exchange correlation functional for quantifying the uncertainty associated with the magnetic ground state of solids. We demonstrate this approach by calculating the uncertainty estimate for the energy difference between the different magnetic states of solids and compare them against a range of GGA exchange correlation functionals, as is done in many first-principles calculations of materials. We show that this estimate reasonably bounds the range of values obtained with the different GGA functionals. The estimate is determined as a postprocessing step and thus provides a computationally robust and systematic approach to estimating the uncertainty associated with predictions of magnetic ground states. We define a confidence value (c-value) that incorporates all calculated magnetic states in order to quantify the concurrence of the prediction at the GGA level, and argue that predictions of magnetic ground states from GGA-level DFT are incomplete without an accompanying c-value. We demonstrate the utility of this method using a case study of Li-ion and Na-ion cathode materials, and the c-value metric correctly identifies that GGA-level DFT will have low predictability for NaFePO4F. Further, there
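
    As a rough illustration of the confidence-value idea (our reading of the abstract, not the authors' code), the sketch below takes a hypothetical BEEF-vdW-style ensemble of energy differences between ferromagnetic and antiferromagnetic states and reports the fraction of ensemble members that agree with the best-estimate ground state.

        import numpy as np

        rng = np.random.default_rng(42)

        # hypothetical ensemble of E(FM) - E(AFM) energy differences (eV per formula
        # unit), standing in for a BEEF-vdW error-estimation ensemble
        dE_ensemble = rng.normal(loc=-0.020, scale=0.030, size=2000)

        best_estimate = dE_ensemble.mean()
        ground_state = "FM" if best_estimate < 0 else "AFM"

        # c-value: fraction of ensemble members predicting the same ground state
        c_value = np.mean(np.sign(dE_ensemble) == np.sign(best_estimate))

        print(ground_state, f"c-value = {c_value:.2f}")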

  8. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    Science.gov (United States)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used in the calibration of the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in Southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; thus the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station

  9. Quantifying the Uncertainty in High Spatial and Temporal Resolution Synthetic Land Surface Reflectance at Pixel Level Using Ground-Based Measurements

    Science.gov (United States)

    Kong, J.; Ryu, Y.

    2017-12-01

    Algorithms for fusing high temporal frequency and high spatial resolution satellite images are widely used to develop dense time-series land surface observations. While many studies have shown that the synthesized frequent high spatial resolution images can be successfully applied in vegetation mapping and monitoring, validation and correction of fused images have received less attention than they deserve. To evaluate the precision of fused images at the pixel level, in-situ reflectance measurements that account for pixel-level heterogeneity are necessary. In this study, synthetic images of land surface reflectance were predicted from coarse high-frequency images acquired from MODIS and high spatial resolution images from Landsat-8 OLI using the Flexible Spatiotemporal Data Fusion (FSDAF) method. Ground-based reflectance was measured with a JAZ spectrometer (Ocean Optics, Dunedin, FL, USA) over a rice paddy during five main growth stages in Cheorwon-gun, Republic of Korea, where the landscape heterogeneity changes through the growing season. After analyzing the spatial heterogeneity and seasonal variation of land surface reflectance based on the ground measurements, the uncertainties of the fused images were quantified at the pixel level. Finally, this relationship was applied to correct the fused reflectance images and build a seasonal time series of rice paddy surface reflectance. This dataset could be valuable for rice planting area extraction, phenological stage detection, and variable estimation.

  10. Global impact of uncertainties in China’s gas market

    International Nuclear Information System (INIS)

    Xunpeng, Shi; Variam, Hari Malamakkavu Padinjare; Tao, Jacqueline

    2017-01-01

    This paper examines the uncertainties in Chinese gas markets, analyzes their causes and quantifies their impact on the world gas market. A literature review found significant variability among the outlooks on China's gas sector. Further assessment found that uncertainties in economic growth, structural change in markets, environmental regulations, prices and institutional changes contribute to this variability. The analysis of China's demand and supply uncertainties with a world gas-trading model found significant changes in global production, trade patterns and spot prices, with pipeline exporters being most affected. China's domestic production and pipeline imports from Central Asia are the major buffers that can offset much of the uncertainty. The study finds an asymmetric phenomenon: pipeline imports respond to China's uncertainties in both the low and high demand scenarios, while LNG imports respond only to the high demand scenario. The major reasons are higher TOP levels and the current practice of importing only up to the minimum TOP levels for LNG, as well as a lack of liberalized gas markets. The study shows that it is necessary to create LNG markets that can respond to market dynamics, through either a reduction of TOP levels or a change of pricing mechanisms to hub indexation. - Highlights: • Economic growth, regulations, reforms and shale gas cause the uncertainties. • Pipeline exporters to China and Southeast Asian and Australian LNG exporters are affected the most. • China's domestic production and pipeline imports offset much of the uncertainties. • Pipeline imports respond to China's uncertainties in both low and high demand. • LNG imports respond only to the high demand scenario.

  11. Evaluation of Fatigue Crack Propagation of Gears Considering Uncertainties in Loading and Material Properties

    Directory of Open Access Journals (Sweden)

    Haileyesus B. Endeshaw

    2017-11-01

    Full Text Available Failure prediction of wind turbine gearboxes (WTGs) is especially important since the maintenance of these components is not only costly but also causes the longest downtime. One of the most common causes of premature WTG failure is the fatigue fracture of gear teeth due to the fluctuating and cyclic torque, resulting from stochastic wind loading, transmitted to the gearbox. Moreover, the fluctuation of the torque, as well as the inherent uncertainties of the material properties, results in uncertain life predictions for WTGs. It is therefore essential to quantify these uncertainties in the life estimation of gears. In this paper, a framework constituted by a dynamic model of a one-stage gearbox, a finite element method, and a degradation model for the estimation of fatigue crack propagation in a gear tooth is presented. Torque time history data of a wind turbine rotor were scaled and used to simulate the stochastic characteristics of the loading, and uncertainties in the material constants of the degradation model were also quantified. It was demonstrated that uncertainty quantification of the load and material constants provides a reasonable estimation of the distribution of the crack length in the gear tooth at any time step.
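
    A minimal sketch of propagating load and material uncertainty through a crack-growth law, assuming a Paris-type model da/dN = C (dK)^m with lognormal C, normal m and a normally distributed stress range; the distributions and geometry factor are illustrative placeholders, not the study's calibrated values.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20_000

        C  = rng.lognormal(np.log(1e-12), 0.3, n)   # Paris coefficient (m/cycle, MPa*sqrt(m) units)
        m  = rng.normal(3.0, 0.1, n)                # Paris exponent
        dS = rng.normal(300.0, 30.0, n)             # tooth-root stress range (MPa)

        a0, a_c, Y = 0.5e-3, 5.0e-3, 1.12           # initial/critical crack length (m), geometry factor

        # closed-form integration of da/dN = C*(Y*dS*sqrt(pi*a))**m from a0 to a_c (valid for m != 2)
        k = C * (Y * dS * np.sqrt(np.pi)) ** m
        N_f = (a_c ** (1 - m / 2) - a0 ** (1 - m / 2)) / (k * (1 - m / 2))

        print("median cycles to critical crack length:", np.median(N_f))
        print("5th-95th percentile:", np.percentile(N_f, [5, 95]))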

  12. Task Uncertainty Can Account for Mixing and Switch Costs in Task-Switching

    Science.gov (United States)

    Rennie, Jaime L.

    2015-01-01

    Cognitive control is required in situations that involve uncertainty or change, such as when resolving conflict, selecting responses and switching tasks. Recently, it has been suggested that cognitive control can be conceptualised as a mechanism which prioritises goal-relevant information to deal with uncertainty. This hypothesis has been supported using a paradigm that requires conflict resolution. In this study, we examine whether cognitive control during task switching is also consistent with this notion. We used information theory to quantify the level of uncertainty in different trial types during a cued task-switching paradigm. We test the hypothesis that differences in uncertainty between task repeat and task switch trials can account for typical behavioural effects in task-switching. Increasing uncertainty was associated with less efficient performance (i.e., slower and less accurate), particularly on switch trials and trials that afford little opportunity for advance preparation. Interestingly, both mixing and switch costs were associated with a common episodic control process. These results support the notion that cognitive control may be conceptualised as an information processor that serves to resolve uncertainty in the environment. PMID:26107646
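
    As a toy illustration of quantifying trial uncertainty with information theory (our own sketch, not the paper's analysis code), the snippet below computes the Shannon entropy of hypothetical outcome probabilities for repeat versus switch trials.

        import numpy as np

        def shannon_entropy(p):
            """Entropy in bits of a discrete probability distribution."""
            p = np.asarray(p, dtype=float)
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        # hypothetical probabilities over possible task/response alternatives
        repeat_trial = [0.70, 0.15, 0.10, 0.05]   # well predicted: low uncertainty
        switch_trial = [0.30, 0.30, 0.25, 0.15]   # poorly predicted: high uncertainty

        print("repeat-trial uncertainty:", shannon_entropy(repeat_trial), "bits")
        print("switch-trial uncertainty:", shannon_entropy(switch_trial), "bits")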

  13. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  14. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R2 ...), offering insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
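
    A minimal sketch of a sampling-based sensitivity analysis with standardized regression coefficients (SRCs), using a made-up three-input linear toy model in place of the benchmark WWTP simulation; the input names and values are assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 1000

        # Monte Carlo sample of three hypothetical uncertain inputs
        X = np.column_stack([
            rng.normal(0.67, 0.07, n),    # e.g. a yield coefficient
            rng.normal(0.24, 0.03, n),    # e.g. a decay rate (1/d)
            rng.normal(200.0, 20.0, n),   # e.g. influent COD (g/m3)
        ])
        # stand-in for the plant model output (e.g. an effluent quality index)
        y = 3.0 * X[:, 0] - 1.5 * X[:, 1] + 0.01 * X[:, 2] + rng.normal(0, 0.05, n)

        # standardize inputs and output, then fit a linear regression;
        # the coefficients of the standardized model are the SRCs
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        beta = np.linalg.lstsq(np.column_stack([np.ones(n), Xs]), ys, rcond=None)[0]
        src = beta[1:]

        print("SRCs:", src)
        print("sum of squared SRCs (close to R2 for a near-linear model):", np.sum(src ** 2))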

  15. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias, the root-mean-square error, as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, it is shown how the estimated uncertainties can be implemented in probabilistic reliability assessments.
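
    A small sketch of the validation statistics named above (bias, root-mean-square error and scatter index) applied to made-up significant wave height data; the definitions follow common usage and may differ in detail from the paper's.

        import numpy as np

        def validation_stats(model, obs):
            """Bias, RMSE and scatter index of model values against observations."""
            model, obs = np.asarray(model, float), np.asarray(obs, float)
            bias = np.mean(model - obs)
            rmse = np.sqrt(np.mean((model - obs) ** 2))
            scatter_index = rmse / np.mean(obs)   # one common normalisation
            return bias, rmse, scatter_index

        # made-up significant wave height series (m): observations and model output
        obs   = np.array([1.2, 1.8, 2.5, 3.1, 2.2, 1.6, 2.9, 3.4])
        model = np.array([1.1, 1.9, 2.8, 3.0, 2.5, 1.5, 3.2, 3.6])

        bias, rmse, si = validation_stats(model, obs)
        print(f"bias = {bias:.2f} m, RMSE = {rmse:.2f} m, SI = {si:.2f}")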

  16. Managing Groundwater Recharge and Pumping for Late Summer Streamflow Increases: Quantifying Uncertainty Using Null Space Monte Carlo

    Science.gov (United States)

    Tolley, D. G., III; Foglia, L.; Harter, T.

    2017-12-01

    Late summer and early fall streamflow decreases caused by climate change and agricultural pumping contribute to increased water temperatures and result in large disconnected sections during dry years in many semi-arid regions with Mediterranean climate. This negatively impacts the aquatic habitat of fish species such as coho and fall-run Chinook salmon. In collaboration with local stakeholders, the Scott Valley Integrated Hydrologic Model (SVIHMv3) was developed to assess future water management scenarios with the goal of improving aquatic species habitat while maintaining agricultural production in the valley. The Null Space Monte Carlo (NSMC) method available in PEST was used to quantify the range of predicted streamflow changes for three conjunctive use scenarios: 1) managed aquifer recharge (MAR), 2) in lieu recharge (ILR, substituting surface-water irrigation for irrigation with groundwater while flows are available), and 3) MAR + ILR. Random parameter sets were generated using the calibrated covariance matrix of the model; these sets were then recalibrated if the sum of squared residuals was greater than 10% of the original sum of squared weighted residuals. These calibration-constrained stochastic parameter sets were then used to obtain a distribution of streamflow changes resulting from implementing the conjunctive use scenarios. Preliminary results show that while the range of streamflow increases using managed aquifer recharge is much narrower (i.e., greater degree of certainty) than for in lieu recharge, there are potentially much greater benefits to streamflow from implementing in lieu recharge (although also greater costs). Combining the two scenarios provides the greatest benefit for increasing late summer and early fall streamflow, as most of the MAR streamflow increases occur during the spring and early summer, which ILR is able to take advantage of. Incorporation of uncertainty into model predictions is critical for establishing and maintaining stakeholder trust
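
    A highly simplified sketch of the calibration-constrained sampling idea behind NSMC (not the PEST implementation): draw candidate parameter sets from the calibrated covariance matrix and keep only those whose objective function stays close to the calibrated minimum. All functions, names and numbers below are hypothetical.

        import numpy as np

        rng = np.random.default_rng(11)

        def objective(theta):
            """Stand-in for the model's sum of squared weighted residuals."""
            return float(np.sum(((theta - np.array([2.0, 0.5])) / np.array([0.2, 0.05])) ** 2))

        theta_cal = np.array([2.0, 0.5])              # calibrated parameter values
        cov_cal = np.array([[0.04, 0.002],            # calibrated covariance matrix
                            [0.002, 0.0025]])
        phi_min = objective(theta_cal)
        threshold = phi_min + 0.10 * max(phi_min, 1.0)   # illustrative acceptance threshold

        candidates = rng.multivariate_normal(theta_cal, cov_cal, size=5000)
        accepted = np.array([th for th in candidates if objective(th) <= threshold])

        # the accepted sets would then drive the scenario runs that bound the
        # predicted streamflow changes
        print("accepted", len(accepted), "of", len(candidates), "parameter sets")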

  17. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    Science.gov (United States)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify the uncertainty in calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty in the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries has been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper gives an overview of the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data, based upon a library of covariance data taken from the JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty in k-effective for these systems arising from the uncertainty in the input nuclear data.
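
    A minimal illustration of the Latin hypercube sampling step named above, drawing stratified samples of two hypothetical nuclear-data perturbation factors from normal distributions; the means and standard uncertainties are toy values unrelated to the JEFF evaluations.

        import numpy as np
        from scipy.stats import norm

        def latin_hypercube(n_samples, n_dims, rng):
            """Latin hypercube sample on the unit hypercube: one point per stratum."""
            strata = np.tile(np.arange(n_samples), (n_dims, 1))
            u = rng.permuted(strata, axis=1).T + rng.uniform(size=(n_samples, n_dims))
            return u / n_samples

        rng = np.random.default_rng(5)
        u = latin_hypercube(100, 2, rng)

        # map the uniform strata to normal perturbation factors for two toy cross sections
        mean  = np.array([1.00, 1.00])          # nominal multiplicative factors
        sigma = np.array([0.02, 0.05])          # assumed relative standard uncertainties
        perturbed_sets = norm.ppf(u) * sigma + mean   # 100 perturbed nuclear-data sets

        print(perturbed_sets[:5])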

  18. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    In recent years there has been increasing interest in computational reactor safety analysis in replacing conservative evaluation model calculations by best estimate calculations supplemented by an uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best estimate thermal-hydraulic code calculations; otherwise, single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented, together with applications to a large break loss of coolant accident on a reference reactor as well as to an experiment simulating containment behaviour
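
    The GRS approach is commonly associated with nonparametric (Wilks-type) tolerance limits, in which the number of code runs is chosen so that the largest computed value bounds a given fraction of outcomes at a given confidence level; the snippet below computes that minimum sample size under this reading, which is our assumption and is not stated explicitly in the abstract.

        # Minimum number of code runs N such that the maximum of N runs is a one-sided
        # upper tolerance limit covering fraction `gamma` of outcomes with confidence
        # `beta` (first-order Wilks formula: 1 - gamma**N >= beta).
        def wilks_runs(gamma=0.95, beta=0.95):
            n = 1
            while 1.0 - gamma ** n < beta:
                n += 1
            return n

        print(wilks_runs())            # 59 runs for a 95%/95% one-sided statement
        print(wilks_runs(0.95, 0.99))  # more runs for higher confidence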

  19. Defining and systematic analyses of aggregation indices to evaluate degree of calcium oxalate crystal aggregation

    Science.gov (United States)

    Chaiyarit, Sakdithep; Thongboonkerd, Visith

    2017-12-01

    Crystal aggregation is one of the most crucial steps in kidney stone pathogenesis. However, previous studies of crystal aggregation are rare, and quantitative analysis of the degree of aggregation has been handicapped by the lack of a standard measurement. We thus performed an in vitro assay to generate aggregation of calcium oxalate monohydrate (COM) crystals at various concentrations (25-800 µg/ml) in saturated aggregation buffer. The crystal aggregates were analyzed by microscopic examination, UV-visible spectrophotometry, and GraphPad Prism6 software to define a total of 12 aggregation indices (including number of aggregates, aggregated mass index, optical density, aggregation coefficient, span, number of aggregates at plateau time-point, aggregated area index, aggregated diameter index, aggregated symmetry index, time constant, half-life, and rate constant). The data showed a linear correlation between crystal concentration and almost all of these indices, except for the rate constant. Among these, the number of aggregates provided the greatest regression coefficient (r = 0.997), and several other indices showed comparably strong correlations (r = 0.993, -0.993 and 0.991; p < 0.001). These five indices are thus recommended as the most appropriate indices for quantitative analysis of COM crystal aggregation in vitro.

  20. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    Science.gov (United States)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. The direct variance method was adopted to analyze the manner in which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and has a considerable percentage of uncertainty contribution in monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang

  1. Quantifying Key Climate Parameter Uncertainties Using an Earth System Model with a Dynamic 3D Ocean

    Science.gov (United States)

    Olson, R.; Sriver, R. L.; Goes, M. P.; Urban, N.; Matthews, D.; Haran, M.; Keller, K.

    2011-12-01

    Climate projections hinge critically on uncertain climate model parameters such as climate sensitivity, vertical ocean diffusivity and anthropogenic sulfate aerosol forcings. Climate sensitivity is defined as the equilibrium global mean temperature response to a doubling of atmospheric CO2 concentrations. Vertical ocean diffusivity parameterizes sub-grid scale ocean vertical mixing processes. These parameters are typically estimated using Intermediate Complexity Earth System Models (EMICs) that lack a full 3D representation of the oceans, thereby neglecting the effects of mixing on ocean dynamics and meridional overturning. We improve on these studies by employing an EMIC with a dynamic 3D ocean model to estimate these parameters. We carry out historical climate simulations with the University of Victoria Earth System Climate Model (UVic ESCM), varying parameters that affect climate sensitivity, vertical ocean mixing, and the effects of anthropogenic sulfate aerosols. We use a Bayesian approach whereby the likelihood of each parameter combination depends on how well the model simulates surface air temperature and upper ocean heat content. We use a Gaussian process emulator to interpolate the model output to arbitrary parameter settings. We use a Markov chain Monte Carlo method to estimate the posterior probability distribution function (pdf) of these parameters. We explore the sensitivity of the results to prior assumptions about the parameters. In addition, we estimate the relative skill of different observations to constrain the parameters. We quantify the uncertainty in parameter estimates stemming from climate variability, model and observational errors. We explore the sensitivity of key decision-relevant climate projections to these parameters. We find that climate sensitivity and vertical ocean diffusivity estimates are consistent with previously published results. The climate sensitivity pdf is strongly affected by the prior assumptions, and by the scaling
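
    A toy sketch of the emulator-plus-calibration idea, using scikit-learn's Gaussian process regressor on a made-up one-parameter "model" and a simple grid evaluation of the posterior instead of MCMC; everything here (the stand-in model, the pseudo-observation, the flat prior) is an illustrative assumption, far removed from the UVic ESCM setup.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        def expensive_model(sensitivity):
            """Stand-in for a climate model run: warming as a function of climate sensitivity."""
            return 0.8 * sensitivity + 0.1 * sensitivity ** 2

        # a handful of 'model runs' at selected parameter values (the design)
        design = np.linspace(1.5, 6.0, 8).reshape(-1, 1)
        runs = expensive_model(design).ravel()

        # fit a GP emulator so the response can be interpolated to arbitrary settings
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
        gp.fit(design, runs)

        # grid-based posterior: compare emulated output with a pseudo-observation
        obs, obs_sigma = 3.2, 0.3
        grid = np.linspace(1.5, 6.0, 500).reshape(-1, 1)
        pred, pred_sd = gp.predict(grid, return_std=True)
        log_post = -0.5 * (pred - obs) ** 2 / (obs_sigma ** 2 + pred_sd ** 2)   # flat prior assumed
        post = np.exp(log_post - log_post.max())
        post /= post.sum()
        print("posterior mean sensitivity:", float((grid.ravel() * post).sum()))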

  2. UNCERTAINTIES IN GALACTIC CHEMICAL EVOLUTION MODELS

    International Nuclear Information System (INIS)

    Côté, Benoit; Ritter, Christian; Herwig, Falk; O’Shea, Brian W.; Pignatari, Marco; Jones, Samuel; Fryer, Chris L.

    2016-01-01

    We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions, along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. The uncertainty in our chemical evolution model

  3. Final report on uncertainties in the detection, measurement, and analysis of selected features pertinent to deep geologic repositories

    International Nuclear Information System (INIS)

    1978-01-01

    Uncertainties with regard to many facets of repository site characterization have not yet been quantified. This report summarizes the state of knowledge of uncertainties in the measurement of porosity, hydraulic conductivity, and hydraulic gradient; uncertainties associated with various geophysical field techniques; and uncertainties associated with the effects of exploration and exploitation activities in bedded salt basins. The potential for seepage through a depository in bedded salt or shale is reviewed and, based upon the available data, generic values for the hydraulic conductivity and porosity of bedded salt and shale are proposed

  4. Sustainable aggregates production : green applications for aggregate by-products.

    Science.gov (United States)

    2015-06-01

    Increased emphasis in the construction industry on sustainability and recycling requires production of aggregate gradations with lower dust (cleaner aggregates) and smaller maximum sizes; hence, an increased amount of quarry by-products (QBs). QBs ...

  5. The Findings from the OECD/NEA/CSNI UMS (Uncertainty Method Study)

    International Nuclear Information System (INIS)

    D'Auria, F.; Glaeser, H.

    2013-01-01

    Within licensing procedures there is the incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI (Committee on the Safety of Nuclear Installations) of OECD/NEA (Organization for Economic Cooperation and Development / Nuclear Energy Agency), has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP 5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges. A 'bifurcation' analysis was also performed by the same research group, providing another way of interpreting the high temperature peak calculated by two of the participants. (authors)

  6. Uncertainty and variability in computational and mathematical models of cardiac physiology.

    Science.gov (United States)

    Mirams, Gary R; Pathmanathan, Pras; Gray, Richard A; Challenor, Peter; Clayton, Richard H

    2016-12-01

    Mathematical and computational models of cardiac physiology have been an integral component of cardiac electrophysiology since its inception, and are collectively known as the Cardiac Physiome. We identify and classify the numerous sources of variability and uncertainty in model formulation, parameters and other inputs that arise from both natural variation in experimental data and lack of knowledge. The impact of uncertainty on the outputs of Cardiac Physiome models is not well understood, and this limits their utility as clinical tools. We argue that incorporating variability and uncertainty should be a high priority for the future of the Cardiac Physiome. We suggest investigating the adoption of approaches developed in other areas of science and engineering while recognising unique challenges for the Cardiac Physiome; it is likely that novel methods will be necessary that require engagement with the mathematics and statistics community. The Cardiac Physiome effort is one of the most mature and successful applications of mathematical and computational modelling for describing and advancing the understanding of physiology. After five decades of development, physiological cardiac models are poised to realise the promise of translational research via clinical applications such as drug development and patient-specific approaches as well as ablation, cardiac resynchronisation and contractility modulation therapies. For models to be included as a vital component of the decision process in safety-critical applications, rigorous assessment of model credibility will be required. This White Paper describes one aspect of this process by identifying and classifying sources of variability and uncertainty in models as well as their implications for the application and development of cardiac models. We stress the need to understand and quantify the sources of variability and uncertainty in model inputs, and the impact of model structure and complexity and their consequences for

  7. Droplet number uncertainties associated with CCN: an assessment using observations and a global model adjoint

    Directory of Open Access Journals (Sweden)

    R. H. Moore

    2013-04-01

    Full Text Available We use the Global Modelling Initiative (GMI) chemical transport model with a cloud droplet parameterisation adjoint to quantify the sensitivity of cloud droplet number concentration to uncertainties in predicting CCN concentrations. Published CCN closure uncertainties for six different sets of simplifying compositional and mixing state assumptions are used as proxies for modelled CCN uncertainty arising from application of those scenarios. It is found that cloud droplet number concentrations (Nd) are fairly insensitive to the number concentration (Na) of aerosol which act as CCN over the continents (∂lnNd/∂lnNa ~ 10–30%), but the sensitivities exceed 70% in pristine regions such as the Alaskan Arctic and remote oceans. This means that CCN concentration uncertainties of 4–71% translate into only 1–23% uncertainty in cloud droplet number, on average. Since most of the anthropogenic indirect forcing is concentrated over the continents, this work shows that the application of Köhler theory and attendant simplifying assumptions in models is not a major source of uncertainty in predicting cloud droplet number or anthropogenic aerosol indirect forcing for the liquid, stratiform clouds simulated in these models. However, it does highlight the sensitivity of some remote areas to pollution brought into the region via long-range transport (e.g., biomass burning) or from seasonal biogenic sources (e.g., phytoplankton as a source of dimethylsulfide in the southern oceans). Since these transient processes are not captured well by the climatological emissions inventories employed by current large-scale models, the uncertainties in aerosol-cloud interactions during these events could be much larger than those uncovered here. This finding motivates additional measurements in these pristine regions, for which few observations exist, to quantify the impact (and associated uncertainty) of transient aerosol processes on cloud properties.

  8. The sensitivity analysis as a method of quantifying the degree of uncertainty

    Directory of Open Access Journals (Sweden)

    Manole Tatiana

    2013-01-01

    Full Text Available In this article the author discusses the uncertainty inherent in any proposed investment or government policy. Given this uncertainty, proposed projects must be analysed so that the most advantageous project can be selected from among multiple alternatives. This is a general principle. Financial science provides researchers with a set of tools for identifying the best project. The author examines three projects with the same features, applying to them various methods of financial analysis, such as net present value (NPV), the discount rate (SAR), recovery time (TR), additional income (VS) and return on investment (RR). All these tools of financial analysis belong to cost-benefit analysis (CBA) and aim to make efficient use of the public money invested so as to achieve successful performance.
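
    A small sketch of the net present value and discounted payback calculations mentioned above, with made-up cash flows and discount rate:

        def npv(rate, cash_flows):
            """Net present value; cash_flows[0] is the initial outlay at t = 0."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        def discounted_payback(rate, cash_flows):
            """First period in which the cumulative discounted cash flow turns non-negative."""
            cumulative = 0.0
            for t, cf in enumerate(cash_flows):
                cumulative += cf / (1.0 + rate) ** t
                if cumulative >= 0:
                    return t
            return None   # never recovered within the horizon

        # made-up project: 1000 invested now, 300 returned per year for 5 years, 8% discount rate
        flows = [-1000, 300, 300, 300, 300, 300]
        print("NPV:", round(npv(0.08, flows), 2))
        print("discounted payback (years):", discounted_payback(0.08, flows))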

  9. CSAU (code scaling, applicability and uncertainty), a tool to prioritize advanced reactor research

    International Nuclear Information System (INIS)

    Wilson, G.E.; Boyack, B.E.

    1990-01-01

    Best Estimate computer codes have been accepted by the US Nuclear Regulatory Commission as an optional tool for performing safety analysis related to the licensing and regulation of current nuclear reactors producing commercial electrical power, providing their uncertainty is quantified. In support of this policy change, the NRC and its contractors and consultants have developed and demonstrated an uncertainty quantification methodology called CSAU. At the process level, the method is generic to any application which relies on best estimate computer code simulations to determine safe operating margins. The primary use of the CSAU methodology is to quantify safety margins for existing designs; however, the methodology can also serve an equally important role in advanced reactor research for plants not yet built. Applied early, during the period when alternate designs are being evaluated, the methodology can identify the relative importance of the sources of uncertainty in the knowledge of each plant behavior and, thereby, help prioritize the research needed to bring the new designs to fruition. This paper describes the CSAU methodology, at the generic process level, and provides the general principles whereby it may be applied to evaluations of advanced reactor designs. 9 refs., 1 fig., 1 tab

  10. Charm quark mass with calibrated uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Erler, Jens [Universidad Nacional Autonoma de Mexico, Instituto de Fisica, Mexico, DF (Mexico); Masjuan, Pere [Universitat Autonoma de Barcelona, Grup de Fisica Teorica, Departament de Fisica, Barcelona (Spain); Institut de Fisica d' Altes Energies (IFAE), The Barcelona Institute of Science and Technology (BIST), Barcelona (Spain); Spiesberger, Hubert [Johannes Gutenberg-Universitaet, PRISMA Cluster of Excellence, Institut fuer Physik, Mainz (Germany); University of Cape Town, Centre for Theoretical and Mathematical Physics and Department of Physics, Rondebosch (South Africa)

    2017-02-15

    We determine the charm quark mass m_c from QCD sum rules of the moments of the vector current correlator calculated in perturbative QCD at O(α_s^3). Only experimental data for the charm resonances below the continuum threshold are needed in our approach, while the continuum contribution is determined by requiring self-consistency between various sum rules, including the one for the zeroth moment. Existing data from the continuum region can then be used to bound the theoretical uncertainty. Our result is m_c(m_c) = 1272 ± 8 MeV for α_s(M_Z) = 0.1182, where the central value is in very good agreement with other recent determinations based on the relativistic sum rule approach. On the other hand, there is considerably less agreement regarding the theory-dominated uncertainty, and we pay special attention to the question of how to quantify and justify it. (orig.)

  11. Estimate of the uncertainties in the relative risk of secondary malignant neoplasms following proton therapy and intensity-modulated photon therapy

    International Nuclear Information System (INIS)

    Fontenot, Jonas D; Bloch, Charles; Followill, David; Titt, Uwe; Newhauser, Wayne D

    2010-01-01

    Theoretical calculations have shown that proton therapy can reduce the incidence of radiation-induced secondary malignant neoplasms (SMN) compared with photon therapy for patients with prostate cancer. However, the uncertainties associated with calculations of SMN risk had not been assessed. The objective of this study was to quantify the uncertainties in projected risks of secondary cancer following contemporary proton and photon radiotherapies for prostate cancer. We performed a rigorous propagation of errors and several sensitivity tests to estimate the uncertainty in the ratio of relative risk (RRR) due to the largest contributors to the uncertainty: the radiation weighting factor for neutrons, the dose-response model for radiation carcinogenesis and interpatient variations in absorbed dose. The interval of values for the radiation weighting factor for neutrons and the dose-response model were derived from the literature, while interpatient variations in absorbed dose were taken from actual patient data. The influence of each parameter on a baseline RRR value was quantified. Our analysis revealed that the calculated RRR was insensitive to the largest contributors to the uncertainty. Uncertainties in the radiation weighting factor for neutrons, the shape of the dose-risk model and interpatient variations in therapeutic and stray doses introduced a total uncertainty of 33% to the baseline RRR calculation.

  12. A simplified analysis of uncertainty propagation in inherently controlled ATWS events

    International Nuclear Information System (INIS)

    Wade, D.C.

    1987-01-01

    The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulic and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase from power reduction and the reactivity decrease from core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect

  13. Management and minimisation of uncertainties and errors in numerical aerodynamics results of the German collaborative project MUNA

    CERN Document Server

    Barnewitz, Holger; Fritz, Willy; Thiele, Frank

    2013-01-01

    This volume reports results from the German research initiative MUNA (Management and Minimization of Errors and Uncertainties in Numerical Aerodynamics), which combined development activities of the German Aerospace Center (DLR), German universities and German aircraft industry. The main objective of this five year project was the development of methods and procedures aiming at reducing various types of uncertainties that are typical of numerical flow simulations. The activities were focused on methods for grid manipulation, techniques for increasing the simulation accuracy, sensors for turbulence modelling, methods for handling uncertainties of the geometry and grid deformation as well as stochastic methods for quantifying aleatoric uncertainties.

  14. Calibration Uncertainties in the Droplet Measurement Technologies Cloud Condensation Nuclei Counter

    Science.gov (United States)

    Hibert, Kurt James

    Cloud condensation nuclei (CCN) serve as the nucleation sites for the condensation of water vapor in Earth's atmosphere and are important for their effect on climate and weather. The influence of CCN on cloud radiative properties (aerosol indirect effect) is the most uncertain of quantified radiative forcing changes that have occurred since pre-industrial times. CCN influence the weather because intrinsic and extrinsic aerosol properties affect cloud formation and precipitation development. To quantify these effects, it is necessary to accurately measure CCN, which requires accurate calibrations using a consistent methodology. Furthermore, the calibration uncertainties are required to compare measurements from different field projects. CCN uncertainties also aid the integration of CCN measurements with atmospheric models. The commercially available Droplet Measurement Technologies (DMT) CCN Counter is used by many research groups, so it is important to quantify its calibration uncertainty. Uncertainties in the calibration of the DMT CCN counter exist in the flow rate and supersaturation values. The concentration depends on the accuracy of the flow rate calibration, which does not have a large (4.3 %) uncertainty. The supersaturation depends on chamber pressure, temperature, and flow rate. The supersaturation calibration is a complex process since the chamber's supersaturation must be inferred from a temperature difference measurement. Additionally, calibration errors can result from the Kohler theory assumptions, fitting methods utilized, the influence of multiply-charged particles, and calibration points used. In order to determine the calibration uncertainties and the pressure dependence of the supersaturation calibration, three calibrations are done at each pressure level: 700, 840, and 980 hPa. Typically 700 hPa is the pressure used for aircraft measurements in the boundary layer, 840 hPa is the calibration pressure at DMT in Boulder, CO, and 980 hPa is the

  15. Quantification of uncertainty in gamma spectrometric analysis of food and environmental samples

    International Nuclear Information System (INIS)

    Yii Mei Wo; Zaharudin Ahmad; Norfaizal Mohamed

    2005-01-01

    Gamma spectrometry is widely used to determine the activity of gamma-ray-emitting radionuclides in a sample. The reported activity should not be a single value alone; it should be accompanied by a reasonable uncertainty estimate, since radionuclide disintegration is a random process. This paper focuses on how the uncertainty was estimated, quantified and calculated when measuring the activity of Cs-134 and Cs-137 in food samples and of Ra-226, Ra-228 and K-40 in environmental samples. (Author)
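
    As an illustration of how such an uncertainty budget is typically assembled (the specific components and magnitudes below are assumptions, not values from this paper), the combined relative uncertainty of an activity result can be formed as the root sum of squares of the input uncertainties:

        import numpy as np

        # Illustrative inputs for a single gamma line (e.g. Cs-137 at 661.7 keV).
        net_counts  = 1.2e4        # background-subtracted peak area
        live_time   = 60_000.0     # s
        efficiency  = 0.035        # full-energy peak efficiency
        emission_p  = 0.851        # gamma emission probability
        sample_mass = 0.500        # kg

        activity = net_counts / (efficiency * emission_p * live_time * sample_mass)

        # Relative standard uncertainties of each input (illustrative values).
        u_rel = {
            "counting statistics":         np.sqrt(net_counts) / net_counts,
            "efficiency calibration":      0.04,
            "emission probability":        0.002,
            "sample mass":                 0.001,
            "geometry/density correction": 0.02,
        }

        u_combined = np.sqrt(sum(v**2 for v in u_rel.values()))
        print(f"activity = {activity:.2f} Bq/kg "
              f"+/- {activity * u_combined:.2f} Bq/kg (k=1)")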

  16. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Blaylock, Myra L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hewson, John C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kumar, Pritvi Raj [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ling, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ruiz, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stewart, Alessia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wagner, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost-versus-accuracy curve for LES such that the cost can be minimized for an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.
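
    A minimal sketch of the calibrate-then-propagate workflow, using a scalar coefficient, a cheap surrogate in place of the LES quantity of interest, synthetic reference data, and a grid-based Bayesian update; all of these stand-ins are assumptions for illustration and do not reflect the actual Sandia methodology.

        import numpy as np

        rng = np.random.default_rng(1)

        def qoi(c):                      # cheap surrogate for the simulated quantity of interest
            return 2.0 * c + 0.5 * c**2

        data = qoi(0.17) + rng.normal(0.0, 0.02, size=5)   # synthetic "reference" values

        # Grid-based Bayesian calibration with a flat prior on c in [0.05, 0.30].
        c_grid = np.linspace(0.05, 0.30, 501)
        log_like = np.array([-0.5 * np.sum((data - qoi(c))**2) / 0.02**2
                             for c in c_grid])
        post = np.exp(log_like - log_like.max())
        post /= post.sum()                                 # discrete posterior weights

        # Forward propagation: push posterior samples of c through the model.
        samples = rng.choice(c_grid, size=10_000, p=post)
        pred = qoi(samples)
        print("calibrated QoI: mean %.3f, std %.3f" % (pred.mean(), pred.std()))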

  17. Offering Strategy of a Flexibility Aggregator in a Balancing Market Using Asymmetric Block Offers

    DEFF Research Database (Denmark)

    Bobo, Lucien Ali; Delikaraoglou, Stefanos; Vespermann, Niklas

    2018-01-01

    In order to enable large-scale penetration of renewables with variable generation, new sources of flexibility have to be exploited in the power systems. Allowing asymmetric block offers (including response and rebound blocks) in balancing markets can facilitate the participation of flexibility aggregators and unlock load-shifting flexibility from, e.g., thermostatic loads. In this paper, we formulate an optimal offering strategy for a risk-averse flexibility aggregator participating in such a market. Using a price-taker approach, load flexibility characteristics and balancing market price forecast scenarios are used to find optimal load-shifting offers under uncertainty. The problem is formulated as a stochastic mixed-integer linear program and can be solved with reasonable computational time. This work is taking place in the framework of the real-life demonstration project EcoGrid 2.0, which ...
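
    A toy sketch of the scenario-based offering idea, written with the PuLP modelling library: one asymmetric block (a response volume that must later be bought back as a rebound), two balancing-price scenarios, and a risk-neutral expected-profit objective in place of the paper's risk-averse formulation. The market structure, prices, and volumes are all illustrative assumptions, not the EcoGrid 2.0 model.

        import pulp

        # Two balancing-price scenarios with equal probability (EUR/MWh, assumed).
        scenarios = {"low":  {"prob": 0.5, "price_resp": 20.0, "price_reb": 35.0},
                     "high": {"prob": 0.5, "price_resp": 60.0, "price_reb": 30.0}}

        m = pulp.LpProblem("block_offer", pulp.LpMaximize)
        offer = pulp.LpVariable("offer_block", cat="Binary")          # submit the block or not
        volume = pulp.LpVariable("volume_MWh", lowBound=0, upBound=2.0)

        # The response earns the balancing price; the rebound energy is bought back.
        profit = pulp.lpSum(s["prob"] * (s["price_resp"] - s["price_reb"]) * volume
                            for s in scenarios.values())
        m += profit                                                    # expected profit objective
        m += volume <= 2.0 * offer                                     # no volume without the block

        m.solve(pulp.PULP_CBC_CMD(msg=False))
        print(pulp.LpStatus[m.status], pulp.value(volume), pulp.value(profit))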

  18. Emotion and decision-making under uncertainty: Physiological arousal predicts increased gambling during ambiguity but not risk.

    Science.gov (United States)

    FeldmanHall, Oriel; Glimcher, Paul; Baker, Augustus L; Phelps, Elizabeth A

    2016-10-01

    Uncertainty, which is ubiquitous in decision-making, can be fractionated into known probabilities (risk) and unknown probabilities (ambiguity). Although research has illustrated that individuals more often avoid decisions associated with ambiguity compared to risk, it remains unclear why ambiguity is perceived as more aversive. Here we examine the role of arousal in shaping the representation of value and subsequent choice under risky and ambiguous decisions. To investigate the relationship between arousal and decisions of uncertainty, we measure skin conductance response, a quantifiable measure reflecting sympathetic nervous system arousal, during choices to gamble under risk and ambiguity. To quantify the discrete influences of risk and ambiguity sensitivity and the subjective value of each option under consideration, we model fluctuating uncertainty, as well as the amount of money that can be gained by taking the gamble. Results reveal that although arousal tracks the subjective value of a lottery regardless of uncertainty type, arousal differentially contributes to the computation of value (that is, choice) depending on whether the uncertainty is risky or ambiguous: Enhanced arousal adaptively decreases risk-taking only when the lottery is highly risky but increases risk-taking when the probability of winning is ambiguous (even after controlling for subjective value). Together, this suggests that the role of arousal during decisions of uncertainty is modulatory and highly dependent on the context in which the decision is framed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
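
    One common way to formalise the subjective value of such lotteries (a parameterisation assumed here for illustration; the paper's own model may differ) discounts the winning probability by the ambiguity level, applies a power utility for risk attitude, and maps value differences to choice with a logistic rule:

        import numpy as np

        def subjective_value(amount, win_prob, ambiguity, alpha, beta):
            """Subjective value of a lottery with a partly unknown win probability.

            ambiguity in [0, 1] shrinks the effective probability around its midpoint,
            alpha captures risk attitude, beta captures ambiguity attitude (assumed form).
            """
            p_eff = win_prob - beta * ambiguity / 2.0
            return p_eff * amount**alpha

        def p_gamble(sv_gamble, sv_sure, temperature=1.0):
            """Softmax (logistic) probability of choosing the gamble over the sure amount."""
            return 1.0 / (1.0 + np.exp(-(sv_gamble - sv_sure) / temperature))

        # Example: a $20 gamble versus a sure $5, once risky (known p = 0.5)
        # and once fully ambiguous (unknown p).
        sv_sure = subjective_value(5.0, 1.0, 0.0, alpha=0.8, beta=0.6)
        for label, p, A in [("risky", 0.5, 0.0), ("ambiguous", 0.5, 1.0)]:
            sv = subjective_value(20.0, p, A, alpha=0.8, beta=0.6)
            print(label, round(p_gamble(sv, sv_sure), 3))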

  19. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties.
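
    The "uncertainty of an uncertainty" can itself be quantified. The sketch below shows the confidence interval on a standard deviation estimated from only a few replicates, then combines the random component with an assumed calibration bias limit; the replicate values and the bias limit are illustrative assumptions, not data from this report.

        import numpy as np
        from scipy import stats

        replicates = np.array([10.12, 10.05, 10.21, 10.08, 10.15])   # e.g. mg of analyte
        n = len(replicates)
        s = replicates.std(ddof=1)

        # 95 % confidence interval for the true standard deviation (chi-square).
        lo = s * np.sqrt((n - 1) / stats.chi2.ppf(0.975, n - 1))
        hi = s * np.sqrt((n - 1) / stats.chi2.ppf(0.025, n - 1))
        print(f"s = {s:.3f}, 95 % CI [{lo:.3f}, {hi:.3f}]")

        # Combine the random component of the mean with an assumed 0.05 calibration
        # bias limit treated as a rectangular distribution (GUM-style).
        u_bias = 0.05 / np.sqrt(3)
        u_combined = np.sqrt((s / np.sqrt(n))**2 + u_bias**2)
        print(f"combined standard uncertainty of the mean: {u_combined:.3f}")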

  20. Influence of Aggregate Wettability with Different Lithology Aggregates on Concrete Drying Shrinkage

    Directory of Open Access Journals (Sweden)

    Yuanchen Guo

    2015-01-01

    The correlation between the wettability of aggregates of different lithologies and the drying shrinkage of concrete is studied, and influential factors such as wettability and wetting angle are analyzed. A mercury porosimeter is used to measure the porosities of the different lithology aggregates accurately, and the pore size ranges that significantly affect the drying shrinkage of the corresponding concretes are identified. The pore distribution curves of the different coarse aggregates are also measured through a statistical method, and the contact angle between the different coarse aggregates and concrete is calculated from a linear fitting relationship. The research shows that concrete strength is determined by aggregate strength. Aggregate wettability is not directly correlated with concrete strength, but wettability significantly affects concrete drying shrinkage. Among all pore types, capillary pores and gel pores have the greatest impact on wettability, especially pores in the 2.5–50 nm and 50–100 nm size ranges.
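
    The record does not specify which linear relationship was fitted to obtain the contact angle; one widely used option is the Washburn capillary-rise method, sketched below with wholly illustrative numbers (CGS units) and an assumed packing constant from a reference liquid.

        import numpy as np

        # Hypothetical absorption data: time (s) and mass of water taken up (g).
        t = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
        m = np.array([0.148, 0.210, 0.296, 0.415, 0.589])

        # Washburn: m^2 = (C * rho^2 * gamma * cos(theta) / eta) * t -> fit the slope.
        slope = np.polyfit(t, m**2, 1)[0]                     # g^2 / s

        C     = 1.0e-6   # cm^5, packing constant from a perfectly wetting reference liquid
        rho   = 0.998    # g/cm^3, water density
        gamma = 72.8     # dyn/cm, water surface tension
        eta   = 0.01     # poise, water viscosity

        cos_theta = np.clip(slope * eta / (C * rho**2 * gamma), -1.0, 1.0)
        print(f"estimated contact angle: {np.degrees(np.arccos(cos_theta)):.1f} deg")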