WorldWideScience

Sample records for greatly reduce uncertainties

  1. Decision-making under great uncertainty

    International Nuclear Information System (INIS)

    Hansson, S.O.

    1992-01-01

    Five types of decision uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  2. Quantifying uncertainties of seismic Bayesian inversion of the Northern Great Plains

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Elastic waves excited by earthquakes are fundamental observations in seismology. Seismologists measure information such as travel time, amplitude, and polarization to infer the properties of earthquake sources, seismic wave propagation, and subsurface structure. Across numerous applications, seismic imaging has been able to take advantage of complementary seismic observables to constrain profiles and lateral variations of Earth's elastic properties. Moreover, seismic imaging plays a unique role in multidisciplinary studies of geoscience by providing direct constraints on the unreachable interior of the Earth. Accurate quantification of the uncertainties of inferences made from seismic observations is of paramount importance for interpreting seismic images and testing geological hypotheses. However, such quantification remains challenging and subjective due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we apply a reversible jump Markov chain Monte Carlo (rjMcMC) algorithm for a transdimensional Bayesian inversion of continental lithosphere structure. Such inversion allows us to quantify the uncertainties of inversion results by inverting for an ensemble solution. It also yields an adaptive parameterization that enables simultaneous inversion of different elastic properties without imposing strong prior information on the relationship between them. We present retrieved profiles of shear velocity (Vs) and radial anisotropy in the Northern Great Plains using measurements from USArray stations. We use both seismic surface wave dispersion and receiver function data due to their complementary constraints on lithosphere structure. Furthermore, we analyze the uncertainties of both individual and joint inversion of those two data types to quantify the benefit of joint inversion. As an application, we infer the variation of Moho depths and crustal layering across the northern Great Plains.
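
    A minimal sketch of the transdimensional (reversible jump) McMC machinery described above, applied to a toy problem: recovering a piecewise-constant velocity profile with an unknown number of layers. Priors, proposal moves, and the simplified acceptance rule are illustrative assumptions, not the study's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
z = np.linspace(0.0, 100.0, 200)               # depth axis (km)
true_vs = np.where(z < 40.0, 3.5, 4.6)         # two-layer "truth" (km/s)
sigma = 0.1                                    # data noise (km/s)
data = true_vs + rng.normal(0.0, sigma, z.size)

def forward(bounds, vels):
    """Piecewise-constant profile; vels has len(bounds)+1 entries."""
    return np.asarray(vels)[np.searchsorted(np.sort(bounds), z)]

def loglike(bounds, vels):
    r = data - forward(bounds, vels)
    return -0.5 * np.sum(r * r) / sigma**2

bounds, vels = [50.0], [4.0, 4.0]              # start from one interface
ll = loglike(bounds, vels)
n_interfaces = []
for _ in range(20000):
    move = rng.choice(["perturb", "birth", "death"])
    nb, nv = list(bounds), list(vels)
    if move == "perturb":                      # nudge one layer velocity
        nv[rng.integers(len(nv))] += rng.normal(0.0, 0.1)
    elif move == "birth":                      # add an interface; velocity
        nb.append(rng.uniform(0.0, 100.0))     # is drawn from the prior
        nv.insert(rng.integers(len(nv) + 1), rng.uniform(3.0, 5.0))
    elif move == "death" and len(nb) > 0:      # remove a random interface
        nb.pop(rng.integers(len(nb)))
        nv.pop(rng.integers(len(nv)))
    new_ll = loglike(nb, nv)
    # With prior-draw birth proposals and uniform priors the acceptance
    # ratio approximately collapses to the likelihood ratio; the full
    # rjMcMC expression adds move-probability and dimension terms.
    if np.log(rng.uniform()) < new_ll - ll:
        bounds, vels, ll = nb, nv, new_ll
    n_interfaces.append(len(bounds))

print("posterior mean number of interfaces:",
      np.mean(n_interfaces[5000:]))            # ensemble quantifies uncertainty
```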

  3. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.
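
    A short sketch of the premise above, that the worth of a candidate measurement can be scored by how much it reduces the variance of a key prediction. Under a linear-Gaussian assumption each observation updates the parameter covariance with a rank-one correction, and the prediction variance falls accordingly; all sensitivities and covariances below are toy values, chosen so the concentration-like gauge aligns more closely with the prediction.

```python
import numpy as np

C_p = np.diag([1.0, 0.5, 2.0])            # prior parameter covariance
y = np.array([0.8, -0.2, 1.5])            # prediction sensitivity to parameters
noise_var = 0.05                          # measurement-error variance
candidates = {                            # candidate observation sensitivities
    "temperature gauge":   np.array([0.1, 0.0, 0.3]),
    "concentration gauge": np.array([0.7, -0.1, 1.2]),
}

prior_var = y @ C_p @ y
for name, j in candidates.items():
    # Rank-one Bayesian (Kalman) update of the parameter covariance
    gain = C_p @ j / (j @ C_p @ j + noise_var)
    C_post = C_p - np.outer(gain, j @ C_p)
    post_var = y @ C_post @ y
    print(f"{name}: prediction variance {prior_var:.2f} -> {post_var:.2f} "
          f"({100 * (1 - post_var / prior_var):.0f}% reduction)")
```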

  4. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    Science.gov (United States)

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.
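
    A hedged sketch of the distinction drawn above, assuming a simple Gaussian belief model rather than the paper's fitted model: a diagnostic signal shifts the value estimate via the usual Bayesian gain, while an equally surprising but non-diagnostic signal (irreducible uncertainty) leaves beliefs untouched.

```python
import numpy as np

def update(mu, var, signal, signal_var, diagnostic):
    """Gaussian value update. A non-diagnostic signal carries no
    information about the latent value, so however surprising it is,
    the posterior equals the prior (irreducible uncertainty)."""
    if not diagnostic:
        return mu, var
    k = var / (var + signal_var)          # Bayesian (Kalman) gain
    return mu + k * (signal - mu), (1.0 - k) * var

mu, var = 0.0, 1.0
print(update(mu, var, signal=3.0, signal_var=0.5, diagnostic=True))
print(update(mu, var, signal=3.0, signal_var=0.5, diagnostic=False))
# Same expectancy violation in both calls; only the first moves beliefs.
```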

  5. Reducing prediction uncertainty of weather controlled systems

    NARCIS (Netherlands)

    Doeswijk, T.G.

    2007-01-01

    In closed agricultural systems the weather acts both as a disturbance and as a resource. By using weather forecasts in control strategies the effects of disturbances can be minimized whereas the resources can be utilized. In this situation weather forecast uncertainty and model based control are

  6. Reducing Reliability Uncertainties for Marine Renewable Energy

    Directory of Open Access Journals (Sweden)

    Sam D. Weller

    2015-11-01

    Technology Readiness Levels (TRLs) are a widely used metric of technology maturity and risk for marine renewable energy (MRE) devices. To date, a large number of device concepts have been proposed which have reached the early validation stages of development (TRLs 1–3). Only a handful of mature designs have attained pre-commercial development status following prototype sea trials (TRLs 7–8). In order to navigate through the aptly named “valley of death” (TRLs 4–6) towards commercial realisation, it is necessary for new technologies to be de-risked in terms of component durability and reliability. In this paper the scope of the reliability assessment module of the DTOcean Design Tool is outlined, including aspects of Tool integration, data provision, and how prediction uncertainties are accounted for. In addition, two case studies of mooring component fatigue testing are reported, providing insight into long-term component use and system design for MRE devices. The case studies are used to highlight how test data could be utilised to improve the prediction capabilities of statistical reliability assessment approaches, such as the bottom-up statistical method.

  7. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
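
    As a rough illustration of the window-selection idea above: fit straight lines to successively larger windows of I-V points anchored at V = 0 and keep the largest window whose residuals remain consistent with the measurement noise, i.e., with negligible model discrepancy. A reduced chi-square screen stands in here for the paper's objective Bayesian evidence computation, and the synthetic I-V curve, noise level, and threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
noise = 0.005                                     # assumed 1-sigma noise (A)
v = np.linspace(0.0, 0.6, 121)                    # voltage grid (V)
i_true = 5.0 - 0.2 * v - 4e-3 * np.exp(v / 0.06)  # toy diode-like I-V curve
i_meas = i_true + rng.normal(0.0, noise, v.size)

fits = []
for n in range(5, 60):                            # windows anchored at V = 0
    V, I = v[:n], i_meas[:n]
    A = np.vstack([np.ones(n), V]).T
    coef, *_ = np.linalg.lstsq(A, I, rcond=None)
    red_chi2 = np.sum((I - A @ coef) ** 2) / (noise**2 * (n - 2))
    fits.append((n, red_chi2, coef))

# Largest window whose linear fit is still consistent with noise alone,
# i.e. shows no evident model discrepancy; fall back to the smallest.
passing = [f for f in fits if f[1] < 1.5] or fits[:1]
n, chi2, coef = passing[-1]
print(f"chosen window: {n} points (reduced chi-square {chi2:.2f})")
print(f"Isc estimate (intercept at V = 0): {coef[0]:.4f} A")
```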

  8. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  9. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

    Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of selective-to-total extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
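
    A hedged sketch of the Monte Carlo propagation strategy described above: each measured line flux is resampled from its uncertainty distribution and the derived quantity is recomputed for every draw, so the output distribution carries the propagated uncertainty. The "abundance" function and all fluxes below are toy stand-ins, not real nebular diagnostics.

```python
import numpy as np

rng = np.random.default_rng(2)
flux = {"line_a": 1.00, "line_b": 0.05}   # measured fluxes (arbitrary units)
err = {"line_a": 0.02, "line_b": 0.01}    # 1-sigma flux uncertainties

def abundance(fa, fb):
    return 12.0 + np.log10(fb / fa)       # toy diagnostic, not a real one

draws = [abundance(rng.normal(flux["line_a"], err["line_a"]),
                   rng.normal(flux["line_b"], err["line_b"]))
         for _ in range(10000)]
lo, med, hi = np.percentile(draws, [16, 50, 84])
print(f"abundance = {med:.2f} (+{hi - med:.2f} / -{med - lo:.2f})")
# Asymmetric error bars fall out naturally; for weak lines the same
# machinery exposes the upward bias discussed in the abstract.
```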

  10. Reducing the top quark mass uncertainty with jet grooming

    Science.gov (United States)

    Andreassen, Anders; Schwartz, Matthew D.

    2017-10-01

    The measurement of the top quark mass has large systematic uncertainties coming from the Monte Carlo simulations that are used to match theory and experiment. We explore how much that uncertainty can be reduced by using jet grooming procedures. Using the ATLAS A14 tunes of Pythia, we estimate the uncertainty from the choice of tuning parameters in what is meant by the Monte Carlo mass to be around 530 MeV without any corrections. This uncertainty can be reduced by 60% to 200 MeV by calibrating to the W mass and by 70% to 140 MeV by additionally applying soft-drop jet grooming (or to 170 MeV using trimming). At e+e− colliders, the associated uncertainty is around 110 MeV, reducing to 50 MeV after calibrating to the W mass. By analyzing the tuning parameters, we conclude that the importance of jet grooming after calibrating to the W mass is to reduce sensitivity to the underlying event.

  11. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
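
    A minimal sketch of the SROM idea under stated assumptions: the uncertain input is replaced by a small weighted sample set, and the deterministic model is evaluated once per sample instead of once per Monte Carlo draw. Here sample locations sit at input quantiles and weights come from a least-squares moment match; real SROMs solve a constrained optimization over both locations and weights, including CDF error terms. The model() function is an illustrative stand-in.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(s):
    """Deterministic 'expensive' model: compliance of a design whose
    stiffness is scaled by the uncertain factor s (illustrative)."""
    return 1.0 / s

m = 7
big = rng.normal(1.0, 0.1, 100_000)              # cheap draws of the input only
x = np.quantile(big, (np.arange(m) + 0.5) / m)   # SROM sample locations

# Weights: least-squares match to the first three raw moments, projected
# back onto a probability vector (a simplification of the real SROM fit).
M = np.vstack([x**k for k in range(4)])
t = np.array([np.mean(big**k) for k in range(4)])
p, *_ = np.linalg.lstsq(M, t, rcond=None)
p = np.clip(p, 0.0, None)
p /= p.sum()

vals = np.array([model(xi) for xi in x])         # 7 model calls, not 100000
print("SROM mean compliance:", vals @ p)
print("Monte Carlo check:   ", np.mean(1.0 / big))
```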

  12. Using prediction uncertainty analysis to design hydrologic monitoring networks: Example applications from the Great Lakes water availability pilot project

    Science.gov (United States)

    Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.

    2010-01-01

    The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with additional information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model is described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also, importantly, when designing the prediction-uncertainty paradigm.

  13. Attribute amnesia is greatly reduced with novel stimuli

    Directory of Open Access Journals (Sweden)

    Weijia Chen

    2017-11-01

    Attribute amnesia is the counterintuitive phenomenon where observers are unable to report a salient aspect of a stimulus (e.g., its colour or its identity) immediately after the stimulus was presented, despite both attending to and processing the stimulus. Almost all previous attribute amnesia studies used highly familiar stimuli. Our study investigated whether attribute amnesia would also occur for unfamiliar stimuli. We conducted four experiments using stimuli that were highly familiar (colours or repeated animal images) or that were unfamiliar to the observers (unique animal images). Our results revealed that attribute amnesia was present for both sets of familiar stimuli, colour (p < .001) and repeated animals (p = .001), but was greatly attenuated, and possibly eliminated, when the stimuli were unique animals (p = .02). Our data show that attribute amnesia is greatly reduced for novel stimuli.

  14. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
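
    A toy sketch of the agent-based approach described above, assuming a deliberately simple decision rule: each vessel compares expected revenue against its own daily cost, and the fishery closes when a season quota binds. All parameters, the quota lever, and the rule itself are illustrative, not the study's calibrated model.

```python
import numpy as np

rng = np.random.default_rng(4)
n_vessels, days = 100, 200
cost = rng.uniform(500.0, 1500.0, n_vessels)   # heterogeneous daily costs ($)
skill = rng.uniform(0.8, 1.2, n_vessels)       # catch-efficiency multiplier
price, base_catch = 2.0, 800.0                 # $/kg and kg/day
quota = 6_000_000.0                            # season catch limit (policy lever)

landed, effort = 0.0, []
for day in range(days):
    if landed >= quota:                        # fishery closes when quota binds
        break
    goes_out = price * base_catch * skill > cost   # myopic daily decision
    catch = (base_catch * skill[goes_out]
             * rng.lognormal(0.0, 0.3, int(goes_out.sum())))
    landed += catch.sum()
    effort.append(int(goes_out.sum()))

print(f"season length: {len(effort)} days, landings: {landed / 1e6:.2f} kt")
print(f"mean active vessels per day: {np.mean(effort):.0f}")
```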

  15. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  16. The fertility response to the Great Recession in Europe and the United States: Structural economic conditions and perceived economic uncertainty

    Directory of Open Access Journals (Sweden)

    Chiara Ludovica Comolli

    2017-05-01

    Background: This study further develops Goldstein et al.'s (2013) analysis of the fertility response to the Great Recession in western economies. Objective: The purpose of this paper is to shed light on the fertility reaction to different indicators of the crisis. Beyond the structural labor market conditions, I investigate the dependence of fertility rates on economic policy uncertainty, government financial risk, and consumer confidence. Methods: Following Goldstein et al. (2013), I use log-log models to assess the elasticity of age-, parity-, and education-specific fertility rates to an array of indicators. Besides the inclusion of a wider set of explanatory variables, I include more recent data (2000−2013) and I enlarge the sample to 31 European countries plus the United States. Results: The fertility response to unemployment in some age- and parity-specific groups has been, in more recent years, larger than estimated by Goldstein et al. (2013). Female unemployment has also significantly reduced fertility rates. Among uncertainty measures, the drop in consumer confidence is strongly related to fertility decline, and in Southern European countries the fertility response to sovereign debt risk is comparable to that of unemployment. Economic policy uncertainty is negatively related to the TFR even when controlling for unemployment. Conclusions: Theoretical and empirical investigation is needed to develop more tailored measures of economic and financial insecurity and their impact on birth rates. Contribution: The study shows the nonnegligible influence of economic and financial uncertainty on birth rates during the Great Recession in Western economies, over and above that of structural labor market conditions.
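
    A minimal sketch of the log-log specification named above: the slope of log fertility on log unemployment is the elasticity. The data below are synthetic, generated with a true elasticity of -0.05 purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 120                                            # country-year observations
unemployment = rng.uniform(4.0, 25.0, n)           # rate in percent
tfr = 1.6 * unemployment ** -0.05 * rng.lognormal(0.0, 0.02, n)

X = np.vstack([np.ones(n), np.log(unemployment)]).T
beta, *_ = np.linalg.lstsq(X, np.log(tfr), rcond=None)
print(f"estimated elasticity: {beta[1]:.3f} "
      "(a 1% rise in unemployment changes the TFR by about this percent)")
```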

  17. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Science.gov (United States)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget where we try to identify sources of uncertainty for major terms in the global C budget. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2σ ≈ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC. [Figure: Uncertainties in fossil fuel emissions over the last 50 years.]

  18. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    The paper addresses the problem of uncertainty reduction in the estimation of future company performance, a consequence of the wide range of probable efficiencies of an enterprise's intangible assets. To reduce this uncertainty, the paper suggests using quantum economy principles, i.e., implementing the Heisenberg principle to measure the efficiency and potential of a company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a given time point. To support this thesis, data on resource potential and efficiency from mid-Russian companies were evaluated with a deterministic approach, which did not allow the probability of achieving a certain resource efficiency to be evaluated, and with a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. Visualization of both approaches was performed by means of LabVIEW software. It was shown that a deterministic approach should be used for estimating the performance of tangible assets, while for intangible assets the quantum approach yields better predictions of future performance. On the basis of these findings, a holistic approach towards estimating company resource efficiency is proposed in order to reduce uncertainty in modelling company performance.

  19. Reducing uncertainties in volumetric image based deformable organ registration

    International Nuclear Information System (INIS)

    Liang, J.; Yan, D.

    2003-01-01

    Applying volumetric image feedback in radiotherapy requires image based deformable organ registration. The foundation of this registration is the ability of tracking subvolume displacement in organs of interest. Subvolume displacement can be calculated by applying a biomechanics model and the finite element method to human organs manifested on the multiple volumetric images. The calculation accuracy, however, is highly dependent on the determination of the corresponding organ boundary points. Lacking sufficient information for such determination, uncertainties are inevitable, thus diminishing the registration accuracy. In this paper, a method of consumed-energy minimization was developed to reduce these uncertainties. Starting from an initial selection of organ boundary point correspondence on volumetric image sets, the subvolume displacement and stress distribution of the whole organ are calculated, and the energy consumed by the subvolume displacements is computed accordingly. The corresponding positions of the initially selected boundary points are then iteratively optimized to minimize the consumed energy under geometry and stress constraints. In this study, a rectal wall delineated from a patient CT image was artificially deformed using a computer simulation and utilized to test the optimization. Subvolume displacements calculated based on the optimized boundary point correspondence were compared to the true displacements, and the calculation accuracy was thereby evaluated. Results demonstrate that a significant improvement in the accuracy of the deformable organ registration can be achieved by applying the consumed-energy minimization in the organ deformation calculation

  20. Reducing the uncertainty in robotic machining by modal analysis

    Science.gov (United States)

    Alberdi, Iñigo; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2017-10-01

    The use of industrial robots for machining could lead to large cost and energy savings for the manufacturing industry. Machining robots offer several advantages with respect to CNC machines, such as flexibility, a wide working space, adaptability, and relatively low cost. However, some drawbacks are preventing a widespread adoption of robotic solutions, namely lower stiffness, vibration/chatter problems, and lower accuracy and repeatability. Because of these issues, conservative cutting parameters are normally chosen, resulting in a low material removal rate (MRR). In this article, an example of a modal analysis of a robot is presented. For that purpose the tap-testing technology is introduced, which aims at maximizing productivity, reducing the uncertainty in the selection of cutting parameters, and offering a stable process free from chatter vibrations.

  1. Mapping carbon flux uncertainty and selecting optimal locations for future flux towers in the Great Plains

    Science.gov (United States)

    Gu, Yingxin; Howard, Daniel M.; Wylie, Bruce K.; Zhang, Li

    2012-01-01

    Flux tower networks (e.g., AmeriFlux, Agriflux) provide continuous observations of ecosystem exchanges of carbon (e.g., net ecosystem exchange), water vapor (e.g., evapotranspiration), and energy between terrestrial ecosystems and the atmosphere. The long-term time series of flux tower data are essential for studying and understanding terrestrial carbon cycles, ecosystem services, and climate changes. Currently, there are 13 flux towers located within the Great Plains (GP). The towers are sparsely distributed and do not adequately represent the varieties of vegetation cover types, climate conditions, and geophysical and biophysical conditions in the GP. This study assessed how well the available flux towers represent the environmental conditions or "ecological envelopes" across the GP and identified optimal locations for future flux towers in the GP. Regression-based remote sensing and weather-driven net ecosystem production (NEP) models derived from different extrapolation ranges (10 and 50%) were used to identify areas where ecological conditions were poorly represented by the flux tower sites and years previously used for mapping grassland fluxes. The optimal lands suitable for future flux towers within the GP were mapped. Results from this study provide information to optimize the usefulness of future flux towers in the GP and serve as a proxy for the uncertainty of the NEP map.

  2. How incorporating more data reduces uncertainty in recovery predictions

    Energy Technology Data Exchange (ETDEWEB)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  3. Has the great recession and its aftermath reduced traffic fatalities?

    Science.gov (United States)

    Noland, Robert B; Zhou, Yuhan

    2017-01-01

    An analysis of state-level data from 1984 to 2014 provides evidence on the relationship between economic recessions and US traffic fatalities. While there are large reductions associated with decreases in household median income, other policy variables tend to have additional and in some cases larger effects. An increase in the inequality of the income distribution, measured by the Gini index, has reduced traffic fatalities. Graduated licensing policies, cell phone laws, and motorcycle helmet requirements are all associated with reductions in fatalities. Other factors include a proxy for medical technology, and access to emergency medical services (based on the percent of vehicle miles traveled in rural areas); reductions in the latter accounted for a substantial reduction in fatalities and are likely another indicator of reduced economic activity. Changes in the road network, mainly increases in the percentage of collector roads, have increased fatalities. Population growth is associated with increased traffic fatalities, and changes in age cohorts have a small negative effect. Overall, results suggest that there has been a beneficial impact on traffic fatalities from reduced economic activity, but various policies adopted by the states have also reduced traffic fatalities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. MRS role in reducing technical uncertainties in geological disposal

    International Nuclear Information System (INIS)

    Ramspott, L.D.

    1990-06-01

    A high-level nuclear waste repository has inherent technical uncertainty due to its first-of-a-kind nature and the unprecedented time over which it must function. Three possible technical modifications to the currently planned US high-level nuclear waste system are reviewed in this paper. These modifications would be facilitated by inclusion of a monitored retrievable storage (MRS) in the system. The modifications are (1) an underground MRS at Yucca Mountain, (2) a phased repository, and (3) a "cold" repository. These modifications are intended to enhance scientific confidence that a repository system would function satisfactorily despite technical uncertainty. 12 refs

  5. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  6. Combining observations and models to reduce uncertainty in the cloud response to global warming

    Science.gov (United States)

    Norris, J. R.; Myers, T.; Chellappan, S.

    2017-12-01

    Currently there is large uncertainty on how subtropical low-level clouds will respond to global warming and whether they will act as a positive feedback or negative feedback. Global climate models substantially agree on what changes in atmospheric structure and circulation will occur with global warming but greatly disagree over how clouds will respond to these changes in structure and circulation. An examination of models with the most realistic simulations of low-level cloudiness indicates that the model cloud response to atmospheric changes associated with global warming is quantitatively similar to the model cloud response to atmospheric changes at interannual time scales. For these models, the cloud response to global warming predicted by multilinear regression using coefficients derived from interannual time scales is quantitatively similar to the cloud response to global warming directly simulated by the model. Since there is a large spread among cloud response coefficients even among models with the most realistic cloud simulations, substitution of coefficients derived from satellite observations reduces the uncertainty range of the low-level cloud feedback. Increased sea surface temperature associated with global warming acts to reduce low-level cloudiness, which is partially offset by increased lower tropospheric stratification that acts to enhance low-level cloudiness. Changes in free-tropospheric relative humidity, subsidence, and horizontal advection have only a small impact on low-level cloud. The net reduction in subtropical low-level cloudiness increases absorption of solar radiation by the climate system, thus resulting in a weak positive feedback.
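
    A hedged sketch of the observational-constraint idea above: regression coefficients linking low-cloud anomalies to SST and stability at interannual time scales are estimated, then applied to projected changes in the same predictors; substituting observed coefficients in place of a model's is what narrows the feedback range. All series and numbers are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300                                           # months of "observations"
sst = rng.normal(0.0, 0.3, n)                     # SST anomaly (K)
eis = 0.2 * sst + rng.normal(0.0, 0.2, n)         # stability anomaly (K)
cloud = -4.0 * sst + 2.0 * eis + rng.normal(0.0, 1.0, n)   # low cloud (%)

X = np.column_stack([sst, eis])
coef, *_ = np.linalg.lstsq(X, cloud, rcond=None)  # interannual sensitivities

d_sst, d_eis = 3.0, 1.5                           # projected changes, warming
print(f"sensitivities: {coef[0]:.2f} %/K (SST), {coef[1]:.2f} %/K (EIS)")
print(f"implied low-cloud change: {coef @ np.array([d_sst, d_eis]):.1f} %")
# A net cloud reduction means more absorbed sunlight: a positive feedback.
```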

  7. One Strategy for Reducing Uncertainty in Climate Change Communications

    Science.gov (United States)

    Romm, J.

    2011-12-01

    Future impacts of climate change are invariably presented with a very wide range of impacts reflecting two different sets of uncertainties. The first concerns our uncertainty about precisely how much greenhouse gas humanity will emit into the atmosphere. The second concerns our uncertainty about precisely what impact those emissions will have on the climate. By failing to distinguish between these two types of uncertainties, climate scientists have not clearly explained to the public and policymakers what the scientific literature suggests is likely to happen if we don't substantially alter our current emissions path. Indeed, much of climate communications has been built around describing the range of impacts from emissions paths that are increasingly implausible given political and technological constraints, such as stabilization at 450 or 550 parts per million of atmospheric carbon dioxide. For the past decade, human emissions of greenhouse gases have trended near the worst-case scenarios of the Intergovernmental Panel on Climate Change, emissions paths that reach 800 ppm or even 1000 ppm. The current policies of the two biggest emitters, the United States and China, coupled with the ongoing failure of international negotiations to come to an agreement on restricting emissions, suggest that recent trends will continue for the foreseeable future. This in turn suggests that greater clarity in climate change communications could be achieved by more clearly explaining to the public what the scientific literature suggests the range of impacts is for our current high emissions path. This also suggests that more focus should be given in the scientific literature to better constraining the range of impacts from the high emissions scenarios.

  8. Explaining Delusions: Reducing Uncertainty Through Basic and Computational Neuroscience.

    Science.gov (United States)

    Feeney, Erin J; Groman, Stephanie M; Taylor, Jane R; Corlett, Philip R

    2017-03-01

    Delusions, the fixed false beliefs characteristic of psychotic illness, have long defied understanding despite their response to pharmacological treatments (e.g., D2 receptor antagonists). However, it can be challenging to discern what makes beliefs delusional compared with other unusual or erroneous beliefs. We suggest mapping the putative biology to clinical phenomenology with a cognitive psychology of belief, culminating in a teleological approach to beliefs and brain function supported by animal and computational models. We argue that organisms strive to minimize uncertainty about their future states by forming and maintaining a set of beliefs (about the organism and the world) that are robust, but flexible. If uncertainty is generated endogenously, beliefs begin to depart from consensual reality and can manifest into delusions. Central to this scheme is the notion that formal associative learning theory can provide an explanation for the development and persistence of delusions. Beliefs, in animals and humans, may be associations between representations (e.g., of cause and effect) that are formed by minimizing uncertainty via new learning and attentional allocation. Animal research has equipped us with a deep mechanistic basis of these processes, which is now being applied to delusions. This work offers the exciting possibility of completing revolutions of translation, from the bedside to the bench and back again. The more we learn about animal beliefs, the more we may be able to apply to human beliefs and their aberrations, enabling a deeper mechanistic understanding. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Reducing the uncertainty in the fidelity of seismic imaging results

    Science.gov (United States)

    Zhou, H. W.; Zou, Z.

    2017-12-01

    A key aspect of geoscientific inversion is quantifying the quality of the results. In seismic imaging, we must quantify the uncertainty of every imaging result based on field data, because data noise and methodology limitations may produce artifacts. Detection of artifacts is therefore an important aspect of uncertainty quantification in geoscientific inversion. Quantifying the uncertainty of seismic imaging solutions means assessing their fidelity, which defines the truthfulness of the imaged targets in terms of their resolution, position error, and artifacts. Key challenges to achieving fidelity in seismic imaging include: (1) the difficulty of telling signal from artifact and noise; (2) limitations in signal-to-noise ratio and seismic illumination; and (3) the multi-scale nature of the data space and model space. Most seismic imaging studies of the Earth's crust and mantle have employed inversion or modeling approaches. Though they map in opposite directions between the data space and model space, both inversion and modeling seek the best model to minimize the misfit in the data space, which unfortunately is not the output space. The fact that the selection and uncertainty of the output model are not judged in the output space has exacerbated the nonuniqueness problem for inversion and modeling. In contrast, the practice in exploration seismology has long established a two-fold approach to seismic imaging: using velocity model building to establish the long-wavelength reference velocity models, and using seismic migration to map the short-wavelength reflectivity structures. Most interestingly, seismic migration maps the data into an output space called the imaging space, where the output reflection images of the subsurface are formed based on an imaging condition. A good example is reverse time migration, which seeks the reflectivity image as the best fit in the image space between the extrapolation of time-reversed waveform data and the prediction

  10. DUALISM OF GEOSTRATEGIC PROSPECTS OF GREAT BRITAIN IN THE MODERN SYSTEM OF GLOBAL UNCERTAINTIES

    Directory of Open Access Journals (Sweden)

    Alexey N. Yeletsky

    2013-01-01

    Peculiarities of the modern position of Great Britain in the global economy are analysed. Its role as one of the local centres of influence in the European Union is emphasized. The «special relationship» between England and the United States is examined in the context of the formation of a new «Anglo-Saxon empire». Particular attention is paid to the key role of Great Britain in the alliance of English-speaking powers.

  11. Reducing uncertainty in nitrogen budgets for African livestock systems

    International Nuclear Information System (INIS)

    Rufino, M C; Brandt, P; Herrero, M; Butterbach-Bahl, K

    2014-01-01

    Livestock is poorly represented in N budgets for the African continent, although some studies have examined livestock-related N flows at different levels. Livestock plays an important role in N cycling, and N budgets should therefore include livestock-related flows. This study reviews the literature on N budgets for Africa to identify factors contributing to uncertainties. Livestock densities are usually modelled because of the lack of observational spatial data. Even though feed availability and quality vary across seasons, most studies use constant livestock excretion rates, and excreta are usually assumed to be uniformly distributed onto the land. Major uncertainties originate in the fraction of manure that is managed and in emission factors that may not reflect African conditions. N budgets use coarse assumptions on the production, availability, and use of crop residues as livestock feed. The absence of flows between croplands, livestock, and rangelands reflects the lack of data. Joint efforts are needed for spatial collection of livestock data; crowdsourcing appears to be a promising option. The focus of the assessment of N budgets must go beyond croplands to include livestock and crop-livestock flows. We propose a nested systems definition of livestock systems to link the local, regional, and continental levels and to increase the usefulness of point measurements of N losses. Scientists working at all levels should generate data to calibrate process-based models. Measurements in the field should not only concentrate on greenhouse gas emissions, but need to include crop and livestock production measurements, soil stock changes and other N loss pathways such as leaching, run-off and volatilization to assess management practices and trade-offs. Compared to the research done on other continents on N flows in livestock systems, there are few data for Africa, and therefore a concerted effort will be needed to generate sufficient data for modelling. (paper)

  12. Reducing the uncertainty of the primary damage production in Fe

    International Nuclear Information System (INIS)

    Bjorkas, C.; Nordlund, K.

    2007-01-01

    Full text of publication follows: One of the key questions for understanding neutron irradiation damage buildup in fission and fusion reactor steels is knowing the primary damage state produced by neutron-induced atomic recoils in Fe. Supporting this is our recent study revealing that the initial damage in Fe0.9Cr0.1 is essentially the same as in pure Fe [1]. In spite of decades of study, the question of the amount and distribution of defects in Fe has remained highly unclear. Different computer simulation models have given a good qualitative understanding of the cascade development [1,2]. However, quantitative differences of more than a factor of three have remained in the predicted clustered defect production numbers [2]. The disagreements between the potentials pose problems for finding a reliable predictive model for the behavior of Fe under irradiation. In this study we analyze the initial damage as predicted by three recent interatomic potentials for Fe. These are well suited for a comparison because they have very different physical motivations and functional forms, but are comparable in overall quality and, in particular, reproduce the energetics of interstitials in different configurations well. The potentials are those by Ackland and Mendelev et al. (AMS) [3], the 'magnetic' potential by Dudarev and Derlet (DD) [4], and the Tersoff-like analytical potential by Mueller, Erhart and Albe (MEA) [5]. The DD and MEA potentials were modified by us to describe high-energy repulsive interactions well. All potentials were then used in recoil collision cascade simulations carried out and analyzed in exactly the same manner for all potentials. Analysis of the resulting damage showed a much smaller uncertainty regarding the damage production than that of previous potentials. The total defect production numbers essentially agree within the statistical uncertainty for the three potentials. Some differences remain regarding the clustered defect fractions, but

  13. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  14. Synchronous Sounds Enhance Visual Sensitivity without Reducing Target Uncertainty

    Directory of Open Access Journals (Sweden)

    Yi-Chuan Chen

    2011-10-01

    We examined the crossmodal effect of the presentation of a simultaneous sound on visual detection and discrimination sensitivity using the equivalent noise paradigm (Dosher & Lu, 1998). In each trial, a tilted Gabor patch was presented in either the first or second of two intervals consisting of dynamic 2D white noise with one of seven possible contrast levels. The results revealed that the sensitivities of participants' visual detection and discrimination performance were both enhanced by the presentation of a simultaneous sound, though only close to the noise level at which participants' target contrast thresholds started to increase with the increasing noise contrast. A further analysis of the psychometric function at this noise level revealed that the increase in sensitivity could not be explained by a reduction of participants' uncertainty regarding the onset time of the visual target. We suggest that this crossmodal facilitatory effect may be accounted for by perceptual enhancement elicited by a simultaneously presented sound, and that the crossmodal facilitation was easier to observe when the visual system encountered a level of noise close to the level of internal noise embedded within the system.

  15. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the relevant substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  16. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented, for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]

  17. Great Tits (Parus major) reduce caterpillar damage in commercial apple orchards

    NARCIS (Netherlands)

    Mols, C.M.M.; Visser, M.E.

    2007-01-01

    Alternative ways to control caterpillar pests and reduce the use of pesticides in apple orchards are in the interest of the environment, farmers and the public. Great tits have already been shown to reduce damage under high caterpillar density when breeding in nest boxes in an experimental apple

  18. Impact of Climate Change on high and low flows across Great Britain: a temporal analysis and uncertainty assessment.

    Science.gov (United States)

    Beevers, Lindsay; Collet, Lila

    2017-04-01

    Over the past decade there have been significant challenges to water management posed by both floods and droughts. In the UK, flooding has caused over £5Bn worth of damage since 2000, and direct costs from the recent drought (2011-12) are estimated at £70-165M, arising from impacts on public and industrial water supply. Projections of future climate change suggest trends of increasing temperature and precipitation which may exacerbate the frequency and severity of such hazards, but there is significant uncertainty associated with these projections. It thus becomes urgent to assess the possible impact of these changes on extreme flows and to evaluate the uncertainties related to these projections, particularly changes in the seasonality of such hazards. This paper aims to assess the changes in seasonality of peak and low flows across Great Britain as a result of climate change. It is based on the Future Flow database, an 11-member ensemble of transient river flow projections from January 1951 to December 2098. We analyse the daily river flow over the baseline (1961-1990) and the 2080s (2069-2098) for 281 gauging stations. For each ensemble member, annual maxima (AMAX) and minima (AMIN) are extracted for both time periods at each gauging station. The month of the year in which the AMAX and AMIN occur is recorded for each of the 30 years in the past and future time periods. The monthly timing of AMAX and AMIN, and its uncertainty, is assessed across the 11 ensemble members, as well as changes in this temporal signal between the baseline and the 2080s. Ultimately, this work gives a spatially resolved national picture of when high and low flows occur, and allows possible changes in hydrological dynamics as a result of climate change to be assessed in a statistical framework. Results will quantify the uncertainty related to the climate model parameters which are cascaded into the modelling chain. This study highlights the issues
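
    A minimal sketch of the seasonality bookkeeping described above, assuming synthetic daily flows in place of the Future Flow projections: extract the month of each annual maximum per ensemble member and tabulate the month-of-occurrence distribution; repeating this for AMIN and for both periods gives the change signal.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
idx = pd.date_range("1961-01-01", "1990-12-31", freq="D")   # baseline period
members = {}
for m in range(11):                                         # 11-member ensemble
    seasonal = 10 + 5 * np.sin(2 * np.pi * (idx.dayofyear - 30) / 365.25)
    members[m] = pd.Series(seasonal * rng.lognormal(0.0, 0.4, idx.size),
                           index=idx)

# Month of each annual maximum (AMAX), per member and year
amax_month = pd.DataFrame(
    {m: s.groupby(s.index.year).idxmax().dt.month for m, s in members.items()})
print(amax_month.stack().value_counts().sort_index())
# Repeating this for AMIN and for the 2080s, then comparing the two
# month-of-occurrence distributions, gives the seasonality-change signal.
```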

  19. Accessing the uncertainties of seismic velocity and anisotropy structure of Northern Great Plains using a transdimensional Bayesian approach

    Science.gov (United States)

    Gao, C.; Lekic, V.

    2017-12-01

    Seismic imaging utilizing complementary seismic data provides unique insight into the formation, evolution and current structure of continental lithosphere. While numerous efforts have improved the resolution of seismic structure, the quantification of uncertainties remains challenging due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we use a reverse jump Markov chain Monte Carlo (rjMcMC) algorithm to incorporate seismic observables including Rayleigh and Love wave dispersion and Ps and Sp receiver functions to invert for shear velocity (Vs), compressional velocity (Vp), density, and radial anisotropy of the lithospheric structure. The Bayesian nature and the transdimensionality of this approach allow the quantification of model parameter uncertainties while keeping the models parsimonious. Both synthetic tests and inversions of actual Ps and Sp receiver function data are performed. We quantify the information gained in different inversions by calculating the Kullback-Leibler divergence. Furthermore, we explore the ability of Rayleigh and Love wave dispersion data to constrain radial anisotropy. We show that when multiple types of model parameters (Vsv, Vsh, and Vp) are inverted simultaneously, the constraints on radial anisotropy are limited by relatively large data uncertainties and trade off strongly with Vp. We then perform a joint inversion of the surface wave dispersion (SWD) and Ps and Sp receiver functions, and show that the constraints on both isotropic Vs and radial anisotropy are significantly improved. To achieve faster convergence of the rjMcMC, we propose a progressive inclusion scheme, and invert SWD measurements and receiver functions from about 400 USArray stations in the Northern Great Plains. We start by using only SWD data due to its fast convergence rate. We then use the average of the ensemble as a starting model for the joint inversion, which is able to resolve distinct seismic signatures of
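
    The Kullback-Leibler computation mentioned above can be sketched minimally as the divergence between a broad prior ensemble and a data-constrained posterior ensemble. The velocity values and distributions below are invented for illustration and do not come from the study.

```python
import numpy as np

def kl_divergence(post, prior, bins=50):
    # Histogram-based estimate of D_KL(posterior || prior) in nats.
    lo = min(post.min(), prior.min())
    hi = max(post.max(), prior.max())
    p, edges = np.histogram(post, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(prior, bins=bins, range=(lo, hi), density=True)
    w = edges[1] - edges[0]
    m = (p > 0) & (q > 0)           # ignore empty bins (crude but common)
    return np.sum(p[m] * np.log(p[m] / q[m])) * w

rng = np.random.default_rng(0)
prior = rng.uniform(3.0, 5.0, 100_000)      # broad prior on Vs (km/s)
posterior = rng.normal(4.2, 0.1, 100_000)   # data-constrained ensemble
print(f"information gain at this node: {kl_divergence(posterior, prior):.2f} nats")
```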

  20. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da; Samaan, Nader A.; Makarov, Yuri V.; Huang, Zhenyu

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools, including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. These approaches are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
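
    A minimal sketch of the general workflow, assuming synthetic loads for two correlated areas: fit per-area ARIMA models and sample forecast-error realizations jointly from their empirical covariance, so the inter-area dependency is preserved. The model orders, data, and variable names are illustrative, not those of the study.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Hypothetical hourly loads (MW) for two neighboring areas whose noise
# shares a common component, so their forecast errors are correlated.
base = 100 + 10 * np.sin(np.arange(300) * 2 * np.pi / 24)
shared = rng.normal(0, 2, 300)
area_a = base + shared + rng.normal(0, 1, 300)
area_b = 0.8 * base + shared + rng.normal(0, 1, 300)

# One ARIMA model per area; collect in-sample residuals.
fits = [ARIMA(y, order=(2, 0, 1)).fit() for y in (area_a, area_b)]
resid = np.column_stack([f.resid for f in fits])

# Keep the inter-area dependency: sample error realizations jointly from
# the empirical error covariance instead of independently per area.
cov = np.cov(resid, rowvar=False)
point = np.array([f.forecast(1)[0] for f in fits])
real = point + rng.multivariate_normal(np.zeros(2), cov, size=1000)
print("1-step forecasts:", point.round(1))
print("5th/95th percentiles:", np.percentile(real, [5, 95], axis=0).round(1))
```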

  1. Establishing requirements for the next generation of pressurized water reactors--reducing the uncertainty

    International Nuclear Information System (INIS)

    Chernock, W.P.; Corcoran, W.R.; Rasin, W.H.; Stahlkopf, K.E.

    1987-01-01

    The Electric Power Research Institute is managing a major effort to establish requirements for the next generation of U.S. light water reactors. This effort is the vital first step in preserving the viability of the nuclear option to contribute to meeting U.S. national electric power capacity needs in the next century. Combustion Engineering, Inc. and Duke Power Company formed a team to participate in the EPRI program, which is guided by a Utility Steering Committee consisting of experienced utility technical executives. A major thrust of the program is to reduce the uncertainties which would be faced by utility executives in choosing the nuclear option. The uncertainties to be reduced include those related to the safety, economic, operational, and regulatory aspects of advanced light water reactors. This paper gives an overview of the Requirements Document program as it relates to the U.S. Advanced Light Water Reactor (ALWR) effort to reduce these uncertainties and reports the status of efforts to establish requirements for the next generation of pressurized water reactors. It concentrates on progress made in reducing the uncertainties which would deter selection of the nuclear option for contributing to U.S. national electric power capacity needs in the next century, and updates previous reports in the same area. (author)

  2. Using performance indicators to reduce cost uncertainty of China's CO2 mitigation goals

    International Nuclear Information System (INIS)

    Xu, Yuan

    2013-01-01

    Goals on absolute emissions and intensity play key roles in CO2 mitigation. However, like cap-and-trade policies with price uncertainty, they suffer from significant uncertainty in abatement costs. This article examines whether an indicator could be established to complement CO2 mitigation goals and help reduce cost uncertainty, with a particular focus on China. Performance indicators on CO2 emissions per unit of energy consumption could satisfy three criteria: compared with the mitigation goals, (i) they are more closely associated with active mitigation efforts and (ii) their baselines have more stable projections from historical trajectories; (iii) their abatement costs are generally higher than those of other mitigation methods, particularly energy efficiency and conservation. Performance indicators could be used in the following way: if a CO2 goal on absolute emissions or intensity is attained, the performance indicator should still reach a lower threshold as a cost floor. If the goal cannot be attained, an upper performance threshold should be achieved as a cost ceiling. The narrower cost uncertainty may encourage wider and greater mitigation efforts. - Highlights: ► CO2 emissions per unit of energy consumption could act as performance indicators. ► Performance indicators are more closely related to active mitigation activities. ► Performance indicators have more stable historical trajectories. ► Abatement costs are higher for performance indicators than for other activities. ► Performance thresholds could reduce the cost uncertainty of CO2 mitigation goals.

  3. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region.

    Science.gov (United States)

    Ando, Amy W; Mallory, Mindy L

    2012-04-24

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.
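
    The core computation behind an MPT-style conservation portfolio (used in this record and its companion below) can be sketched as a mean-variance program: minimize the variance of conservation returns across subregions subject to a budget constraint and a floor on expected returns. The returns and covariances below are invented; note the negative off-diagonal entries, which echo the paper's closing point that negatively correlated subregions are what make diversification pay.

```python
import numpy as np
from scipy.optimize import minimize

# Invented expected conservation returns (benefit per dollar) and their
# covariance across four subregions; negative off-diagonals are what
# make diversification effective.
mu = np.array([1.2, 1.0, 0.9, 1.1])
cov = np.array([[0.040, -0.010, 0.002, 0.000],
                [-0.010, 0.030, 0.001, -0.005],
                [0.002, 0.001, 0.020, 0.004],
                [0.000, -0.005, 0.004, 0.050]])

target = 1.05  # required expected return per conservation dollar

res = minimize(
    lambda w: w @ cov @ w,                     # portfolio variance
    x0=np.full(4, 0.25),
    bounds=[(0.0, 1.0)] * 4,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                 {"type": "ineq", "fun": lambda w: w @ mu - target}])
print("budget shares:", res.x.round(3), "| return:", (res.x @ mu).round(3),
      "| variance:", res.fun.round(4))
```

    Sweeping the target return and re-solving traces out the efficient frontier from which a planner picks an acceptable risk level.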

  4. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region

    Science.gov (United States)

    Ando, Amy W.; Mallory, Mindy L.

    2012-01-01

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit–cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change–induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area. PMID:22451914

  5. Great tits (Parus major) reduce caterpillar damage in commercial apple orchards.

    Directory of Open Access Journals (Sweden)

    Christel M M Mols

    Alternative ways to control caterpillar pests and reduce the use of pesticides in apple orchards are in the interest of the environment, farmers and the public. Great tits have already been shown to reduce damage under high caterpillar density when breeding in nest boxes in an experimental apple orchard. We tested whether this reduction also occurs under practical conditions of Integrated Pest Management (IPM), as well as Organic Farming (OF), by setting up an area with nest boxes while leaving a comparable area as a control within 12 commercial orchards. We showed that in IPM orchards, but not in OF orchards, in the areas with breeding great tits, apples had 50% of the caterpillar damage of the control areas. Offering nest boxes to attract insectivorous passerines in orchards can thus lead to more limited pesticide use, thereby adding to the natural biological diversity in an agricultural landscape, while also being economically profitable to the fruit growers.

  6. Great tits (Parus major) reduce caterpillar damage in commercial apple orchards.

    Science.gov (United States)

    Mols, Christel M M; Visser, Marcel E

    2007-02-07

    Alternative ways to control caterpillar pests and reduce the use of pesticides in apple orchards are in the interest of the environment, farmers and the public. Great tits have already been shown to reduce damage under high caterpillar density when breeding in nest boxes in an experimental apple orchard. We tested whether this reduction also occurs under practical conditions of Integrated Pest Management (IPM), as well as Organic Farming (OF), by setting up an area with nest boxes while leaving a comparable area as a control within 12 commercial orchards. We showed that in IPM orchards, but not in OF orchards, in the areas with breeding great tits, apples had 50% of the caterpillar damage of the control areas. Offering nest boxes to attract insectivorous passerines in orchards can thus lead to more limited pesticide use, thereby adding to the natural biological diversity in an agricultural landscape, while also being economically profitable to the fruit growers.

  7. A hydrogen-oxidizing, Fe(III)-reducing microorganism from the Great Bay estuary, New Hampshire

    Science.gov (United States)

    Caccavo, F.; Blakemore, R.P.; Lovley, D.R.

    1992-01-01

    A dissimilatory Fe(III)- and Mn(IV)-reducing bacterium was isolated from bottom sediments of the Great Bay estuary, New Hampshire. The isolate was a facultatively anaerobic gram-negative rod which did not appear to fit into any previously described genus. It was temporarily designated strain BrY. BrY grew anaerobically in a defined medium with hydrogen or lactate as the electron donor and Fe(III) as the electron acceptor. BrY required citrate, fumarate, or malate as a carbon source for growth on H2 and Fe(III). With Fe(III) as the sole electron acceptor, BrY metabolized hydrogen to a minimum threshold at least 60-fold lower than the threshold reported for pure cultures of sulfate reducers. This finding supports the hypothesis that when Fe(III) is available, Fe(III) reducers can outcompete sulfate reducers for electron donors. Lactate was incompletely oxidized to acetate and carbon dioxide with Fe(III) as the electron acceptor. Lactate oxidation was also coupled to the reduction of Mn(IV), U(VI), fumarate, thiosulfate, or trimethylamine n-oxide under anaerobic conditions. BrY provides a model for how enzymatic metal reduction by respiratory metal-reducing microorganisms has the potential to contribute to the mobilization of iron and trace metals and to the immobilization of uranium in sediments of Great Bay Estuary.

  8. The National Ecosystem Services Classification System: A Framework for Identifying and Reducing Relevant Uncertainties

    Science.gov (United States)

    Rhodes, C. R.; Sinha, P.; Amanda, N.

    2013-12-01

    In recent years the gap between what scientists know and what policymakers should appreciate in environmental decision making has received more attention, as the costs of the disconnect have become more apparent to both groups. Particularly for water-related policies, the EPA's Office of Water has struggled with benefit estimates held low by the inability to quantify ecological and economic effects that theory, modeling, and anecdotal or isolated case evidence suggest may prove to be larger. Better coordination with ecologists and hydrologists is being explored as a solution. The ecosystem services (ES) concept, now nearly two decades old, links ecosystem functions and processes to the human value system. But there remains no clear mapping of which ecosystem goods and services affect which individual or economic values. The National Ecosystem Services Classification System (NESCS, 'nexus') project brings together ecologists, hydrologists, and social scientists to do this mapping for aquatic and other ecosystem service-generating systems. The objective is to greatly reduce the uncertainty in water-related policy making by mapping, and ultimately quantifying, the various functions and products of aquatic systems, as well as how changes to aquatic systems impact the human economy and individual levels of non-monetary appreciation for those functions and products. Primary challenges to fostering interaction between scientists, social scientists, and policymakers are the lack of a common vocabulary and the need for a cohesive, comprehensive framework that organizes concepts across disciplines and accommodates scientific data from a range of sources. NESCS builds the vocabulary and the framework so both may inform a scalable transdisciplinary policy-making application. This talk presents for discussion the process and progress in developing both this vocabulary and a classifying framework capable of bridging the gap between a newer but existing ecosystem services classification

  9. Calculating salt loads to Great Salt Lake and the associated uncertainties for water year 2013; updating a 48 year old standard

    Science.gov (United States)

    Shope, Christopher L.; Angeroth, Cory E.

    2015-01-01

    Effective management of surface waters requires a robust understanding of spatiotemporal constituent loadings from upstream sources and the uncertainty associated with these estimates. We compared the total dissolved solids (TDS) loading into the Great Salt Lake (GSL) for water year 2013 with estimates from previously sampled periods in the early 1960s. We also provide updated results on GSL loading, quantitatively bounded by sampling uncertainties, which are useful for current and future management efforts. Our statistical loading results were more accurate than those from simple regression models. Our results indicate that TDS loading to the GSL in water year 2013 was 14.6 million metric tons, with uncertainty ranging from 2.8 to 46.3 million metric tons, which differs greatly from the previous regression estimate for water year 1964 of 2.7 million metric tons. Results also indicate that locations with increased sampling frequency are correlated with decreasing confidence intervals. Because time is incorporated into the LOADEST models, discrepancies are largely expected to be a function of temporally lagged salt storage delivery to the GSL associated with terrestrial and in-stream processes. By incorporating temporally variable estimates and the statistically derived uncertainty of these estimates, we have provided quantifiable variability in the annual estimates of dissolved solids loading into the GSL. Further, our results support the need for increased monitoring of dissolved solids loading into saline lakes like the GSL by demonstrating the uncertainty associated with different levels of sampling frequency.
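
    The regression-based loading approach implemented in LOADEST can be sketched as a log-linear rating curve with seasonal terms and a lognormal bias correction. The data below are synthetic and the model form is simplified relative to LOADEST's candidate models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily discharge Q (m^3/s), decimal time t (yr), and sampled
# TDS loads (as ln L, with L in t/d) at a hypothetical gauging station.
t = np.linspace(0.0, 1.0, 60)
Q = 20 + 10 * np.sin(2 * np.pi * t) + rng.lognormal(0.0, 0.3, 60)
lnL = 1.0 + 0.9 * np.log(Q) + 0.2 * np.sin(2 * np.pi * t) + rng.normal(0, 0.15, 60)

# Rating-curve regression: ln(L) = a0 + a1*ln(Q) + a2*sin(2*pi*t) + a3*cos(2*pi*t)
X = np.column_stack([np.ones_like(t), np.log(Q),
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, lnL, rcond=None)
s2 = np.sum((lnL - X @ coef) ** 2) / (len(t) - X.shape[1])

# Back-transform with a simple lognormal bias correction exp(s^2/2).
daily = np.exp(X @ coef + s2 / 2.0)
print(f"mean daily load: {daily.mean():.1f} t/d (residual variance {s2:.3f})")
```

    The residual variance is what drives the wide confidence bounds quoted above; denser sampling shrinks it directly.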

  10. A new approach to reduce uncertainties in space radiation cancer risk predictions.

    Directory of Open Access Journals (Sweden)

    Francis A Cucinotta

    The prediction of space radiation induced cancer risk carries large uncertainties, with two of the largest being radiation quality and dose-rate effects. In risk models the ratio of the quality factor (QF) to the dose and dose-rate reduction effectiveness factor (DDREF) parameter is used to scale organ doses for cosmic ray protons and high charge and energy (HZE) particles to a hazard rate for γ-rays derived from human epidemiology data. In previous work, particle track structure concepts were used to formulate a space radiation QF function that is dependent on particle charge number Z and kinetic energy per atomic mass unit, E. QF uncertainties were represented by subjective probability distribution functions (PDFs) for the three QF parameters that described its maximum value and shape parameters for the Z and E dependences. Here I report on an analysis of a maximum QF parameter and its uncertainty using mouse tumor induction data. Because experimental data for risks at low doses of γ-rays are highly uncertain, which impacts estimates of maximum values of relative biological effectiveness (RBEmax), I developed an alternate QF model, denoted QFγAcute, where QFs are defined relative to higher acute γ-ray doses (0.5 to 3 Gy). The alternate model reduces the dependence of risk projections on the DDREF; however, a DDREF is still needed for risk estimates for high-energy protons and other primary or secondary sparsely ionizing space radiation components. Risk projections (upper confidence levels (CL)) for space missions show a reduction of about 40% (CL ∼50%) using the QFγAcute model compared to the QFs based on RBEmax, and about 25% (CL ∼35%) compared to previous estimates. In addition, I discuss how a possible qualitative difference leading to increased tumor lethality for HZE particles compared to low-LET radiation and background tumors remains a large uncertainty in risk estimates.

  11. Reducing structural uncertainty in conceptual hydrological modeling in the semi-arid Andes

    Science.gov (United States)

    Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.

    2014-10-01

    The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modeling of a meso-scale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modeling process was decomposed into six model-building decisions related to the following aspects of the system behavior: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modeling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to retaining 8 model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modeling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.

  12. Reducing structural uncertainty in conceptual hydrological modelling in the semi-arid Andes

    Science.gov (United States)

    Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.

    2015-05-01

    The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to retaining eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.

  13. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    A major purpose of groundwater modeling is to help decision-makers in efforts to manage the natural environment. Increasingly, it is recognized that both the predictions of interest and their associated uncertainties should be quantified to support robust decision making. In particular, decision ... the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical "test-bench" analysis of the likely value of geophysical data for improving groundwater model prediction performance before actually ... and the resulting predictions can be compared with predictions from the 'true' model. By performing this analysis we expect to give the modeler insight into how the uncertainty of model-based predictions can be reduced.

  14. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    Science.gov (United States)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
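
    To make the sample-size point above concrete, here is a minimal minimum-detectable-change calculation for a two-sample comparison of mean sediment mass flux, assuming approximate normality and equal sample sizes. The coefficients of variation and sample sizes are illustrative, but a high CV with few samplers pushes the detectable change into the several-hundred-percent range reported in the abstract.

```python
import numpy as np
from scipy import stats

def minimum_detectable_change(cv, n, alpha=0.05, power=0.80):
    # Smallest relative change in mean mass flux detectable by a
    # two-sample t-test, assuming approximate normality and equal n.
    df = 2 * (n - 1)
    t_alpha = stats.t.ppf(1 - alpha / 2, df)
    t_beta = stats.t.ppf(power, df)
    return (t_alpha + t_beta) * cv * np.sqrt(2.0 / n)

for cv in (0.5, 1.5):              # moderate vs. high spatial variability
    for n in (3, 10, 30):          # samplers per site and period
        mdc = minimum_detectable_change(cv, n)
        print(f"CV={cv:.1f}, n={n:2d}: detectable change ≈ {100 * mdc:.0f}%")
```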

  15. Subpixel edge localization with reduced uncertainty by violating the Nyquist criterion

    Science.gov (United States)

    Heidingsfelder, Philipp; Gao, Jun; Wang, Kun; Ott, Peter

    2014-12-01

    In this contribution, the extent to which the Nyquist criterion can be violated in optical imaging systems with a digital sensor, e.g., a digital microscope, is investigated. In detail, we analyze the subpixel uncertainty of the detected position of a step edge, the edge of a stripe with a varying width, and that of a periodic rectangular pattern for varying pixel pitches of the sensor, thus also in aliased conditions. The analysis includes the investigation of different algorithms of edge localization based on direct fitting or based on the derivative of the edge profile, such as the common centroid method. In addition to the systematic error of these algorithms, the influence of the photon noise (PN) is included in the investigation. A simplified closed form solution for the uncertainty of the edge position caused by the PN is derived. The presented results show that, in the vast majority of cases, the pixel pitch can exceed the Nyquist sampling distance by about 50% without an increase of the uncertainty of edge localization. This allows one to increase the field-of-view without increasing the resolution of the sensor and to decrease the size of the setup by reducing the magnification. Experimental results confirm the simulation results.
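
    A minimal sketch of the centroid method referred to above: differentiate a sampled edge profile and take the centroid of the derivative as the subpixel edge location. The Gaussian edge blur, pixel pitch, and noise level are invented for illustration.

```python
import numpy as np
from scipy.special import erf

def centroid_edge(profile):
    # Differentiate the sampled edge profile and return the centroid of
    # the derivative as the subpixel edge position (centroid method).
    d = np.abs(np.diff(profile))
    x = np.arange(d.size) + 0.5    # derivative lives between samples
    return np.sum(x * d) / np.sum(d)

rng = np.random.default_rng(2)
x = np.arange(20, dtype=float)
true_edge, blur = 10.3, 1.2        # edge position (px) and Gaussian blur (px)
profile = 0.5 * (1.0 + erf((x - true_edge) / (np.sqrt(2.0) * blur)))
profile += rng.normal(0.0, 0.01, x.size)   # photon-noise-like perturbation
print(f"estimated edge: {centroid_edge(profile):.2f} px (true: {true_edge} px)")
```

    Repeating the estimate over many noise draws while coarsening the pixel pitch is the kind of experiment the abstract describes for probing aliased conditions.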

  16. Revealing, Reducing, and Representing Uncertainties in New Hydrologic Projections for Climate-changed Futures

    Science.gov (United States)

    Arnold, Jeffrey; Clark, Martyn; Gutmann, Ethan; Wood, Andy; Nijssen, Bart; Rasmussen, Roy

    2016-04-01

    The United States Army Corps of Engineers (USACE) has had primary responsibility for multi-purpose water resource operations on most of the major river systems in the U.S. for more than 200 years. In that time, the USACE projects and programs making up those operations have proved mostly robust against the range of natural climate variability encountered over their operating life spans. However, in some watersheds and for some variables, climate change now is known to be shifting the hydroclimatic baseline around which that natural variability occurs and changing the range of that variability as well. This makes historical stationarity an inappropriate basis for assessing continued project operations under climate-changed futures. That means new hydroclimatic projections are required at multiple scales to inform decisions about specific threats and impacts, and for possible adaptation responses to limit water-resource vulnerabilities and enhance operational resilience. However, projections of possible future hydroclimatologies have myriad complex uncertainties that require explicit guidance for interpreting and using them to inform those decisions about climate vulnerabilities and resilience. Moreover, many of these uncertainties overlap and interact. Recent work, for example, has shown the importance of assessing the uncertainties from multiple sources including: global model structure [Meehl et al., 2005; Knutti and Sedlacek, 2013]; internal climate variability [Deser et al., 2012; Kay et al., 2014]; climate downscaling methods [Gutmann et al., 2012; Mearns et al., 2013]; and hydrologic models [Addor et al., 2014; Vano et al., 2014; Mendoza et al., 2015]. Revealing, reducing, and representing these uncertainties is essential for defining the plausible quantitative climate change narratives required to inform water-resource decision-making. And to be useful, such quantitative narratives, or storylines, of climate change threats and hydrologic impacts must sample

  17. Crop Model Improvement Reduces the Uncertainty of the Response to Temperature of Multi-Model Ensembles

    Science.gov (United States)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold; Ewert, Frank; Mueller, Christoph; Roetter, Reimund P.; Ruane, Alex C.; Semenov, Mikhail A.; Wallach, Daniel; Wang, Enli

    2016-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of models needed in a MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA Hot Serial Cereal experiment (calibration data set). Simulation results from before and after model improvement were then evaluated with independent field experiments from a CIMMYT worldwide field trial network (evaluation data set). Model improvements decreased the variation (10th to 90th model ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures greater than 24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME uncertainty range by 27% increased MME prediction skills by 47%. Results suggest that the mean level of variation observed in field experiments and used as a benchmark can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model-based impact assessments and allow more practical, i.e. smaller MMEs to be used effectively.

  18. A Best-Estimate Reactor Core Monitor Using State Feedback Strategies to Reduce Uncertainties

    International Nuclear Information System (INIS)

    Martin, Robert P.; Edwards, Robert M.

    2000-01-01

    The development and demonstration of a new algorithm to reduce modeling and state-estimation uncertainty in best-estimate simulation codes have been investigated. Demonstration is given by way of a prototype reactor core monitor. The architecture of this monitor integrates a control-theory-based, distributed-parameter estimation technique into a production-grade best-estimate simulation code. The Kalman Filter-Sequential Least-Squares (KFSLS) parameter estimation algorithm has been extended for application within the computational environment of the best-estimate simulation code RELAP5-3D. In control system terminology, this configuration can be thought of as a 'best-estimate' observer. The application to a distributed-parameter reactor system involves a unique modal model that approximates physical components, such as the reactor, by describing both states and parameters through an orthogonal expansion. The basic KFSLS parameter estimation is used to dynamically refine a spatially varying (distributed) parameter. The application of the distributed-parameter estimator is expected to complement a traditional nonlinear best-estimate simulation code by providing a mechanism for reducing both code input (modeling) and output (state-estimation) uncertainty in complex, distributed-parameter systems
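
    In control terms, a "best-estimate observer" runs the simulation in parallel with the plant and corrects uncertain parameters from measured data. The scalar sketch below illustrates that idea with a Kalman-style sequential update on a single parameter; it is not the actual KFSLS algorithm or the RELAP5-3D interface, and all values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Scalar plant x' = -a*x + theta*u with uncertain gain theta; the observer
# runs the same model with its estimate theta_hat and corrects it from
# noisy measurements via a Kalman-style sequential update.
a, dt, theta_true = 0.5, 0.1, 2.0
x, x_hat, S = 0.0, 0.0, 0.0          # plant state, model state, d(x_hat)/d(theta)
theta_hat, P, R = 1.0, 1.0, 0.05**2  # initial guess, covariance, meas. noise var.

for k in range(500):
    u = np.sin(0.05 * k)                         # known forcing
    x += dt * (-a * x + theta_true * u)          # "true" plant
    x_hat += dt * (-a * x_hat + theta_hat * u)   # best-estimate model
    S += dt * (-a * S + u)                       # parameter sensitivity

    z = x + rng.normal(0.0, np.sqrt(R))          # plant measurement
    K = P * S / (S * P * S + R)                  # Kalman gain for theta
    dtheta = K * (z - x_hat)
    theta_hat += dtheta
    x_hat += S * dtheta                          # propagate correction to state
    P *= 1.0 - K * S                             # add process noise to track drift

print(f"refined parameter: {theta_hat:.3f} (true value: {theta_true})")
```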

  19. ECG movement artefacts can be greatly reduced with the aid of a movement absorbing device

    DEFF Research Database (Denmark)

    Harrison, Adrian Paul; Wandall, Kirsten; Thorball, Jørgen

    2007-01-01

    Accurate ECG signal analysis can be confounded by electric lead and/or electrode movements varying in origin from, for example, hiccups, tremor or patient restlessness. ECG signals recorded using either a conventional electrode holder or with the aid of an electrode holder capable of absorbing movement artefacts were measured on a healthy human subject. Results show a greatly improved stability of the ECG signal recorded using an electrode holder capable of absorbing movement artefacts during periods of lead disturbance, and highlight the movement artefacts that develop when the recording lead of a conventional ECG electrode holder is tugged or pulled during the period of monitoring. It is concluded that the new design of ECG electrode holder will not only enable clearer signal recordings for clinical assessment, but will reduce the ECG artefacts associated with the transportation of patients, and may...

  20. Great hammerhead sharks swim on their side to reduce transport costs.

    Science.gov (United States)

    Payne, Nicholas L; Iosilevskii, Gil; Barnett, Adam; Fischer, Chris; Graham, Rachel T; Gleiss, Adrian C; Watanabe, Yuuki Y

    2016-07-26

    Animals exhibit various physiological and behavioural strategies for minimizing travel costs. Fins of aquatic animals play key roles in efficient travel and, for sharks, the functions of dorsal and pectoral fins are considered well divided: the former assists propulsion and generates lateral hydrodynamic forces during turns and the latter generates vertical forces that offset sharks' negative buoyancy. Here we show that great hammerhead sharks drastically reconfigure the function of these structures, using an exaggerated dorsal fin to generate lift by swimming rolled on their side. Tagged wild sharks spend up to 90% of time swimming at roll angles between 50° and 75°, and hydrodynamic modelling shows that doing so reduces drag, and in turn the cost of transport, by around 10% compared with traditional upright swimming. Employment of such a strongly selected feature for such a unique purpose raises interesting questions about evolutionary pathways to hydrodynamic adaptations, and our perception of form and function.

  1. Greatly reduced emission of greenhouse gases from the wood-processing industry

    International Nuclear Information System (INIS)

    2004-01-01

    The strong support for biomass energy in the Norwegian wood-processing industry during the last 10-15 years has contributed greatly to a considerable reduction of the emission of greenhouse gases. The potential for further reductions is primarily linked with the use of oil and involves only a few works. Oil can be replaced by other fuels, and process-technical improvements can reduce the emissions. According to prognoses, emissions will go on decreasing until 2007, when the total emission of greenhouse gases from the wood-processing industry will be about 13 per cent less than in 1998. Carbon dioxide (CO2) amounts to 90 per cent of the total emission, the remaining parts being methane (CH4) from landfills and dumps, and small amounts of N2O

  2. Intolerance of uncertainty mediates reduced reward anticipation in major depressive disorder.

    Science.gov (United States)

    Nelson, Brady D; Shankman, Stewart A; Proudfit, Greg H

    2014-04-01

    Reduced reward sensitivity has long been considered a fundamental deficit of major depressive disorder (MDD). One way this deficit has been measured is by an asymmetry in electroencephalogram (EEG) activity between left and right frontal brain regions. MDD has been associated with a reduced frontal EEG asymmetry (i.e., decreased left relative to right) while anticipating reward. However, the mechanism (or mediator) of this association is unclear. The present study examined whether intolerance of uncertainty (IU) mediated the association between depression and reduced reward anticipation. Data were obtained from a prior study reporting reduced frontal EEG asymmetry while anticipating reward in early-onset MDD. Participants included 156 individuals with early-onset MDD-only, panic disorder-only, both (comorbids), or controls. Frontal EEG asymmetry was recorded during an uncertain reward anticipation task. Participants completed a self-report measure of IU. All three psychopathology groups reported greater IU relative to controls. Across all participants, greater IU was associated with a reduced frontal EEG asymmetry. Furthermore, IU mediated the relationship between MDD and frontal EEG asymmetry and results remained significant after controlling for neuroticism, suggesting effects were not due to broad negative affectivity. MDD participants were limited to those with early-onset depression. Measures were collected cross-sectionally, precluding causal relationships. IU mediated the relationship between MDD and reduced reward anticipation, independent of neuroticism. Explanations are provided regarding how IU may contribute to reduced reward anticipation in depression. Overall, IU appears to be an important mechanism for the association between depression and reduced reward anticipation. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    Science.gov (United States)

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. The aim is to extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation, by considering diffusion based on expert beliefs with and without further research, conditional on the strength of evidence. We use expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform implementation dynamics. We illustrate use of the resulting dynamic expected value of research for a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
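
    The static quantity being extended here is the expected value of sample information. A standard way to write it, with NB(a, θ) the net benefit of adopting option a under parameters θ and X the data from the proposed study, is shown below, together with one schematic dynamic extension in which a per-period implementation (diffusion) level ρ_t and population P_t weight the realized benefit at discount rate r. The dynamic form is a plausible reading of the approach, not the paper's exact notation.

```latex
\mathrm{EVSI}
  \;=\; \mathbb{E}_{X}\!\Bigl[\,\max_{a}\; \mathbb{E}_{\theta \mid X}\,\mathrm{NB}(a,\theta)\Bigr]
  \;-\; \max_{a}\; \mathbb{E}_{\theta}\,\mathrm{NB}(a,\theta),
\qquad
\mathrm{EV}_{\mathrm{dyn}}
  \;=\; \sum_{t=1}^{T} \frac{P_{t}}{(1+r)^{t}}
        \Bigl(\rho_{t}^{\mathrm{research}}\,\mathrm{NB}^{\mathrm{post}}_{t}
            - \rho_{t}^{\mathrm{no\ research}}\,\mathrm{NB}^{\mathrm{pre}}_{t}\Bigr)
```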

  4. Reducing Multisensor Satellite Monthly Mean Aerosol Optical Depth Uncertainty: 1. Objective Assessment of Current AERONET Locations

    Science.gov (United States)

    Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki

    2016-01-01

    Various space-based sensors have been designed, and corresponding algorithms developed, to retrieve aerosol optical depth (AOD), the most basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., have high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from the Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are treated as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (which has been constrained by ground-based measurements) with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and can exceed 50% in certain places. The uncertainty reduction also shows distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for the Northern Hemisphere and biomass burning in the fall for the Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites can also represent moderate spatial information, whereas the representativeness of urban sites is relatively localized. A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty
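
    A toy version of the EnKF bookkeeping described above: build an ensemble of AOD values, assimilate one ground observation, and score the site by the fractional reduction from background to analysis error variance. The ensemble size matches the 605 members mentioned, but the covariances, observation, and grid are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# A 605-member AOD ensemble at three grid cells; one AERONET-like site
# observes cell 0. All covariances and values are invented.
ens = rng.multivariate_normal(
    mean=[0.20, 0.25, 0.15],
    cov=[[0.004, 0.003, 0.001],
         [0.003, 0.005, 0.001],
         [0.001, 0.001, 0.003]],
    size=605).T                                   # shape: (3 cells, 605 members)

H = np.array([[1.0, 0.0, 0.0]])                   # observation operator
obs, obs_var = 0.18, 0.0005

Pb = np.cov(ens)                                  # background error covariance
K = Pb @ H.T @ np.linalg.inv(H @ Pb @ H.T + obs_var)
perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), ens.shape[1])
analysis = ens + K @ (perturbed_obs[None, :] - H @ ens)

reduction = 1.0 - np.diag(np.cov(analysis)) / np.diag(Pb)
print("fractional variance reduction per cell:", reduction.round(2))
```

    The neighboring cells are constrained only through their background covariance with the observed cell, which is exactly the notion of spatial representativeness the study scores.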

  5. Regulatory risk assessments: Is there a need to reduce uncertainty and enhance robustness?

    Science.gov (United States)

    Snodin, D J

    2015-12-01

    A critical evaluation of several recent regulatory risk assessments has been undertaken. These relate to propyl paraben (as a food additive, cosmetic ingredient or pharmaceutical excipient), cobalt (in terms of a safety-based limit for pharmaceuticals) and the cancer Threshold of Toxicological Concern as applied to food contaminants and pharmaceutical impurities. In all cases, a number of concerns can be raised regarding the reliability of the current assessments, some examples being absence of data audits, use of single-dose and/or non-good laboratory practice studies to determine safety metrics, use of a biased data set and questionable methodology and lack of consistency with precedents and regulatory guidance. Drawing on these findings, a set of recommendations is provided to reduce uncertainty and improve the quality and robustness of future regulatory risk assessments. © The Author(s) 2015.

  6. Uncertainty and risk management after the Great Moderation: the role of risk (mis)management by financial institutions

    NARCIS (Netherlands)

    Blommestein, H.J.; Hoogduin, L.H.; Peeters, J.J.W.

    2009-01-01

    Since the early 1980s, the volatility of GDP and inflation has been declining steadily in many countries. Financial innovation has been identified as one of the key factors driving this 'Great Moderation'. Financial innovation was considered to have significantly improved the allocation and sharing of

  7. Predicting Statistical Response and Extreme Events in Uncertainty Quantification through Reduced-Order Models

    Science.gov (United States)

    Qi, D.; Majda, A.

    2017-12-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty in statistical sensitivity and intermittency in the principal model directions with the largest variability in high-dimensional turbulent systems and turbulent transport models. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve optimal model performance. The reduced-order method draws on a self-consistent mathematical framework for general systems with quadratic nonlinearity, where crucial high-order statistics are approximated by a systematic model calibration procedure. Model efficiency is improved through additional damping and noise corrections that replace the expensive energy-conserving nonlinear interactions. Model errors due to the imperfect nonlinear approximation are corrected by tuning the model parameters using linear response theory with an information metric in a training phase before prediction. A statistical energy principle is adopted to introduce a global scaling factor in characterizing the higher-order moments in a consistent way to improve model sensitivity. Stringent models of barotropic and baroclinic turbulence are used to demonstrate the feasibility of the reduced-order methods. Principal statistical responses in mean and variance can be captured by the reduced-order models with accuracy and efficiency. The reduced-order models are also used to capture the crucial passive tracer field that is advected by the baroclinic turbulent flow. It is demonstrated that crucial principal statistical quantities, like the tracer spectrum and the fat tails in the tracer probability density functions at the most important large scales, can be captured efficiently and accurately using the reduced-order tracer model in various dynamical regimes of the flow field with

  8. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data, as well as being able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region of space containing the parameter sets considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins was used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
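
    A schematic of the Monte Carlo screening step described above (in the spirit of GLUE-type analyses), using a toy three-parameter bucket model in place of WASMOD and Nash-Sutcliffe efficiency as the behavioral criterion; adding criteria at internal gauges would further shrink the retained region, which the paper delimits with alpha-shapes. All data and thresholds are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

def toy_model(params, precip):
    # Toy three-parameter bucket model standing in for WASMOD
    # (not the actual WASMOD equations).
    a, b, c = params
    store, q = 0.0, []
    for p in precip:
        store += p
        store -= a * store                  # evaporative loss
        flow = b * max(store - c, 0.0)      # threshold-triggered runoff
        store -= flow
        q.append(flow)
    return np.array(q)

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

precip = rng.gamma(2.0, 2.0, 200)
obs = toy_model((0.05, 0.30, 5.0), precip) + rng.normal(0.0, 0.2, 200)

# Monte Carlo exploration; retain "behavioral" parameter sets.
samples = rng.uniform([0.0, 0.0, 0.0], [0.2, 1.0, 10.0], size=(5000, 3))
behavioral = np.array([p for p in samples if nse(toy_model(p, precip), obs) > 0.7])
print(f"{len(behavioral)} of 5000 parameter sets retained as behavioral")
```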

  9. Do regional methods really help reduce uncertainties in flood frequency analyses?

    Science.gov (United States)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrate how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis valuing the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to generate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) to evaluate to what extent the results obtained in these case studies can be generalized. The two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the valuation of information on extreme events, either historical flood events at gauged

  10. High resolution remote sensing for reducing uncertainties in urban forest carbon offset life cycle assessments.

    Science.gov (United States)

    Tigges, Jan; Lakes, Tobia

    2017-10-04

    Urban forests reduce greenhouse gas emissions by storing and sequestering considerable amounts of carbon. However, few studies have considered the local scale of urban forests needed to effectively evaluate their potential long-term carbon offset. The lack of precise, consistent and up-to-date forest details is challenging for long-term prognoses. This review therefore aims to identify uncertainties in urban forest carbon offset assessment and to discuss the extent to which such uncertainties can be reduced by recent progress in high resolution remote sensing. We do this by performing an extensive literature review and a case study combining remote sensing and life cycle assessment of urban forest carbon offset in Berlin, Germany. Recent progress in high resolution remote sensing and methods is adequate for delivering more precise details on the urban tree canopy, individual tree metrics, species, and age structures compared to conventional land use/cover class approaches. These area-wide, consistent details can update life cycle inventories for more precise future prognoses. Additional improvements in classification accuracy can be achieved by a higher number of features derived from remote sensing data of increasing resolution, but early studies on this subject indicated that a smart selection of features already provides sufficient data, avoids redundancies, and enables more efficient data processing. Our case study from Berlin could use remotely sensed individual tree species as a consistent inventory for a life cycle assessment. However, a lack of growth, mortality and planting data forced us to make assumptions, thereby creating uncertainty in the long-term prognoses. Regarding temporal changes and reliable long-term estimates, more attention is required to detect gradual changes in growth and pruning as well as abrupt changes in tree planting and mortality. As such, precise long-term urban ecological monitoring using high resolution remote sensing should be intensified

  11. Reducing consistency in human realism increases the uncanny valley effect; increasing category uncertainty does not.

    Science.gov (United States)

    MacDorman, Karl F; Chattopadhyay, Debaleena

    2016-01-01

    Human replicas may elicit unintended cold, eerie feelings in viewers, an effect known as the uncanny valley. Masahiro Mori, who proposed the effect in 1970, attributed it to inconsistencies in the replica's realism, with some of its features perceived as human and others as nonhuman. This study aims to determine whether reducing realism consistency in visual features increases the uncanny valley effect. In three rounds of experiments, 548 participants categorized and rated humans, animals, and objects that varied from computer animated to real. Two sets of features were manipulated to reduce realism consistency. (For humans, the sets were eyes-eyelashes-mouth and skin-nose-eyebrows.) Reducing realism consistency caused humans and animals, but not objects, to appear eerier and colder. However, the predictions of a competing theory, proposed by Ernst Jentsch in 1906, were not supported: the most ambiguous representations, those eliciting the greatest category uncertainty, were neither the eeriest nor the coldest. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  12. How uncertainty analysis of streamflow data can reduce costs and promote robust decisions in water management applications

    Science.gov (United States)

    McMillan, Hilary; Seibert, Jan; Petersen-Overleir, Asgeir; Lang, Michel; White, Paul; Snelder, Ton; Rutherford, Kit; Krueger, Tobias; Mason, Robert; Kiang, Julie

    2017-07-01

    Streamflow data are used for important environmental and economic decisions, such as specifying and regulating minimum flows, managing water supplies, and planning for flood hazards. Despite significant uncertainty in most flow data, the flow series for these applications are often communicated and used without uncertainty information. In this commentary, we argue that proper analysis of uncertainty in river flow data can reduce costs and promote robust conclusions in water management applications. We substantiate our argument by providing case studies from Norway and New Zealand where streamflow uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of a controversial water management policy, and tested the accuracy of water quality trends. We discuss the need for practical uncertainty assessment tools that generate multiple flow series realizations rather than simple error bounds. Although examples of such tools are in development, considerable barriers for uncertainty analysis and communication still exist for practitioners, and future research must aim to provide easier access and usability of uncertainty estimates. We conclude that flow uncertainty analysis is critical for good water management decisions.

  13. Application of stochastic programming to reduce uncertainty in quality-based supply planning of slaughterhouses

    NARCIS (Netherlands)

    Rijpkema, W.A.; Hendrix, E.M.T.; Rossi, R.; Vorst, van der J.G.A.J.

    2016-01-01

    To match products of different quality with end market preferences under supply uncertainty, it is crucial to integrate product quality information in logistics decision making. We present a case of this integration in a meat processing company that faces uncertainty in delivered livestock quality.
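    The abstract names stochastic programming as the technique; a minimal two-stage sketch of the idea, with entirely hypothetical quality scenarios, demand, and costs (the paper's actual model is richer), is to procure livestock before quality is known and then buy expensive spot-market product in scenarios where the quality yield falls short:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Scenarios: fraction of delivered livestock meeting the premium-quality spec
    yields = np.array([0.9, 0.7, 0.5])   # hypothetical quality outcomes
    probs = np.array([0.3, 0.5, 0.2])    # scenario probabilities

    demand = 100.0   # units of premium product required
    c_buy = 1.0      # first-stage cost per unit of livestock procured
    c_spot = 3.0     # recourse cost per unit bought on the spot market

    # Decision vector v = [x, z1, z2, z3]: x = procurement, z_s = spot purchase
    c = np.concatenate(([c_buy], c_spot * probs))   # expected total cost
    # Constraint per scenario: y_s * x + z_s >= demand  ->  -y_s*x - z_s <= -demand
    A_ub = np.hstack((-yields.reshape(-1, 1), -np.eye(3)))
    b_ub = -demand * np.ones(3)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
    print(res.x)    # optimal procurement and per-scenario recourse quantities
    print(res.fun)  # minimized expected cost
    ```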

  14. Uncertainty relations and reduced density matrices: Mapping many-body quantum mechanics onto four particles

    Science.gov (United States)

    Mazziotti, David A.; Erdahl, Robert M.

    2001-04-01

    For the description of ground-state correlation phenomena an accurate mapping of many-body quantum mechanics onto four particles is developed. The energy for a quantum system with no more than two-particle interactions may be expressed in terms of a two-particle reduced density matrix (2-RDM), but variational optimization of the 2-RDM requires that it corresponds to an N-particle wave function. We derive N-representability conditions on the 2-RDM that guarantee the validity of the uncertainty relations for all operators with two-particle interactions. One of these conditions is shown to be necessary and sufficient to make the RDM solutions of the dispersion condition equivalent to those from the contracted Schrödinger equation (CSE) [Mazziotti, Phys. Rev. A 57, 4219 (1998)]. In general, the CSE is a stronger N-representability condition than the dispersion condition because the CSE implies the dispersion condition as well as additional N-representability constraints from the Hellmann-Feynman theorem. Energy minimization subject to the representability constraints is performed for a boson model with 10, 30, and 75 particles. Even when traditional wave-function methods fail at large perturbations, the present method yields correlation energies within 2%.
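    For readers unfamiliar with RDM notation, the standard relations behind the abstract can be summarized as follows (a sketch using common conventions; the paper's precise normalizations may differ):

    ```latex
    % Standard RDM-theory relations behind the abstract (notation is a common
    % convention and may differ from the paper's):
    \begin{align}
      E &= \operatorname{Tr}\left( {}^{2}K \, {}^{2}D \right),
      \qquad
      {}^{2}D^{ij}_{kl}
        = \tfrac{1}{2}\,\langle \Psi \,|\, a_i^{\dagger} a_j^{\dagger} a_l a_k \,|\, \Psi \rangle , \\
      0 &\le \langle \Psi \,|\, (\hat{A} - \langle \hat{A} \rangle)^{2} \,|\, \Psi \rangle
      \quad \text{for every two-particle operator } \hat{A}.
    \end{align}
    % The first line writes the energy of a system with at most two-particle
    % interactions as a linear functional of the 2-RDM; the second is the
    % dispersion (uncertainty) condition whose RDM form supplies the
    % N-representability constraints discussed above.
    ```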

  15. SunShot solar power reduces costs and uncertainty in future low-carbon electricity systems.

    Science.gov (United States)

    Mileva, Ana; Nelson, James H; Johnston, Josiah; Kammen, Daniel M

    2013-08-20

    The United States Department of Energy's SunShot Initiative has set cost-reduction targets of $1/watt for central-station solar technologies. We use SWITCH, a high-resolution electricity system planning model, to study the implications of achieving these targets for technology deployment and electricity costs in western North America, focusing on scenarios limiting carbon emissions to 80% below 1990 levels by 2050. We find that achieving the SunShot target for solar photovoltaics would allow this technology to provide more than a third of electric power in the region, displacing natural gas in the medium term and reducing the need for nuclear and carbon capture and sequestration (CCS) technologies, which face technological and cost uncertainties, by 2050. We demonstrate that a diverse portfolio of technological options can help integrate high levels of solar generation successfully and cost-effectively. The deployment of GW-scale storage plays a central role in facilitating solar deployment and the availability of flexible loads could increase the solar penetration level further. In the scenarios investigated, achieving the SunShot target can substantially mitigate the cost of implementing a carbon cap, decreasing power costs by up to 14% and saving up to $20 billion ($2010) annually by 2050 relative to scenarios with Reference solar costs.

  16. Tailored complex degree of mutual coherence for plane-of-interest interferometry with reduced measurement uncertainty

    Science.gov (United States)

    Fütterer, G.

    2017-10-01

    A problem of interferometers is the elimination of parasitic reflections. Parasitic reflections and modulated intensity signals, which are not related to the reference surface (REF) or the surface under test (SUT) in a direct way, can increase the measurement uncertainty significantly. In some situations standard methods might be used in order to eliminate reflections from the backside of the optical element under test. For instance, matching the test object to an absorber, while taking the complex refractive index into account, can cancel out back reflections completely. This causes additional setup time and chemical contamination. In some situations an angular offset might be combined with an aperture stop. This reduces spatial resolution, and it does not work if the disturbing wave field propagates in the same direction as the wave field which propagates from the SUT. However, a stack of surfaces is a problem. An increased spectral bandwidth might be used in order to obtain a separation of the plane-of-interest from other planes. Depending on the interferometer used, this might require an optical path difference of zero or it might cause a reduction of the visibility. Finally, the embodiment of a modified interferometer will be discussed.

  17. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    Science.gov (United States)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge where the goal is to obtain statistical estimates for key physical quantities. In the development of a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are always unavoidable due to both the imperfect understanding of the underlying physical processes and the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced order models that can recover the crucial features both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework to construct statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, which are built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, are designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. The effects in complicated flow systems are considered including strong nonlinear and non-Gaussian interactions, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features in the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. Most importantly, empirical information theory and statistical linear response theory are...

  18. Forecasting the Number of Soil Samples Required to Reduce Remediation Cost Uncertainty

    OpenAIRE

    Demougeot-Renard, Hélène; de Fouquet, Chantal; Renard, Philippe

    2008-01-01

    Sampling scheme design is an important step in the management of polluted sites. It largely controls the accuracy of remediation cost estimates. In practice, however, sampling is seldom designed to comply with a given level of remediation cost uncertainty. In this paper, we present a new technique that allows one to estimate the number of samples that should be taken at a given stage of investigation to reach a forecasted level of accuracy. The uncertainty is expressed both in terms of vol...
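    The full method relies on geostatistical conditional simulation, but the elementary sampling-theory idea behind forecasting a sample count for a target accuracy can be sketched as follows (all numbers hypothetical, and much simpler than the paper's approach):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Pilot-stage remediation cost estimates from an initial soil campaign
    # (hypothetical values, in k$ per block); the paper itself uses geostatistical
    # conditional simulation, so this shows only the elementary statistical idea.
    pilot_costs = rng.lognormal(mean=3.0, sigma=0.6, size=25)

    target_rel_error = 0.10   # desired relative accuracy of the mean cost
    z = 1.96                  # ~95% confidence

    cv = pilot_costs.std(ddof=1) / pilot_costs.mean()  # coefficient of variation
    n_required = int(np.ceil((z * cv / target_rel_error) ** 2))
    print(n_required)  # forecasted number of samples for the next stage
    ```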

  19. Reducing Dose Uncertainty for Spot-Scanning Proton Beam Therapy of Moving Tumors by Optimizing the Spot Delivery Sequence

    International Nuclear Information System (INIS)

    Li, Heng; Zhu, X. Ronald; Zhang, Xiaodong

    2015-01-01

    Purpose: To develop and validate a novel delivery strategy for reducing the respiratory motion–induced dose uncertainty of spot-scanning proton therapy. Methods and Materials: The spot delivery sequence was optimized to reduce dose uncertainty. The effectiveness of the delivery sequence optimization was evaluated using measurements and patient simulation. One hundred ninety-one 2-dimensional measurements using different delivery sequences of a single-layer uniform pattern were obtained with a detector array on a 1-dimensional moving platform. Intensity modulated proton therapy plans were generated for 10 lung cancer patients, and dose uncertainties for different delivery sequences were evaluated by simulation. Results: Without delivery sequence optimization, the maximum absolute dose error can be up to 97.2% in a single measurement, whereas the optimized delivery sequence results in a maximum absolute dose error of ≤11.8%. In patient simulation, the optimized delivery sequence reduces the mean of fractional maximum absolute dose error compared with the regular delivery sequence by 3.3% to 10.6% (32.5-68.0% relative reduction) for different patients. Conclusions: Optimizing the delivery sequence can reduce dose uncertainty due to respiratory motion in spot-scanning proton therapy, assuming the 4-dimensional CT is a true representation of the patients' breathing patterns.
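    The paper's actual optimizer is not reproduced here, but a simple interleaved (rescanning-style) delivery order conveys the intuition: spreading spatially adjacent spots across the breathing cycle reduces the chance that motion systematically distorts the dose in one region. A hypothetical sketch:

    ```python
    def interleaved_sequence(n_spots: int, n_passes: int = 4) -> list[int]:
        """Deliver every n_passes-th spot per sweep so that spatially adjacent
        spots are spread across different phases of the breathing cycle.
        A simple rescanning-style heuristic, not the authors' optimizer."""
        order = []
        for offset in range(n_passes):
            order.extend(range(offset, n_spots, n_passes))
        return order

    # 12 spots along a scan line, delivered in 4 interleaved passes
    print(interleaved_sequence(12))
    # [0, 4, 8, 1, 5, 9, 2, 6, 10, 3, 7, 11]
    ```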

  20. Reduced dose uncertainty in MRI-based polymer gel dosimetry using parallel RF transmission with multiple RF sources

    International Nuclear Information System (INIS)

    Sang-Young Kim; Jung-Hoon Lee; Jin-Young Jung; Do-Wan Lee; Seu-Ran Lee; Bo-Young Choe; Hyeon-Man Baek; Korea University of Science and Technology, Daejeon; Dae-Hyun Kim; Jung-Whan Min; Ji-Yeon Park

    2014-01-01

    In this work, we present the feasibility of using a parallel RF transmission method with multiple RF sources (MultiTransmit imaging) in polymer gel dosimetry. Image quality and B1 field homogeneity were statistically better with the MultiTransmit imaging method than with the conventional single-source RF transmission imaging method. In particular, the standard uncertainty of R2 was lower on the MultiTransmit images than on the conventional images. Furthermore, the MultiTransmit measurement showed improved dose resolution. Improved image quality and B1 homogeneity result in reduced dose uncertainty, thereby suggesting the feasibility of MultiTransmit MR imaging in gel dosimetry. (author)

  1. Use of Paired Simple and Complex Models to Reduce Predictive Bias and Quantify Uncertainty

    DEFF Research Database (Denmark)

    Doherty, John; Christensen, Steen

    2011-01-01

    -constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology...... of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration...... that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights...

  2. Derivation of RCM-driven potential evapotranspiration for hydrological climate change impact analysis in Great Britain: a comparison of methods and associated uncertainty in future projections

    Directory of Open Access Journals (Sweden)

    C. Prudhomme

    2013-04-01

    the range of uncertainty defined by the ensemble of 12 PET equations. The changes show a clear northwest–southeast gradient of PET increase, with the largest (smallest) changes in the northwest in January (July and October, respectively). However, the range in magnitude of PET changes due to the choice of PET method shown in this study for Great Britain suggests that PET uncertainty is a challenge facing the assessment of climate change impact on hydrology, mostly ignored up to now.

  3. CLIMB - Climate induced changes on the hydrology of mediterranean basins - Reducing uncertainties and quantifying risk

    Science.gov (United States)

    Ludwig, Ralf

    2010-05-01

    According to future climate projections, Mediterranean countries are at high risk for an even pronounced susceptibility to changes in the hydrological budget and extremes. These changes are expected to have severe direct impacts on the management of water resources. Threats include severe droughts and extreme flooding, salinization of coastal aquifers, degradation of fertile soils and desertification due to poor and unsustainable water management practices. It can be foreseen that, unless appropriate adaptation measures are undertaken, the changes in the hydrologic cycle will give rise to an increasing potential for tension and conflict among the political and economic actors in this vulnerable region. The presented project initiative CLIMB, funded under EC's 7th Framework Program (FP7-ENV-2009-1), has started in January 2010. In its 4-year design, it shall analyze ongoing and future climate induced changes in hydrological budgets and extremes across the Mediterranean and neighboring regions. This is undertaken in study sites located in Sardinia, Northern Italy, Southern France, Tunisia, Egypt and the Palestinian-administered area Gaza. The work plan is targeted to selected river or aquifer catchments, where the consortium will employ a combination of novel field monitoring and remote sensing concepts, data assimilation, integrated hydrologic (and biophysical) modeling and socioeconomic factor analyses to reduce existing uncertainties in climate change impact analysis. Advanced climate scenario analysis will be employed, and available ensembles of regional climate model simulations will be downscaled. This process will provide the drivers for an ensemble of hydro(-geo)logical models with different degrees of complexity in terms of process description and level of integration. The results of hydrological modeling and socio-economic factor analysis will enable the development of a GIS-based Vulnerability and Risk Assessment Tool. This tool will serve as a platform...

  4. Improving Chemical EOR Simulations and Reducing the Subsurface Uncertainty Using Downscaling Conditioned to Tracer Data

    KAUST Repository

    Torrealba, Victor A.

    2017-10-02

    distributions as soft data. The method honors the fluid material balance and geological features from the coarse model. A workflow is outlined to address uncertainties in geological properties that can be reduced by integrating dynamic data such as sweep efficiency from interwell tracers. We provide several test cases and demonstrate the applicability of the proposed method to improve the history-match of a chemical EOR pilot. Further, we evaluate the fitness of different heterogeneity measures for grid-ranking of CEOR processes.

  5. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data is, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge for reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to be estimating values and spatial variation in hydrologic parameters (i.e. hydraulic conductivity) as well as mapping lower permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the ground-water availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was...
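    A minimal numerical sketch of the FOSM data worth calculation described here, with random stand-in Jacobians (the real analysis derives them from the MERAS model and its pilot points):

    ```python
    import numpy as np

    def forecast_variance(J, C_p, C_e, y):
        """FOSM posterior forecast variance.
        J: Jacobian of observations w.r.t. parameters (n_obs x n_par)
        C_p: prior parameter covariance; C_e: observation-noise covariance
        y: sensitivity of the forecast to the parameters (n_par,)"""
        # Posterior parameter covariance (Schur complement form)
        G = C_p @ J.T @ np.linalg.inv(J @ C_p @ J.T + C_e)
        C_post = C_p - G @ J @ C_p
        return y @ C_post @ y

    rng = np.random.default_rng(1)
    n_par = 20
    C_p = np.eye(n_par)          # prior uncertainty of K/recharge pilot points
    y = rng.normal(size=n_par)   # forecast sensitivities (illustrative)

    J_wells = rng.normal(size=(10, n_par))  # existing well observations
    J_aem = rng.normal(size=(30, n_par))    # candidate AEM flight line
    base = forecast_variance(J_wells, C_p, 0.1 * np.eye(10), y)
    with_aem = forecast_variance(np.vstack([J_wells, J_aem]), C_p,
                                 0.1 * np.eye(40), y)
    # Data worth of the flight line = fractional reduction in forecast variance
    print(1.0 - with_aem / base)
    ```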

  6. Reducing uncertainty of Monte Carlo estimated fatigue damage in offshore wind turbines using FORM

    DEFF Research Database (Denmark)

    H. Horn, Jan-Tore; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue...

  7. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
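    As a toy illustration of fitting a reduced PCE, the sketch below regresses a stand-in model response onto a hand-picked subset of Hermite basis terms and screens out insignificant coefficients by magnitude; the paper instead selects terms via fractional factorial ANOVA:

    ```python
    import numpy as np
    from numpy.polynomial.hermite_e import hermeval

    rng = np.random.default_rng(7)

    def model(x1, x2):
        # Stand-in for a hydrologic model response (illustrative only)
        return 1.0 + 0.8 * x1 + 0.3 * x2 + 0.5 * x1 * x2

    # Collocation sample of two standardized (Gaussian) parameters
    X = rng.normal(size=(200, 2))
    y = model(X[:, 0], X[:, 1])

    # Candidate PCE basis: products of probabilists' Hermite polynomials,
    # indexed by degree pairs; a full PCE would keep all terms up to a degree
    terms = [(0, 0), (1, 0), (0, 1), (2, 0), (0, 2), (1, 1)]
    Phi = np.column_stack([
        hermeval(X[:, 0], np.eye(d1 + 1)[d1]) * hermeval(X[:, 1], np.eye(d2 + 1)[d2])
        for d1, d2 in terms
    ])

    coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    # Keep only significant terms (a simple magnitude screen here)
    keep = np.abs(coef) > 0.05
    print([t for t, k in zip(terms, keep) if k], coef[keep].round(2))
    ```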

  8. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    Science.gov (United States)

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  9. The impact of reducing car weight on global emissions: the future fleet in Great Britain

    Science.gov (United States)

    Serrenho, André Cabrera; Norman, Jonathan B.; Allwood, Julian M.

    2017-05-01

    Current European policies define targets for future direct emissions of new car sales that foster a fast transition to electric drivetrain technologies. However, these targets do not consider the emissions produced in electricity generation and material production, and therefore fail to incentivise car manufacturers to consider the benefits of vehicle weight reduction. In this paper, we examine the potential benefits of limiting the average weight and altering the material composition of new cars in terms of global greenhouse gas emissions produced during the use phase, electricity generation and material production. We anticipate the emissions savings for the future car fleet in Great Britain until 2050 for various alternative futures, using a dynamic material flow analysis of ferrous metals and aluminium, and considering an evolving demand for car use. The results suggest that fostering vehicle weight reduction could produce greater cumulative emissions savings by 2050 than those obtained by incentivising a fast transition to electric drivetrains, unless there is an extreme decarbonization of the electricity grid. Savings promoted by weight reduction are immediate and do not depend on the pace of decarbonization of the electricity grid. Weight reduction may produce the greatest savings when mild steel in the car body is replaced with high-strength steel. This article is part of the themed issue 'Material demand reduction'.

  10. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    DEFF Research Database (Denmark)

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings...... of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where central documents in shipping, such as the Bill of Lading, are turned into a smart contract on blockchain. Based...... on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate the transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel...

  11. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    OpenAIRE

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman; Palmund, Søren

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, where cen...

  12. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    Science.gov (United States)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process as well as minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending is vital to assess rotor blade and turbine reliability in the 20 year designed life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and defined the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images. The last part of the research quantified the uncertainty in the field conditions and the image processing methods.
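    A classical image-processing pipeline of the kind such inspection systems build on can be sketched with OpenCV; the thresholds and aspect-ratio screen below are illustrative placeholders, not the dissertation's self-learning classifier:

    ```python
    import cv2

    def detect_cracks(image_path: str, min_aspect: float = 4.0):
        """Flag thin, elongated features as crack candidates.
        A minimal illustration of the image-processing idea, not the
        dissertation's self-learning classifier."""
        gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
        blur = cv2.GaussianBlur(gray, (5, 5), 0)
        edges = cv2.Canny(blur, 50, 150)
        # Close small gaps so a hairline crack forms one connected contour
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        closed = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, kernel)
        contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cracks = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            aspect = max(w, h) / max(1, min(w, h))
            if aspect >= min_aspect:      # long and thin -> crack-like
                cracks.append((x, y, w, h))
        return cracks
    ```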

  13. A programme of research to set priorities and reduce uncertainties for the prevention and treatment of skin disease

    OpenAIRE

    Thomas, K. S.; Batchelor, J. M.; Bath-Hextall, F.; Chalmers, J. R.; Clarke, T.; Crowe, S.; Delamere, F. M.; Eleftheriadou, V.; Evans, N.; Firkins, L.; Greenlaw, N.; Lansbury, L.; Lawton, S.; Layfield, C.; Leonardi-Bee, J.

    2016-01-01

    BACKGROUND: Skin diseases are very common and can have a large impact on the quality of life of patients and caregivers. This programme addressed four diseases: (1) eczema, (2) vitiligo, (3) squamous cell skin cancer (SCC) and (4) pyoderma gangrenosum (PG). OBJECTIVE: To set priorities and reduce uncertainties for the treatment and prevention of skin disease in our four chosen diseases. DESIGN: Mixed methods including eight systematic reviews, three prioritisation exercises, tw...

  14. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses

    Science.gov (United States)

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-07-01

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P-E). Here, we compute P-E directly from the atmospheric budget with divergence of moisture flux for different reanalyses and find improved correlation with observed values of P-E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P-E value derived with the atmospheric budget is more consistent with the energy budget, when we use top-of-atmosphere radiation for the same. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System, and water storage from the Gravity Recovery and Climate Experiment. We find improvements in agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P-E, and hence emphasize using it for estimates of water availability in South Asia to reduce uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making.
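    The atmospheric budget referred to here is the standard vertically integrated moisture balance:

    ```latex
    % Vertically integrated atmospheric moisture budget (standard form), which
    % yields P - E from the divergence of moisture flux instead of differencing
    % the two noisier individual fields:
    \begin{equation}
      P - E = -\nabla \cdot \mathbf{Q} - \frac{\partial W}{\partial t},
      \qquad
      \mathbf{Q} = \frac{1}{g} \int_{0}^{p_s} q\,\mathbf{v}\,\mathrm{d}p,
      \qquad
      W = \frac{1}{g} \int_{0}^{p_s} q\,\mathrm{d}p,
    \end{equation}
    % where q is specific humidity, v the horizontal wind, p_s surface pressure,
    % W precipitable water, and Q the vertically integrated moisture flux.
    ```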

  15. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses.

    Science.gov (United States)

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-07-08

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P-E). Here, we compute P-E directly from the atmospheric budget with divergence of moisture flux for different reanalyses and find improved correlation with observed values of P-E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P-E value derived with the atmospheric budget is more consistent with the energy budget, when we use top-of-atmosphere radiation for the same. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System, and water storage from the Gravity Recovery and Climate Experiment. We find improvements in agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P-E, and hence emphasize using it for estimates of water availability in South Asia to reduce uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making.

  16. Gallium ion implantation greatly reduces thermal conductivity and enhances electronic one of ZnO nanowires

    Directory of Open Access Journals (Sweden)

    Minggang Xia

    2014-05-01

    Full Text Available The electrical and thermal conductivities are measured for individual zinc oxide (ZnO) nanowires with and without gallium ion (Ga+) implantation at room temperature. Our results show that Ga+ implantation enhances electrical conductivity by one order of magnitude, from 1.01 × 10^3 Ω^-1m^-1 to 1.46 × 10^4 Ω^-1m^-1, and reduces thermal conductivity by one order of magnitude, from 12.7 Wm^-1K^-1 to 1.22 Wm^-1K^-1, for ZnO nanowires of 100 nm in diameter. The measured thermal conductivities are in good agreement with theoretical simulation. The increase of electrical conductivity originates from electron donor doping by Ga+ implantation, and the decrease of thermal conductivity is due to scattering of longitudinal and transverse acoustic phonons by Ga+ point defects. For pristine ZnO nanowires, the thermal conductivity decreases only by a factor of two when the diameter is reduced from 100 nm to 46 nm. Therefore, Ga+ implantation may be a more effective method than diameter reduction for improving thermoelectric performance.
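    The thermoelectric relevance of these two order-of-magnitude changes follows from the standard figure of merit:

    ```latex
    % Standard dimensionless thermoelectric figure of merit:
    \begin{equation}
      ZT = \frac{S^{2} \sigma T}{\kappa}
    \end{equation}
    % with Seebeck coefficient S, electrical conductivity sigma, absolute
    % temperature T, and thermal conductivity kappa. Taking the reported values
    % at face value, sigma rising ~14x while kappa falls ~10x would raise ZT by
    % roughly two orders of magnitude, assuming S is not strongly degraded by
    % the doping.
    ```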

  17. Artificial Light at Night Reduces Daily Energy Expenditure in Breeding Great Tits (Parus major

    Directory of Open Access Journals (Sweden)

    Anouk A. M. H. Welbers

    2017-05-01

    Full Text Available The ecological impact of artificial light at night (ALAN) is an increasingly recognized process that accompanies expanding urbanization. Yet, we have limited knowledge on the impact of ALAN on wild species, and on the potential to mitigate any negative effects by using different light sources and colors. In birds, effects of ALAN on activity levels are reported for several species and, hence, their daily energy expenditure (DEE) may be affected. DEE is a potent mediator of life-history trade-offs and fitness and thus an important aspect to consider when examining the potential long-term ecological effects of ALAN. Previous work has suggested that birds exposed to ALAN show higher levels of provisioning and nocturnal activity, suggesting that white ALAN increases DEE. Other factors regulating DEE, such as provisioning behavior and food availability, might also respond to ALAN and thus indirectly affect DEE. We tested the hypothesis that ALAN increases DEE using an experimental setup where four previously unlit transects were illuminated with either white, green, or red LED light, or left dark as a control treatment. This setup was replicated in eight locations across the Netherlands. We measured DEE of our focal species, the great tit (Parus major), using a novel doubly labeled water technique that uses breath rather than blood samples. Contrary to our expectations, birds feeding their offspring under white and green ALAN showed lower DEE compared to birds in the control dark treatment. Differences in chick provisioning activity did not explain this result, as neither visit rates nor daily activity timing was affected by light treatment. However, food availability under white and green light was much higher compared to red light and the dark control. This difference strongly suggests that the lower DEE under white and green ALAN sites is a consequence of higher food availability in these treatments. This result shows that there can be positive...

  18. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    Science.gov (United States)

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to...

  19. The uncertainty of crop yield projections is reduced by improved temperature response functions

    DEFF Research Database (Denmark)

    Wang, Enli; Martre, Pierre; Zhao, Zhigan

    2017-01-01

    , we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature......Quality) and analysing their results against the HSC data and an additional global dataset from the International Heat Stress Genotype Experiment (IHSGE) carried out by the International Maize and Wheat Improvement Center (CIMMYT). More importantly, we derive, based on newest knowledge and data, a set of new...

  20. Uncertainties in radioecological assessment models-Their nature and approaches to reduce them

    International Nuclear Information System (INIS)

    Kirchner, G.; Steiner, M.

    2008-01-01

    Radioecological assessment models are necessary tools for estimating the radiation exposure of humans and non-human biota. This paper focuses on factors affecting their predictive accuracy, discusses the origin and nature of the different contributions to uncertainty and variability and presents approaches to separate and quantify them. The key role of the conceptual model, notably in relation to its structure and complexity, as well as the influence of the number and type of input parameters, are highlighted. Guidelines are provided to improve the degree of reliability of radioecological models

  1. Genetic Correlations Greatly Increase Mutational Robustness and Can Both Reduce and Enhance Evolvability.

    Directory of Open Access Journals (Sweden)

    Sam F Greenbury

    2016-03-01

    Full Text Available Mutational neighbourhoods in genotype-phenotype (GP) maps are widely believed to be more likely to share characteristics than expected from random chance. Such genetic correlations should strongly influence evolutionary dynamics. We explore and quantify these intuitions by comparing three GP maps—a model for RNA secondary structure, the HP model for protein tertiary structure, and the Polyomino model for protein quaternary structure—to a simple random null model that maintains the number of genotypes mapping to each phenotype, but assigns genotypes randomly. The mutational neighbourhood of a genotype in these GP maps is much more likely to contain genotypes mapping to the same phenotype than in the random null model. Such neutral correlations can be quantified by the robustness to mutations, which can be many orders of magnitude larger than that of the null model, and crucially, above the critical threshold for the formation of large neutral networks of mutationally connected genotypes which enhance the capacity for the exploration of phenotypic novelty. Thus neutral correlations increase evolvability. We also study non-neutral correlations: Compared to the null model, i) If a particular (non-neutral) phenotype is found once in the 1-mutation neighbourhood of a genotype, then the chance of finding that phenotype multiple times in this neighbourhood is larger than expected; ii) If two genotypes are connected by a single neutral mutation, then their respective non-neutral 1-mutation neighbourhoods are more likely to be similar; iii) If a genotype maps to a folding or self-assembling phenotype, then its non-neutral neighbours are less likely to be a potentially deleterious non-folding or non-assembling phenotype. Non-neutral correlations of type i) and ii) reduce the rate at which new phenotypes can be found by neutral exploration, and so may diminish evolvability, while non-neutral correlations of type iii) may instead facilitate evolutionary exploration...

  2. Genetic Correlations Greatly Increase Mutational Robustness and Can Both Reduce and Enhance Evolvability

    Science.gov (United States)

    Greenbury, Sam F.; Schaper, Steffen; Ahnert, Sebastian E.; Louis, Ard A.

    2016-01-01

    Mutational neighbourhoods in genotype-phenotype (GP) maps are widely believed to be more likely to share characteristics than expected from random chance. Such genetic correlations should strongly influence evolutionary dynamics. We explore and quantify these intuitions by comparing three GP maps—a model for RNA secondary structure, the HP model for protein tertiary structure, and the Polyomino model for protein quaternary structure—to a simple random null model that maintains the number of genotypes mapping to each phenotype, but assigns genotypes randomly. The mutational neighbourhood of a genotype in these GP maps is much more likely to contain genotypes mapping to the same phenotype than in the random null model. Such neutral correlations can be quantified by the robustness to mutations, which can be many orders of magnitude larger than that of the null model, and crucially, above the critical threshold for the formation of large neutral networks of mutationally connected genotypes which enhance the capacity for the exploration of phenotypic novelty. Thus neutral correlations increase evolvability. We also study non-neutral correlations: Compared to the null model, i) If a particular (non-neutral) phenotype is found once in the 1-mutation neighbourhood of a genotype, then the chance of finding that phenotype multiple times in this neighbourhood is larger than expected; ii) If two genotypes are connected by a single neutral mutation, then their respective non-neutral 1-mutation neighbourhoods are more likely to be similar; iii) If a genotype maps to a folding or self-assembling phenotype, then its non-neutral neighbours are less likely to be a potentially deleterious non-folding or non-assembling phenotype. Non-neutral correlations of type i) and ii) reduce the rate at which new phenotypes can be found by neutral exploration, and so may diminish evolvability, while non-neutral correlations of type iii) may instead facilitate evolutionary exploration and so...

  3. Reducing uncertainty in sustainable interpersonal service relationships: the role of aesthetics.

    Science.gov (United States)

    Xenakis, Ioannis

    2018-05-01

    Sustainable interpersonal service relationships (SISRs) are the outcome of a design process that supports situated meaningful interactions between those being served and those in service. Service design is not directed simply at satisfying the ability to perceive the psychological state of others; more importantly, it should aim at preserving these relationships in relation to the contextual requirements they functionally need in order to be, or remain, sustainable. However, SISRs are uncertain, since the constructed, situated meanings may ultimately prove unsuccessful for the anticipations and goals of the people engaged in a SISR. The endeavor of this paper is to show that aesthetic behavior plays a crucial role in the reduction of the uncertainty that characterizes such relationships. Aesthetic behavior, as an organized network of affective and cognitive processes, has an anticipatory evaluative function with a strong influence on perception, providing significance and value for those aspects of SISRs that are likely to serve goals corresponding to sustainability challenges. Thus, aesthetic behavior plays an important role in the construction of meanings that are related to both the empathic and contextual aspects that constitute the entire situation in which a SISR takes place. Aesthetic behavior has a strong influence on meaning-making, motivating the selection of actions that contribute to our initial goal of interacting with uncertainty: to make the world a bit less puzzling and, thus, to improve our lives, or in other words, to design.

  4. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries

    Science.gov (United States)

    Sutton, Abigail M.; Rudd, Murray A.

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  5. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    Science.gov (United States)

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  6. Health technology assessment and primary data collection for reducing uncertainty in decision making.

    Science.gov (United States)

    Goeree, Ron; Levin, Les; Chandra, Kiran; Bowen, James M; Blackhouse, Gord; Tarride, Jean-Eric; Burke, Natasha; Bischof, Matthias; Xie, Feng; O'Reilly, Daria

    2009-05-01

    Health care expenditures continue to escalate, and pressures for increased spending will continue. Health care decision makers from publicly financed systems, private insurance companies, or even from individual health care institutions, will continue to be faced with making difficult purchasing, access, and reimbursement decisions. As a result, decision makers are increasingly turning to evidence-based platforms to help control costs and make the most efficient use of existing resources. Most tools used to assist with evidence-based decision making focus on clinical outcomes. Health technology assessment (HTA) is increasing in popularity because it also considers other factors important for decision making, such as cost, social and ethical values, legal issues, and factors such as the feasibility of implementation. In some jurisdictions, HTAs have also been supplemented with primary data collection to help address uncertainty that may still exist after conducting a traditional HTA. The HTA process adopted in Ontario, Canada, is unique in that assessments are also made to determine what primary data research should be conducted and what should be collected in these studies. In this article, concerns with the traditional HTA process are discussed, followed by a description of the HTA process that has been established in Ontario, with a particular focus on the data collection program followed by the Programs for Assessment of Technology in Health Research Institute. An illustrative example is used to show how the Ontario HTA process works and the role value of information analyses plays in addressing decision uncertainty, determining research feasibility, and determining study data collection needs.
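    The value of information analyses mentioned here rest on a simple quantity, the expected value of perfect information (EVPI); a minimal Monte Carlo sketch with hypothetical net-benefit distributions:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical net-benefit samples (per patient) for two strategies, drawn
    # from distributions representing current parameter uncertainty
    nb = np.column_stack([
        rng.normal(10_000, 2_000, 5_000),   # standard care
        rng.normal(10_500, 3_000, 5_000),   # new technology
    ])

    # EVPI = E[max over strategies] - max over strategies of E[net benefit]:
    # the expected gain from resolving all uncertainty before deciding
    evpi = nb.max(axis=1).mean() - nb.mean(axis=0).max()
    print(round(evpi, 2))  # if EVPI exceeds research cost, more data is worthwhile
    ```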

  7. Use of screening techniques to reduce uncertainty in risk assessment at a former manufactured gas plant site

    International Nuclear Information System (INIS)

    Logan, C.M.; Walden, R.H.; Baker, S.R.; Pekar, Z.; LaKind, J.S.; MacFarlane, I.D.

    1995-01-01

    Preliminary analysis of risks from a former manufactured gas plant (MGP) site revealed six media associated with potential exposure pathways: soils, air, surface water, groundwater, estuarine sediments, and aquatic biota. Contaminants of concern (COCs) include polycyclic aromatic hydrocarbons, volatile organic hydrocarbons, metals, cyanide, and PCBs. Available chemical data, including site-specific measurements and existing data from other sources (e.g., agency monitoring programs, Chesapeake Bay Program), were evaluated for potential utility in risk assessment. Where sufficient data existed, risk calculations were performed using central tendency and reasonable maximum exposure estimates. Where site-specific data were not available, risks were estimated using conservatively high default assumptions for dose and/or exposure duration. Because of the large number of potential exposure pathways and COCs, a sensitivity analysis was conducted to determine which information most influences risk assessment outcome so that any additional data collection to reduce uncertainty can be cost-effectively targeted. The sensitivity analysis utilized two types of information: (1) the impact that uncertainty in risk input values has on output risk estimates, and (2) the potential improvement in key risk input values, and consequently output values, if better site-specific data were available. A decision matrix using both quantitative and qualitative information was developed to prioritize sampling strategies to minimize uncertainty in the final risk assessment

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
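    A representative example of the entropic uncertainty relations the paper discusses is the Maassen-Uffink bound (a standard result, not the authors' new measure):

    ```latex
    % Maassen-Uffink entropic uncertainty relation (a standard example of the
    % measure-dependent relations the paper generalizes):
    \begin{equation}
      H(X) + H(Z) \ge \log_{2} \frac{1}{c},
      \qquad
      c = \max_{x,z} \left| \langle x | z \rangle \right|^{2},
    \end{equation}
    % where H is the Shannon entropy of the outcome distribution of measurement
    % X (resp. Z) and c is the maximal overlap between the two eigenbases; for a
    % qubit measured in mutually unbiased bases, c = 1/2 and the bound is 1 bit.
    ```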

  9. Search for Expectancy-Inconsistent Information Reduces Uncertainty Better: The Role of Cognitive Capacity.

    Science.gov (United States)

    Strojny, Paweł; Kossowska, Małgorzata; Strojny, Agnieszka

    2016-01-01

    Motivation and cognitive capacity are key factors in people's everyday struggle with uncertainty. However, the exact nature of their interplay in various contexts still needs to be revealed. The presented paper reports on two experimental studies which aimed to examine the joint consequences of motivational and cognitive factors for preferences regarding incomplete information expansion. In Study 1 we demonstrate the interactional effect of motivation and cognitive capacity on information preference. High need for closure resulted in a stronger relative preference for expectancy-inconsistent information among non-depleted individuals, but the opposite among cognitively depleted ones. This effect was explained by the different informative value of questions in comparison to affirmative sentences and the possibility of assimilation of new information if it contradicts prior knowledge. In Study 2 we further investigated the obtained effect, showing that not only questions but also other kinds of incomplete information are subject to the same dependency. Our results support the expectation that, in the face of incomplete information, motivation toward closure may be fulfilled efficiently by focusing on expectancy-inconsistent pieces of data. We discuss the obtained effect in the context of previous assumptions that high need for closure results in a simple processing style, advocating a more complex approach based on the character of the provided information.

  10. Toward Reducing Uncertainties in Biospheric Carbon Uptake in the American West: An Atmospheric Perspective

    Science.gov (United States)

    Lin, J. C.; Stephens, B. B.; Mallia, D.; Wu, D.; Jacobson, A. R.

    2015-12-01

    Despite the need for an understanding of terrestrial biospheric carbon fluxes to account for carbon cycle feedbacks and predict future CO2 concentrations, knowledge of such fluxes at the regional scale remains poor. This is particularly true in mountainous areas, where lack of observations combined with difficulties in their interpretation lead to significant uncertainties. Yet mountainous regions are also where significant forest cover and biomass are found—areas that have the potential to serve as carbon sinks. In particular, understanding carbon fluxes in the American West is of critical importance for the U.S. carbon budget, as the large area and biomass indicate potential for carbon sequestration. However, disturbances such as drought, insect outbreak, and wildfires in this region can introduce significant perturbations to the carbon cycle and thereby affect the amount of carbon sequestered by vegetation in the Rockies. To date, there have been few atmospheric CO2 observations in the American Rockies due to a combination of difficulties associated with logistics and interpretation of the measurements in the midst of complex terrain. Among the few sites are those associated with NCAR's Regional Atmospheric Continuous CO2 Network in the Rocky Mountains (Rocky RACCOON). As CO2 observations in mountainous areas increase in the future, it is imperative that they can be properly interpreted to yield information about biospheric carbon fluxes. In this paper, we will present CO2 observations from RACCOON, along with atmospheric simulations that attempt to extract information about biospheric carbon fluxes in the Western U.S. from these observations. We show that atmospheric models can significantly misinterpret the CO2 observations, leading to large errors in the retrieved biospheric fluxes, due to erroneous atmospheric flows. Recommendations for ways to minimize such errors and properly link the CO2 concentrations to biospheric fluxes are discussed.

  11. Reducing uncertainty in personnel dosimetry calculations in the VHTR plant using MAVRIC

    International Nuclear Information System (INIS)

    Flaspoehler, T.; Petrovic, B.

    2013-01-01

    This work analyzes the efficacy of the MAVRIC sequence of the SCALE 6.1 code package with respect to the accuracy of results and the ability to utilize large-memory, parallel machines. MAVRIC implements the hybrid FW-CADIS methodology to solve neutron and photon transport for shielding applications. Using the discrete ordinates method to solve the Boltzmann transport equation, an importance map is generated which MAVRIC then uses to bias a stochastic Monte Carlo simulation. The MAVRIC sequence is applied to generate neutron and photon dose rate distributions of improved accuracy in a model of a proposed VHTR power plant. Problems like this one, with a size on the order of magnitude of a nuclear power plant, require a prohibitive amount of memory to store complete importance maps. The issue is addressed by refining the mesh in areas around the source through the detector regions, while leaving a coarse mesh elsewhere. Additionally, through the use of parallel computing, the angular flux can be expanded in higher quadrature sets, which leads to a better importance map without increasing memory requirements during the Monte Carlo portion of the sequence. The final Monte Carlo simulations can be run concurrently on several machines with results combined after the fact, emulating parallelism that is not yet available in the MAVRIC sequence. Using a combination of strategies, the MAVRIC sequence is shown to be able to scale across available computational resources, allowing the user to more quickly obtain Monte Carlo results with lower relative uncertainties in large, deep-penetration shielding problems. (authors)
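    The CADIS relations underlying FW-CADIS can be sketched numerically; the one-dimensional mesh and adjoint values below are toy inputs, not the VHTR model:

    ```python
    import numpy as np

    def cadis_biasing(source, adjoint_flux):
        """Compute CADIS source biasing and weight-window targets from an
        adjoint (importance) function on a mesh. Standard CADIS relations;
        the mesh and inputs here are illustrative, not the VHTR model itself."""
        source = np.asarray(source, dtype=float)
        adjoint = np.asarray(adjoint_flux, dtype=float)
        # Estimated detector response R = sum_i q_i * phi_adj_i
        R = np.sum(source * adjoint)
        q_biased = source * adjoint / R   # biased source distribution
        w_target = R / adjoint            # weight-window centers w = R / phi_adj
        return q_biased, w_target

    # Toy 1-D shield: source on the left, detector importance growing rightward
    q = np.array([0.5, 0.3, 0.2, 0.0, 0.0])
    phi_adj = np.array([1e-6, 1e-4, 1e-2, 1e-1, 1.0])
    qb, ww = cadis_biasing(q, phi_adj)
    print(qb)  # particles start preferentially in high-importance source cells
    print(ww)  # low weight targets deep in the shield force splitting there
    ```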

  12. The worth of data to reduce predictive uncertainty of an integrated catchment model by multi-constraint calibration

    Science.gov (United States)

    Koch, J.; Jensen, K. H.; Stisen, S.

    2017-12-01

    Hydrological models that integrate numerical process descriptions across compartments of the water cycle typically require thorough model calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1000 km2 catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture, and land surface temperature. Results indicate that a balanced optimization can be achieved in which the errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored with a focus on improving the spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study features a post-calibration linear uncertainty analysis, which quantifies parameter identifiability, that is, the worth of a specific observational dataset for inferring model parameter values through calibration. The ability of an observation to reduce predictive uncertainty is assessed as well. Such findings have concrete implications for the design of model calibration frameworks and, more generally, for the acquisition of data in hydrological observatories.
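
    The post-calibration linear uncertainty analysis described here is, in essence, a first-order second-moment (FOSM) calculation. The sketch below, with a synthetic Jacobian and covariances rather than anything from the study, scores each observation's worth by how much the predictive variance rises when that observation is dropped.

```python
import numpy as np

rng = np.random.default_rng(0)

n_par, n_obs = 4, 9
J = rng.normal(size=(n_obs, n_par))   # sensitivity of each observation to each parameter
C_prior = np.eye(n_par)               # prior parameter covariance (synthetic)
R = 0.1 * np.eye(n_obs)               # observation error covariance (synthetic)
s = rng.normal(size=n_par)            # sensitivity of a prediction to the parameters

def predictive_var(rows):
    """Linear (FOSM) predictive variance using only the given observation rows."""
    Jr, Rr = J[rows], R[np.ix_(rows, rows)]
    C_post = np.linalg.inv(Jr.T @ np.linalg.inv(Rr) @ Jr + np.linalg.inv(C_prior))
    return s @ C_post @ s

all_rows = list(range(n_obs))
base = predictive_var(all_rows)
for i in all_rows:
    worth = predictive_var([r for r in all_rows if r != i])
    print(f"dropping obs {i}: predictive variance rises {worth / base:.2f}x")
```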

  13. Reducing measurement uncertainty drives the use of multiple technologies for supporting metrology

    Science.gov (United States)

    Banke, Bill, Jr.; Archie, Charles N.; Sendelbach, Matthew; Robert, Jim; Slinkman, James A.; Kaszuba, Phil; Kontra, Rick; DeVries, Mick; Solecky, Eric P.

    2004-05-01

    Perhaps never before in semiconductor microlithography has there been such interest in measurement accuracy. This interest places new demands on our in-line metrology systems as well as on the supporting metrology used for verification. It also puts a burden on the users and suppliers of new measurement tools, which both challenge and complement existing manufacturing metrology. The metrology community needs to respond to these challenges by using new methods to assess fab metrologies. An important part of this assessment process is the ability to obtain accepted reference measurements as a way of determining the accuracy and Total Measurement Uncertainty (TMU) of an in-line critical dimension (CD) measurement. In this paper, CD can mean any critical dimension including, for example, such measures as feature height or sidewall angle. This paper describes the trade-offs of in-line metrology systems as well as the limitations of Reference Measurement Systems (RMS). Many factors influence each application, such as feature shape, material properties, proximity, sampling, and the critical dimension itself. These factors, along with the metrology probe size, interaction volume, and probe type (e-beam, optical beam, or mechanical probe), are considered. As feature sizes shrink below 100 nm, some of the stalwarts of reference metrology come into question, such as the electrically determined transistor gate length. The concept of the RMS is expanded to show how multiple metrologies are needed to achieve the right balance of accuracy and sampling; this is also demonstrated for manufacturing metrology. Various comparisons of CDSEM, scatterometry, AFM, cross-section SEM, electrically determined CDs, and TEM are shown. An example demonstrates the importance of obtaining TMU by balancing accuracy and precision when selecting a manufacturing measurement strategy and optimizing manufacturing metrology. It is also demonstrated how the necessary supporting metrology will
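
    A hedged, simplified stand-in for a TMU calculation is sketched below: regress tool readings against reference (RMS) values, then subtract the reference system's own uncertainty from the net residual scatter. The paper's methodology is built on Mandel regression; the plain least-squares fit and all numbers here are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic CD data (nm): reference (RMS) values and in-line tool readings.
ref = np.linspace(60, 120, 25)
tool = 1.05 * ref - 2.0 + rng.normal(0.0, 1.5, ref.size)

# OLS fit of tool vs. reference (the paper uses Mandel regression, which also
# accounts for error in the reference axis; OLS is a simplification).
slope, offset = np.polyfit(ref, tool, 1)
residuals = tool - (slope * ref + offset)
net_residual = residuals.std(ddof=2)   # scatter about the corrected response

sigma_rms = 1.0                        # assumed 1-sigma uncertainty of the RMS itself (nm)
tmu = np.sqrt(max(net_residual**2 - sigma_rms**2, 0.0))

print(f"slope={slope:.3f}, offset={offset:.2f} nm, TMU ~ {tmu:.2f} nm")
```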

  14. How Well Does Fracture Set Characterization Reduce Uncertainty in Capture Zone Size for Wells Situated in Sedimentary Bedrock Aquifers?

    Science.gov (United States)

    West, A. C.; Novakowski, K. S.

    2005-12-01

    Regional groundwater flow models are rife with uncertainty. The three-dimensional flux vector fields must generally be inferred by inverse modelling from sparse measurements of hydraulic head, from measurements of hydraulic parameters at a scale that is minuscule in comparison to that of the domain, and from few, if any, measurements of recharge or discharge rate. Despite the inherent uncertainty in these models, they are routinely used to delineate steady-state or time-of-travel capture zones for the purpose of wellhead protection. The latter are defined as the volume of the aquifer within which released particles will arrive at the well within the specified time, and their delineation requires the additional step of dividing the magnitudes of the flux vectors by the assumed porosity to arrive at the "average linear groundwater velocity" vector field. Since the porosity is usually assumed constant over the domain, one could be forgiven for thinking that the uncertainty introduced at this step is minor in comparison to that of the flow model calibration step. We consider this question when the porosity in question is fracture porosity in flat-lying sedimentary bedrock. We also consider whether or not the diffusive uptake of solute into the rock matrix which lies between the source and the production well reduces or enhances the uncertainty. To evaluate the uncertainty, an aquifer cross section is conceptualized as an array of horizontal, randomly spaced, parallel-plate fractures of random aperture, with adjacent horizontal fractures connected by vertical fractures, again of random spacing and aperture. The source is assumed to be a continuous concentration (i.e., a Dirichlet boundary condition) representing a leaking tank or a DNAPL pool, and the receptor is a fully penetrating well located in the down-gradient direction. In this context, the time-of-travel capture zone is defined as the separation distance required such that the source does not contaminate the well
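
    The aperture-driven part of this uncertainty can be illustrated with a minimal Monte Carlo sketch under the parallel-plate (cubic-law) idealization, where the mean velocity in a fracture of aperture b under hydraulic gradient i is v = rho g b^2 i / (12 mu). The distributions and parameters below are invented, and matrix diffusion is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

RHO, G, MU = 998.0, 9.81, 1.0e-3   # water density (kg/m3), gravity (m/s2), viscosity (Pa s)
GRADIENT = 0.005                   # regional hydraulic gradient (-)
HORIZON = 10 * 365.25 * 86400      # 10-year time-of-travel criterion (s)

# Random parallel-plate fracture apertures (lognormal, median ~100 microns).
apertures = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=10_000)

# Cubic-law mean velocity in each fracture, and the distance travelled in 10 years;
# the b^2 dependence is what makes capture zone size so sensitive to aperture.
velocity = RHO * G * apertures**2 * GRADIENT / (12 * MU)
distance = velocity * HORIZON

for p in (50, 95, 99):
    print(f"P{p} 10-yr travel distance: {np.percentile(distance, p):,.0f} m")
```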

  15. Financial Risk Ratios and Earnings Management: Reducing Uncertainties in Shariah-compliant Companies

    Directory of Open Access Journals (Sweden)

    Soheil Kazemian

    2018-01-01

    Full Text Available This study examines whether Shariah-compliant companies practice earnings management by investigating the relationship among the risk of financial distress, leverage, and free cash flow in discretionary accruals, which serve as a proxy for earnings management. This empirical research is conducted on a sample of Malaysian Shariah-compliant companies from all industries in Bursa Malaysia from 2012 to 2014. Results show that Shariah-compliant companies are highly influenced by the risk of financial distress, leverage, and free cash flow. This study argues that whether a company operates as Shariah-compliant or non-Shariah-compliant does not affect the extent to which its managers practice earnings management through financial distress, high leverage, and free cash flow. Results should be of interest to stakeholders, shareholders, and regulatory bodies (i.e., the Shariah Advisory Council and the Securities Commission) that oversee the accountability of corporate financial reporting to prevent earnings management in Shariah-compliant companies. Findings can also aid these relevant authorities in Malaysia in overcoming or reducing problems related to earnings management. This study is one of the most significant works in Malaysia in terms of sample size and methodology. It argues that the three elements of earnings management (i.e., financial distress, high leverage, and free cash flow) influence better disclosure of reported earnings.

  16. Bookending the Opportunity to Lower Wind’s LCOE by Reducing the Uncertainty Surrounding Annual Energy Production

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.

    2017-06-01

    Reducing the performance risk surrounding a wind project can potentially lead to a lower weighted-average cost of capital (WACC), and hence a lower levelized cost of energy (LCOE), through an advantageous shift in capital structure, and possibly also a reduction in the cost of capital. Specifically, a reduction in performance risk will move the 1-year P99 annual energy production (AEP) estimate closer to the P50 AEP estimate, which in turn reduces the minimum debt service coverage ratio (DSCR) required by lenders, thereby allowing the project to be financed with a greater proportion of low-cost debt. In addition, a reduction in performance risk might also reduce the cost of one or more of the three sources of capital that are commonly used to finance wind projects: sponsor or cash equity, tax equity, and/or debt. Preliminary internal LBNL analysis of the maximum possible LCOE reduction attainable from reducing the performance risk of a wind project found a potentially significant opportunity for LCOE reduction of ~$10/MWh, by reducing the P50 DSCR to its theoretical minimum value of 1.0 (Bolinger 2015b, 2014) and by reducing the cost of sponsor equity and debt by one-third to one-half each (Bolinger 2015a, 2015b). However, with FY17 funding from the U.S. Department of Energy’s Atmosphere to Electrons (A2e) Performance Risk, Uncertainty, and Finance (PRUF) initiative, LBNL has been revisiting this “bookending” exercise in more depth, and now believes that its earlier preliminary assessment of the LCOE reduction opportunity was overstated. This reassessment is based on two new-found understandings: (1) Due to ever-present and largely irreducible inter-annual variability (IAV) in the wind resource, the minimum required DSCR cannot possibly fall to 1.0 (on a P50 basis), and (2) A reduction in AEP uncertainty will not necessarily lead to a reduction in the cost of capital, meaning that a shift in capital structure is perhaps the best that can be expected (perhaps
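
    A hedged numerical sketch of the bookending logic follows. Assuming a normal AEP distribution, the 1-year P99 sits 2.326 standard deviations below P50; shrinking the total uncertainty moves P99 toward P50, but never onto it while inter-annual variability persists, which is the report's point about the DSCR floor. All figures are illustrative.

```python
Z01 = -2.326  # 1st-percentile quantile of the standard normal distribution

def p99(p50_aep, total_uncertainty):
    """1-year P99 AEP under a normal assumption (uncertainty = sigma / P50)."""
    return p50_aep * (1 + Z01 * total_uncertainty)

p50 = 350_000  # MWh/yr, illustrative project size
for sigma in (0.15, 0.10, 0.06):   # total 1-yr AEP uncertainty, incl. irreducible IAV
    p99_aep = p99(p50, sigma)
    # If lenders size debt so that P99 revenue just covers debt service, the
    # implied minimum DSCR on a P50 basis is roughly P50/P99 (> 1 whenever sigma > 0).
    print(f"sigma={sigma:.0%}: P99={p99_aep:,.0f} MWh, implied P50 DSCR ~ {p50 / p99_aep:.2f}")
```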

  17. A conservation paradox in the Great Basin—Altering sagebrush landscapes with fuel breaks to reduce habitat loss from wildfire

    Science.gov (United States)

    Shinneman, Douglas J.; Aldridge, Cameron L.; Coates, Peter S.; Germino, Matthew J.; Pilliod, David S.; Vaillant, Nicole M.

    2018-03-15

    Interactions between fire and nonnative, annual plant species (that is, “the grass/fire cycle”) represent one of the greatest threats to sagebrush (Artemisia spp.) ecosystems and associated wildlife, including the greater sage-grouse (Centrocercus urophasianus). In 2015, the U.S. Department of the Interior called for a “science-based strategy to reduce the threat of large-scale rangeland fire to habitat for the greater sage-grouse and the sagebrush-steppe ecosystem.” An associated guidance document, the “Integrated Rangeland Fire Management Strategy Actionable Science Plan,” identified fuel breaks as high-priority areas for scientific research. Fuel breaks are intended to reduce fire size and frequency, and can potentially compartmentalize the spatial distribution of wildfire in a landscape. They are designed to reduce flame length, fireline intensity, and rates of fire spread in order to enhance firefighter access, improve response times, and provide safe and strategic anchor points for wildland fire-fighting activities. To accomplish these objectives, fuel breaks disrupt fuel continuity, reduce fuel accumulation, and (or) increase plants with high moisture content through the removal or modification of vegetation in strategically placed strips or blocks of land. Fuel breaks are being newly constructed, enhanced, or proposed across large areas of the Great Basin to reduce wildfire risk and to protect remaining sagebrush ecosystems (including greater sage-grouse habitat). These projects are likely to result in thousands of linear miles of fuel breaks that will have direct ecological effects across hundreds of thousands of acres through habitat loss and conversion. They may also affect millions of acres indirectly because of edge effects and habitat fragmentation created by networks of fuel breaks. Hence, land managers are often faced with a potentially paradoxical situation: the need to substantially alter sagebrush habitats with fuel breaks

  18. A new optimization framework using genetic algorithm and artificial neural network to reduce uncertainties in petroleum reservoir models

    Science.gov (United States)

    Maschio, Célio; José Schiozer, Denis

    2015-01-01

    In this article, a new optimization framework to reduce uncertainties in petroleum reservoir attributes using artificial intelligence techniques (neural network and genetic algorithm) is proposed. Instead of using the deterministic values of the reservoir properties, as in a conventional process, the parameters of the probability density function of each uncertain attribute are set as design variables in an optimization process using a genetic algorithm. The objective function (OF) is based on the misfit of a set of models, sampled from the probability density function, and a symmetry factor (which represents the distribution of curves around the history) is used as weight in the OF. Artificial neural networks are trained to represent the production curves of each well and the proxy models generated are used to evaluate the OF in the optimization process. The proposed method was applied to a reservoir with 16 uncertain attributes and promising results were obtained.
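
    A minimal sketch of the framework's loop is given below, with a cheap analytic function standing in for the trained neural-network proxies: a small real-coded genetic algorithm searches the mean and standard deviation of a single uncertain attribute so that models sampled from that PDF minimize the history misfit. Everything here is synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

TRUE_ATTR = 250.0              # "true" permeability-like attribute (mD), illustrative
history = TRUE_ATTR * 0.02     # observed rate implied by the stand-in proxy below

def proxy(attr):
    """Stand-in for the trained ANN proxy of a well's production response."""
    return attr * 0.02

def objective(mu, sigma, n_samples=50):
    """Expected misfit of models sampled from N(mu, sigma)."""
    attrs = rng.normal(mu, abs(sigma), n_samples)
    return np.mean((proxy(attrs) - history) ** 2)

# Tiny real-coded GA over the PDF parameters (mu, sigma).
pop = np.column_stack([rng.uniform(100, 400, 40), rng.uniform(5, 100, 40)])
for gen in range(60):
    fit = np.array([objective(mu, sg) for mu, sg in pop])
    parents = pop[np.argsort(fit)[:20]]                      # truncation selection
    children = parents[rng.integers(0, 20, 20)] \
        + rng.normal(0, [5.0, 2.0], (20, 2))                 # Gaussian mutation
    pop = np.vstack([parents, children])

best = pop[np.argmin([objective(mu, sg) for mu, sg in pop])]
print(f"best PDF: mu={best[0]:.1f}, sigma={abs(best[1]):.1f}")
```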

  19. Enhanced rice production but greatly reduced carbon emission following biochar amendment in a metal-polluted rice paddy.

    Science.gov (United States)

    Zhang, Afeng; Bian, Rongjun; Li, Lianqing; Wang, Xudong; Zhao, Ying; Hussain, Qaiser; Pan, Genxing

    2015-12-01

    Soil amendment with biochar (BSA) has been shown to be effective for mitigating greenhouse gas (GHG) emission and alleviating metal stress to plants and microbes in soil. However, it had not yet been addressed whether biochar exerts synergistic effects on crop production, GHG emission, and microbial activity in metal-polluted soils. In a field experiment, biochar was amended at rates of 0, 10, 20, and 40 t ha(-1) in a cadmium- and lead-contaminated rice paddy from the Tai Lake Plain, China, before rice cropping in 2010. Fluxes of soil carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) were monitored using a static chamber method during the whole rice growing season (WRGS) of 2011. BSA significantly reduced the soil CaCl2-extractable pool of Cd and the DTPA-extractable pools of Cd and Pb. As compared to the control, soil CO2 emission under BSA was unchanged at 10 t ha(-1) but decreased by 16-24% at 20 and 40 t ha(-1). In a similar trend, BSA at 20 and 40 t ha(-1) increased rice yield by 25-26% and thus enhanced ecosystem CO2 sequestration by 47-55% over the control. Seasonal total N2O emission was reduced by 7.1, 30.7, and 48.6% under BSA at 10, 20, and 40 t ha(-1), respectively. Overall, a net reduction in greenhouse gas balance (NGHGB) of 53.9-62.8% and in greenhouse gas intensity (GHGI) of 14.3-28.6% was observed following BSA at 20 and 40 t ha(-1). The present study suggests a great potential of biochar to enhance grain yield while reducing carbon emission in metal-polluted rice paddies.

  20. Modelling small groundwater systems - the role of targeted field investigations and observational data in reducing model uncertainty

    Science.gov (United States)

    Abesser, Corinna; Hughes, Andrew; Boon, David

    2017-04-01

    …the fit between predicted and observed heads and a reduction in overall model uncertainty. The impact of the availability of observational data on model calibration was tested as part of this study, confirming that equifinality remains an issue despite improved system characterisation and suggesting that uncertainty relating to the distribution of hydraulic conductivity (K) within the dune system must be further reduced. This study illustrates that groundwater modelling is not linear but should be an iterative process, especially in systems where large geological uncertainties exist. It should be carried out in conjunction with field studies, i.e. not as a postscript, but as an ongoing interaction. This interaction is required throughout the investigation process and is key to heuristic learning and improved system understanding. Given that the role of modelling is to raise questions as well as answer them, this study demonstrates that this applies even in small systems that are thought to be well understood. This research is funded by the UK Natural Environment Research Council (NERC). The work is distributed under the Creative Commons Attribution 3.0 Unported License together with an author copyright. This licence does not conflict with the regulations of the Crown Copyright.

  1. Preconcentration of a low-grade uranium ore yielding tailings of greatly reduced environmental concerns. Part V

    International Nuclear Information System (INIS)

    Raicevic, D.; Raicevic, M.

    1980-11-01

    The low-grade ore sample used for this investigation contained 0.057 percent uranium, with uranothorite as the major uranium-bearing mineral and a small amount of brannerite, occurring in the quartz-sericite matrix of a conglomerate. The preconcentration procedures, consisting of pyrite flotation with or without flotation of radioactive minerals, followed by high-intensity wet magnetic treatment of the sized flotation tailings, produced pyrite and radioactive concentrates of acceptable uranium grades ranging from 0.1 to 0.135 percent uranium. The combined concentrates comprised 37 to 49 percent of the ore by weight, with the following combined recoveries: 95.6 to 97.9 percent of the uranium; 94.7 to 96.3 percent of the radium; 97.8 to 99.3 percent of the thorium; and over 98 percent of the pyrite. The preconcentration tailings comprised between 51 and 63 percent of the ore by weight and contained: 0.0022 to 0.0037 percent U; 12 to 17 pCi/g Ra; 0.002 to 0.004 percent Th; and less than 0.03 percent S. Because these tailings are practically pyrite-free, they should not generate acidic conditions. Due to their low radium content, their radionuclide hazards are greatly reduced. These preconcentration tailings, therefore, could be suitable for surface disposal, mine backfill, revegetation or other uses.

  2. Reducing uncertainty for estimating forest carbon stocks and dynamics using integrated remote sensing, forest inventory and process-based modeling

    Science.gov (United States)

    Poulter, B.; Ciais, P.; Joetzjer, E.; Maignan, F.; Luyssaert, S.; Barichivich, J.

    2015-12-01

    Accurately estimating forest biomass and forest carbon dynamics requires new integrated remote sensing, forest inventory, and carbon cycle modeling approaches. Presently, there is an increasing and urgent need to reduce forest biomass uncertainty in order to meet the requirements of carbon mitigation treaties, such as Reducing Emissions from Deforestation and forest Degradation (REDD+). Here we describe a new parameterization and assimilation methodology used to estimate tropical forest biomass using the ORCHIDEE-CAN dynamic global vegetation model. ORCHIDEE-CAN simulates carbon uptake and allocation to individual trees using a mechanistic representation of photosynthesis, respiration and other first-order processes. The model is first parameterized using forest inventory data to constrain background mortality rates, i.e., self-thinning, and productivity. Satellite remote sensing data for forest structure, i.e., canopy height, is used to constrain simulated forest stand conditions using a look-up table approach to match canopy height distributions. The resulting forest biomass estimates are provided for spatial grids that match REDD+ project boundaries and aim to provide carbon estimates for the criteria described in the IPCC Good Practice Guidelines Tier 3 category. With the increasing availability of forest structure variables derived from high-resolution LIDAR, RADAR, and optical imagery, new methodologies and applications with process-based carbon cycle models are becoming more readily available to inform land management.

  3. Climate change impacts on groundwater hydrology – where are the main uncertainties and can they be reduced?

    DEFF Research Database (Denmark)

    Refsgaard, Jens C.; Sonnenborg, Torben; Butts, Michael

    2016-01-01

    This paper assesses how various sources of uncertainty propagate through the uncertainty cascade from emission scenarios through climate models and hydrological models to impacts, with particular focus on groundwater aspects for a number of coordinated studies in Denmark. We find results similar to surface water studies showing that climate model uncertainty dominates for projections of climate change impacts on streamflow and groundwater heads. However, we find uncertainties related to geological conceptualisation and hydrological model discretisation to be dominating for projections of well field … climate-hydrology models…

  4. It's the parameters, stupid! Moving beyond multi-model and multi-physics approaches to characterize and reduce predictive uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, Martyn; Samaniego, Luis; Freer, Jim

    2014-05-01

    Multi-model and multi-physics approaches are a popular tool in environmental modelling, with many studies focusing on optimally combining output from multiple model simulations to reduce predictive errors and better characterize predictive uncertainty. However, a careful and systematic analysis of different hydrological models reveals that individual models are simply small permutations of a master modeling template, and inter-model differences are overwhelmed by uncertainty in the choice of the parameter values in the model equations. Furthermore, inter-model differences do not explicitly represent the uncertainty in modeling a given process, leading to many situations where different models provide the wrong results for the same reasons. In other cases, the available morphological data does not support the very fine spatial discretization of the landscape that typifies many modern applications of process-based models. To make the uncertainty characterization problem worse, the uncertain parameter values in process-based models are often fixed (hard-coded), and the models lack the agility necessary to represent the tremendous heterogeneity in natural systems. This presentation summarizes results from a systematic analysis of uncertainty in process-based hydrological models, where we explicitly analyze the myriad of subjective decisions made throughout both the model development and parameter estimation process. Results show that much of the uncertainty is aleatory in nature - given a "complete" representation of dominant hydrologic processes, uncertainty in process parameterizations can be represented using an ensemble of model parameters. Epistemic uncertainty associated with process interactions and scaling behavior is still important, and these uncertainties can be represented using an ensemble of different spatial configurations. Finally, uncertainty in forcing data can be represented using ensemble methods for spatial meteorological analysis. Our systematic

  5. Cellular and molecular research to reduce uncertainties in estimates of health effects from low-level radiation

    Energy Technology Data Exchange (ETDEWEB)

    Elkind, M.M.; Bedford, J.; Benjamin, S.A.; Waldren, C.A. (Colorado State Univ., Fort Collins, CO (USA)); Gotchy, R.L. (Science Applications International Corp., McLean, VA (USA))

    1990-10-01

    A study was undertaken by five radiation scientists to examine the feasibility of reducing the uncertainties in the estimation of risk due to protracted low doses of ionizing radiation. In addressing the question of feasibility, a review was made by the study group: of the cellular, molecular, and mammalian radiation data that are available; of the way in which altered oncogene properties could be involved in the loss of growth control that culminates in tumorigenesis; and of the progress that had been made in the genetic characterizations of several human and animal neoplasms. On the basis of this analysis, the study group concluded that, at the present time, it is feasible to mount a program of radiation research directed at the mechanism(s) of radiation-induced cancer with special reference to risk of neoplasia due to protracted, low doses of sparsely ionizing radiation. To implement a program of research, a review was made of the methods, techniques, and instruments that would be needed. This review was followed by a survey of the laboratories and institutions where scientific personnel and facilities are known to be available. A research agenda of the principal and broad objectives of the program is also discussed. 489 refs., 21 figs., 14 tabs.

  6. Cellular and molecular research to reduce uncertainties in estimates of health effects from low-level radiation

    International Nuclear Information System (INIS)

    Elkind, M.M.; Bedford, J.; Benjamin, S.A.; Waldren, C.A.; Gotchy, R.L.

    1990-10-01

    A study was undertaken by five radiation scientists to examine the feasibility of reducing the uncertainties in the estimation of risk due to protracted low doses of ionizing radiation. In addressing the question of feasibility, a review was made by the study group: of the cellular, molecular, and mammalian radiation data that are available; of the way in which altered oncogene properties could be involved in the loss of growth control that culminates in tumorigenesis; and of the progress that had been made in the genetic characterizations of several human and animal neoplasms. On the basis of this analysis, the study group concluded that, at the present time, it is feasible to mount a program of radiation research directed at the mechanism(s) of radiation-induced cancer with special reference to risk of neoplasia due to protracted, low doses of sparsely ionizing radiation. To implement a program of research, a review was made of the methods, techniques, and instruments that would be needed. This review was followed by a survey of the laboratories and institutions where scientific personnel and facilities are known to be available. A research agenda of the principal and broad objectives of the program is also discussed. 489 refs., 21 figs., 14 tabs

  7. In Depth Modeling of Nuclide Transport in the Geosphere and the Biosphere to Reduce Uncertainty (Final Report)

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Kang, Chul Kyung; Hwang, Yong Soo; Lee, Sung Ho

    2010-08-01

    The Korea Radioactive Waste Management Center (KRMC) is conducting research on the step-by-step development of a safety case for the Gyeongju low- and intermediate-level radioactive waste repository (WNEMC; Wolseong Nuclear Environment Management Center). A modeling study and development of a methodology by which an assessment of safety and performance for a low- and intermediate-level radioactive waste (LILW) repository could be effectively made have been carried out. Covering normal and abnormal nuclide release cases associated with the various FEPs and scenarios involved in the performance of the proposed repository, in view of nuclide transport and transfer both in the geosphere and the biosphere, a total system performance assessment (TSPA) program has been developed by utilizing such commercial development tools as GoldSim, AMBER, MASCOT-K, and TOUGH2 at the Korea Atomic Energy Research Institute (KAERI) under contract with KRMC. The final project report deals in particular detail with the conceptual modeling scheme behind the GoldSim program modules, all of which are integrated into a TSPA program template kit, as well as with the input data set currently available. In-depth system models, described conceptually and rather practically and ready for implementation in a GoldSim TSPA program, are introduced with plenty of illustrative conceptual schemes and evaluations using the data currently available. The GoldSim TSPA template program and the AMBER biosphere template program, as well as the TOUGH2 gas migration template program developed through this project, are expected to be successfully applied to the post-closure safety assessment required for WNEMC by the regulatory body, with increased practicality and much reduced uncertainty and conservatism.

  8. Reducing the sensitivity of IMPT treatment plans to setup errors and range uncertainties via probabilistic treatment planning

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Bortfeld, Thomas; Martin, Benjamin C.; Soukup, Martin

    2009-01-01

    Treatment plans optimized for intensity-modulated proton therapy (IMPT) may be very sensitive to setup errors and range uncertainties. If these errors are not accounted for during treatment planning, the dose distribution realized in the patient may be strongly degraded compared to the planned dose distribution. The authors implemented the probabilistic approach to incorporate uncertainties directly into the optimization of an intensity-modulated treatment plan. Following this approach, the dose distribution depends on a set of random variables which parameterize the uncertainty, as does the objective function used to optimize the treatment plan. The authors optimize the expected value of the objective function and investigate IMPT treatment planning with respect to range uncertainties and setup errors. They demonstrate that incorporating these uncertainties into the optimization yields qualitatively different treatment plans compared to conventional plans which do not account for uncertainty. The sensitivity of an IMPT plan depends on the dose contributions of individual beam directions. Roughly speaking, steep dose gradients in the beam direction make treatment plans sensitive to range errors, while steep lateral dose gradients make plans sensitive to setup errors. More robust treatment plans are obtained by redistributing dose among different beam directions, which can be achieved by the probabilistic approach. In contrast, the safety margin approach as widely applied in photon therapy fails in IMPT and is suitable for handling neither range variations nor setup errors.
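
    A toy version of the probabilistic objective can be written in a few lines: with one dose-influence matrix per error scenario, the optimizer minimizes the expected quadratic deviation from the prescription over all scenarios rather than over the nominal scenario alone. The matrices and dimensions below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

n_vox, n_beamlets, n_scen = 30, 10, 5
prescription = np.ones(n_vox)

# One dose-influence matrix per error scenario (setup/range shifts would
# normally generate these; here they are random perturbations of a nominal).
D_nom = rng.uniform(0.0, 0.2, (n_vox, n_beamlets))
scenarios = [D_nom + rng.normal(0, 0.02, D_nom.shape) for _ in range(n_scen)]
probs = np.full(n_scen, 1.0 / n_scen)

w = np.full(n_beamlets, 0.5)
lr = 0.05
for _ in range(2000):
    # Gradient of E_s[ ||D_s w - prescription||^2 ] over the scenario set.
    grad = sum(p * 2 * D.T @ (D @ w - prescription) for p, D in zip(probs, scenarios))
    w = np.maximum(w - lr * grad, 0.0)   # project onto nonnegative fluences

exp_obj = sum(p * np.sum((D @ w - prescription) ** 2) for p, D in zip(probs, scenarios))
print(f"expected objective after optimization: {exp_obj:.4f}")
```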

  9. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference between steady-state and transient conditions in the total and relative contributions to the uncertainty of the response parameters.
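
    The response surface method the abstract refers to can be sketched generically: run the expensive code at a small design of input points, fit a quadratic surrogate, and Monte Carlo the cheap surrogate instead of the code. The "code" below is a stand-in function, not FRAP.

```python
import numpy as np

rng = np.random.default_rng(5)

def expensive_code(x1, x2):
    """Stand-in for a FRAP-like run: a response as a function of two inputs."""
    return 800 + 120 * x1 - 40 * x2 + 15 * x1 * x2 + 10 * x1**2

# Small factorial design in coded units (each point is one "code run").
design = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], float)
y = np.array([expensive_code(a, b) for a, b in design])

# Fit a full quadratic response surface by least squares.
X = np.column_stack([np.ones(len(design)), design[:, 0], design[:, 1],
                     design[:, 0]**2, design[:, 1]**2, design[:, 0] * design[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Monte Carlo on the surrogate to propagate input uncertainty.
x1, x2 = rng.normal(0, 0.5, 100_000), rng.normal(0, 0.5, 100_000)
surf = (coef[0] + coef[1] * x1 + coef[2] * x2
        + coef[3] * x1**2 + coef[4] * x2**2 + coef[5] * x1 * x2)
print(f"response: mean={surf.mean():.1f}, std={surf.std():.1f}")
```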

  10. Reducing uncertainty of estimated nitrogen load reductions to aquatic systems through spatially targeting agricultural mitigation measures using groundwater nitrogen reduction

    DEFF Research Database (Denmark)

    Hashemi, Fatemeh; Olesen, Jørgen Eivind; Jabloun, Mohamed

    2018-01-01

    The need to further abate agricultural nitrate (N)-loadings to coastal waters in Denmark represents the main driver for development of a new spatially targeted regulation that focuses on locating N-mitigation measures in agricultural areas with high N-load. This targeting makes use of the spatial variation across the landscape in natural N-reduction (denitrification) of leached nitrate in the groundwater and surface water systems. A critical basis for including spatial targeting in regulation of N-load in Denmark is the uncertainty associated with the effect of spatially targeted measures, since the effect will be critically affected by uncertainty in the quantification of the spatial variation in N-reduction. In this study, we used 30 equally plausible N-reduction maps, at 100 m grid and sub-catchment resolutions, for the 85-km2 groundwater-dominated Norsminde catchment in Denmark, applying set…

  11. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    Science.gov (United States)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
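
    One simple flavor of reliability ensemble averaging can be sketched as follows: weight each model by its present-day skill against an observational estimate, then form the weighted projection and compare its spread with the raw ensemble spread. All numbers below are synthetic, not ISIMIP output.

```python
import numpy as np

rng = np.random.default_rng(11)

obs_npp = 56.0                                # observational present-day global NPP (Pg C/yr), illustrative
present = rng.normal(56.0, 6.0, 30)           # each model's present-day NPP
future = present + rng.normal(14.0, 8.0, 30)  # each model's end-of-century NPP

# Reliability weights: models closer to the observed present day weigh more.
weights = 1.0 / (np.abs(present - obs_npp) + 1e-6)
weights /= weights.sum()

change = future - present
raw_mean, raw_spread = change.mean(), change.std()
rea_mean = np.sum(weights * change)
rea_spread = np.sqrt(np.sum(weights * (change - rea_mean) ** 2))

print(f"raw change {raw_mean:.1f} +/- {raw_spread:.1f} Pg C/yr; "
      f"REA change {rea_mean:.1f} +/- {rea_spread:.1f} Pg C/yr")
```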

  12. Multi-model analysis of terrestrial carbon cycles in Japan: reducing uncertainties in model outputs among different terrestrial biosphere models using flux observations

    Science.gov (United States)

    Ichii, K.; Suzuki, T.; Kato, T.; Ito, A.; Hajima, T.; Ueyama, M.; Sasai, T.; Hirata, R.; Saigusa, N.; Ohtani, Y.; Takagi, K.

    2009-08-01

    Terrestrial biosphere models show large uncertainties when simulating carbon and water cycles, and reducing these uncertainties is a priority for developing more accurate estimates of both terrestrial ecosystem statuses and future climate changes. To reduce uncertainties and improve the understanding of these carbon budgets, we investigated the ability of flux datasets to improve model simulations and reduce variabilities among multi-model outputs of terrestrial biosphere models in Japan. Using 9 terrestrial biosphere models (Support Vector Machine-based regressions, TOPS, CASA, VISIT, Biome-BGC, DAYCENT, SEIB, LPJ, and TRIFFID), we conducted two simulations: (1) point simulations at four flux sites in Japan and (2) spatial simulations for Japan with a default model (based on original settings) and an improved model (based on calibration using flux observations). Generally, models using default model settings showed large deviations in model outputs from observation with large model-by-model variability. However, after we calibrated the model parameters using flux observations (GPP, RE and NEP), most models successfully simulated seasonal variations in the carbon cycle, with less variability among models. We also found that interannual variations in the carbon cycle are mostly consistent among models and observations. Spatial analysis also showed a large reduction in the variability among model outputs, and model calibration using flux observations significantly improved the model outputs. These results show that to reduce uncertainties among terrestrial biosphere models, we need to conduct careful validation and calibration with available flux observations. Flux observation data significantly improved terrestrial biosphere models, not only on a point scale but also on spatial scales.

  13. Isotopic techniques in radioactive waste disposal site evaluation: a method for reducing uncertainties I. T, T/3He, 4He, 14C, 36Cl

    International Nuclear Information System (INIS)

    Muller, A.B.

    1981-01-01

    This paper introduces five of the isotopic techniques which can help reduce uncertainties associated with the assessment of radioactive waste disposal sites. The basic principles and practical considerations of these best known techniques have been presented, showing how much additional site specific information can be acquired at little cost or consequence to containment efficiency. These methods, and the more experimental methods appearing in the figure but not discussed here, should be considered in any detailed site characterization, data collection and analysis

  14. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    Science.gov (United States)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

    We analyze the possibility of reducing uncertainty in the climate response time scale by utilizing Greenland ice-core data that contain the 8.2 ka event, within a Bayesian model-data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model, it has been possible to mimic the 8.2 ka event in a plausible experimental setting and with relatively good accuracy with respect to the timing of the event, in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity, described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between different values of the diffusivities according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in the model parameters, in analogy to [2]. In implementing this inverse Bayesian analysis with the model, the technical difficulty arises of establishing the likelihood numerically in addition to the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event, as a highly nonlinear effect, precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters propagates to other model outputs of interest, e.g., the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the
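
    The inverse step can be illustrated with a grid-based Bayesian update: place a prior over vertical diffusivity, score each value by a Gaussian likelihood of reproducing the recorded event duration, and normalize. The duration model standing in for CLIMBER-2.3 runs and all numbers are hypothetical.

```python
import numpy as np

# Grid over vertical ocean diffusivity (cm^2/s), with a flat prior.
kv = np.linspace(0.1, 1.0, 200)
prior = np.ones_like(kv) / kv.size

def simulated_duration(k):
    """Hypothetical stand-in for model runs: event duration vs diffusivity."""
    return 100.0 + 120.0 * (1.0 - k)   # years; slower recovery with weak mixing

obs_duration, obs_sigma = 160.0, 20.0  # ice-core 8.2 ka event duration (illustrative)

# Gaussian likelihood of each diffusivity given the observed duration.
like = np.exp(-0.5 * ((simulated_duration(kv) - obs_duration) / obs_sigma) ** 2)
post = prior * like
post /= post.sum()

mean = np.sum(post * kv)
std = np.sqrt(np.sum(post * (kv - mean) ** 2))
print(f"posterior kv = {mean:.2f} +/- {std:.2f}; "
      f"prior spread {kv.std():.2f} (factor {kv.std() / std:.1f} reduction)")
```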

  15. Land-use change reduces habitat suitability for supporting managed honey bee colonies in the Northern Great Plains

    Science.gov (United States)

    Otto, Clint R.; Roth, Cali; Carlson, Benjamin; Smart, Matthew

    2016-01-01

    Human reliance on insect pollination services continues to increase even as pollinator populations exhibit global declines. Increased commodity crop prices and federal subsidies for biofuel crops, such as corn and soybeans, have contributed to rapid land-use change in the US Northern Great Plains (NGP), changes that may jeopardize habitat for honey bees in a part of the country that supports >40% of the US colony stock. We investigated changes in biofuel crop production and grassland land covers surrounding ∼18,000 registered commercial apiaries in North and South Dakota from 2006 to 2014. We then developed habitat selection models to identify remotely sensed land-cover and land-use features that influence apiary site selection by Dakota beekeepers. Our study demonstrates a continual increase in biofuel crops, totaling 1.2 Mha, around registered apiary locations in North and South Dakota. Such crops were avoided by commercial beekeepers when selecting apiary sites in this region. Furthermore, our analysis reveals how grasslands that beekeepers target when selecting commercial apiary locations are becoming less common in eastern North and South Dakota, changes that may have lasting impact on pollinator conservation efforts. Our study highlights how land-use change in the NGP is altering the landscape in ways that are seemingly less conducive to beekeeping. Our models can be used to guide future conservation efforts highlighted in the US national pollinator health strategy by identifying areas that support high densities of commercial apiaries and that have exhibited significant land-use changes.

  16. Assessing community values for reducing agricultural emissions to improve water quality and protect coral health in the Great Barrier Reef

    Science.gov (United States)

    Rolfe, John; Windle, Jill

    2011-12-01

    Policymakers wanting to increase protection of the Great Barrier Reef from pollutants generated by agriculture need to identify when measures to improve water quality generate benefits to society that outweigh the costs involved. The research reported in this paper makes a contribution in several ways. First, it uses the improved science understanding about the links between management changes and reef health to bring together the analysis of costs and benefits of marginal changes, helping to demonstrate the appropriate way of addressing policy questions relating to reef protection. Second, it uses the scientific relationships to frame a choice experiment to value the benefits of improved reef health, with the results of mixed logit (random parameter) models linking improvements explicitly to changes in "water quality units." Third, the research demonstrates how protection values are consistent across a broader population, with some limited evidence of distance effects. Fourth, the information on marginal costs and benefits that are reported provide policymakers with information to help improve management decisions. The results indicate that while there is potential for water quality improvements to generate net benefits, high cost water quality improvements are generally uneconomic. A major policy implication is that cost thresholds for key pollutants should be set to avoid more expensive water quality proposals being selected.

  17. Bootstrap-after-bootstrap model averaging for reducing model uncertainty in model selection for air pollution mortality studies.

    Science.gov (United States)

    Roberts, Steven; Martin, Michael A

    2010-01-01

    Concerns have been raised about findings of associations between particulate matter (PM) air pollution and mortality that have been based on a single "best" model arising from a model selection procedure, because such a strategy may ignore the model uncertainty inherent in searching through a set of candidate models to find the best one. Model averaging has been proposed as a method of allowing for model uncertainty in this context. Our objective was to propose an extension (double BOOT) to a previously described bootstrap model-averaging procedure (BOOT) for use in time series studies of the association between PM and mortality. We compared double BOOT and BOOT with Bayesian model averaging (BMA) and a standard method of model selection [standard Akaike's information criterion (AIC)]. Actual time series data from the United States were used to conduct a simulation study to compare and contrast the performance of double BOOT, BOOT, BMA, and standard AIC. Double BOOT produced estimates of the effect of PM on mortality with smaller root mean squared error than those produced by BOOT, BMA, and standard AIC. This performance boost resulted from estimates produced by double BOOT having smaller variance than those produced by BOOT and BMA. Double BOOT is a viable alternative to BOOT and BMA for producing estimates of the mortality effect of PM.
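
    A single-level sketch of the BOOT idea is given below (double BOOT repeats the procedure on each replicate): block-resample the series, redo AIC-based model selection on every replicate, and average the PM coefficient so that model-selection uncertainty is carried into the estimate. The data and candidate set are synthetic and stylized.

```python
import numpy as np

rng = np.random.default_rng(2024)

# Synthetic daily series: mortality driven by PM at lag 1 plus temperature.
n = 1000
pm = rng.gamma(4, 5, n)
temp = rng.normal(20, 5, n)
y = 50 + 0.10 * np.roll(pm, 1) + 0.5 * temp + rng.normal(0, 3, n)

def best_pm_coef(y, pm, temp):
    """Fit candidate models (PM at lag 0, 1, or 2) and return the PM
    coefficient (column 1 of each design) of the AIC-best model."""
    m = len(y)
    best_aic, best_coef = np.inf, 0.0
    for lag in (0, 1, 2):
        X = np.column_stack([np.ones(m), np.roll(pm, lag), temp])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        aic = m * np.log(rss / m) + 2 * X.shape[1]
        if aic < best_aic:
            best_aic, best_coef = aic, beta[1]
    return best_coef

# BOOT: block-bootstrap the series and redo model selection every time, so
# the averaged PM effect reflects model-selection uncertainty as well.
coefs = []
for _ in range(200):
    starts = rng.integers(0, n - 30, n // 30)
    idx = np.concatenate([np.arange(s, s + 30) for s in starts])
    coefs.append(best_pm_coef(y[idx], pm[idx], temp[idx]))

print(f"model-averaged PM effect: {np.mean(coefs):.3f} +/- {np.std(coefs):.3f}")
```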

  18. Reducing uncertainty in load forecasts and using real options for improving capacity dispatch management through the utilization of weather and hydrologic forecasts

    International Nuclear Information System (INIS)

    Davis, T.

    2004-01-01

    The effect of weather on electricity markets was discussed, with particular focus on reducing weather uncertainty by improving short-term weather forecasts. The implications of weather for hydroelectric power dispatch and use were also discussed. Although some errors in weather forecasting can result in economic benefits, most errors are associated with more costs than benefits. This presentation described how real options analysis can turn weather uncertainty into a quantifiable source of option value. Four case studies were presented involving exploratory data analysis of regional weather phenomena: (1) the 2001 California electricity crisis, (2) the delta breeze effects on the California ISO, (3) the summer 2002 weather forecast error for ISO New England, and (4) hydro plant asset valuation under weather uncertainty. It was concluded that there is a need for more methodological studies of the economic effects of weather on energy markets and costs. It was suggested that real options theory should be applied to weather planning and utility applications. tabs., figs

  19. Low Amount of Salinomycin Greatly Increases Akt Activation, but Reduces Activated p70S6K Levels

    Directory of Open Access Journals (Sweden)

    Sungpil Yoon

    2013-08-01

    Full Text Available The present study identified a novel salinomycin (Sal) sensitization mechanism in cancer cells. We analyzed the signal proteins Akt, Jnk, p38, Jak, and Erk1/2 in cancer cell lines whose growth had been arrested by treatment with low amounts of Sal. We also tested the signal molecules PI3K, PDK1, GSK3β, p70S6K, mTOR, and PTEN to analyze the PI3K/Akt/mTOR pathway. The results showed that Sal sensitization correlates positively with large reductions in p70S6K activation. Interestingly, Akt was the only signal protein to be significantly activated by Sal treatment. The Akt activation appeared to require the PI3K pathway, as it was abolished by the PI3K inhibitors LY294002 and wortmannin. The Akt activation by Sal was conserved in the other cell lines analyzed, which originated from other organs. Both Akt activation and C-PARP production increased proportionally with increasing doses of Sal. In addition, the increased levels of pAkt were not reduced over the time course of the experiment. Co-treatment with Akt inhibitors sensitized the Sal-treated cancer cells. The results thereby suggest that Akt activation is increased in cells that survive Sal treatment and resist the cytotoxic effect of Sal. Taken together, these results indicate that Akt activation may promote the resistance of cancer cells to Sal.

  20. Reducing regional drought vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    Science.gov (United States)

    Trindade, B. C.; Reed, P. M.; Herman, J. D.; Zeff, H. B.; Characklis, G. W.

    2017-06-01

    Emerging water scarcity concerns in many urban regions are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative drought management strategies. Our results show that appropriately designing adaptive risk-of-failure action triggers required stressing them with a comprehensive sample of deeply uncertain factors in the computational search phase of MORDM. Search under the new ensemble of states-of-the-world is shown to fundamentally change perceived performance tradeoffs and substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Search under deep uncertainty enhanced the discovery of how cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be employed jointly to improve regional robustness and decrease robustness conflicts between the utilities. Insights from this work have general merit for regions where

  1. Characterization of XR-RV3 GafChromic{sup ®} films in standard laboratory and in clinical conditions and means to evaluate uncertainties and reduce errors

    Energy Technology Data Exchange (ETDEWEB)

    Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)

    2015-07-15

    Purpose: To investigate the optimal use of XR-RV3 GafChromic{sup ®} films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-Type R GafChromic films have been shown to represent the most efficient and suitable solution for determining patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainty include scanner-, film-, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models showed that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainty. These could induce errors of up to 7% in the film readings unless regularly checked and corrected; typically, scan uniformity correction matrices should be applied and readings normalized to the scanner-specific daily background reading. In addition, the analysis of multiple film batches showed that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and respond roughly independently of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow-side film irradiations should be preferentially used since they showed a lower
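
    The two scanner corrections recommended above can be sketched as simple array operations: divide each scan by a per-pixel uniformity (flat-field) matrix and normalize to the day's background reading before applying a calibration curve specific to the film batch and beam quality. All arrays and the calibration curve below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(9)

# Synthetic 16-bit red-channel scan of an irradiated film (darker = more dose).
scan = rng.normal(18_000, 300, (200, 200))

# Scanner uniformity (flat-field) matrix measured beforehand with a uniform
# film: response relative to the scan-bed centre (synthetic smooth fall-off).
x = np.linspace(-1, 1, 200)
uniformity = 1.0 - 0.035 * np.add.outer(x**2, x**2)   # up to ~7% lower at edges

daily_background = 29_500.0       # today's unexposed-film reading on this scanner
reference_background = 30_000.0   # background on the day the calibration was made

# Flat-field correction plus normalization to the daily background reading.
corrected = scan / uniformity * (reference_background / daily_background)

def dose_from_net_od(net_od):
    """Hypothetical batch- and beam-quality-specific calibration curve."""
    return 8.0 * net_od + 25.0 * net_od**2   # Gy vs net optical density

net_od = np.log10(reference_background / corrected)
print(f"max skin dose estimate: {dose_from_net_od(net_od).max():.2f} Gy")
```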

  2. Reducing regional vulnerabilities and multi-city robustness conflicts using many-objective optimization under deep uncertainty

    Science.gov (United States)

    Reed, Patrick; Trindade, Bernardo; Jonathan, Herman; Harrison, Zeff; Gregory, Characklis

    2016-04-01

    Emerging water scarcity concerns in southeastern US are associated with several deeply uncertain factors, including rapid population growth, limited coordination across adjacent municipalities and the increasing risks for sustained regional droughts. Managing these uncertainties will require that regional water utilities identify regionally coordinated, scarcity-mitigating strategies that trigger the appropriate actions needed to avoid water shortages and financial instabilities. This research focuses on the Research Triangle area of North Carolina, seeking to engage the water utilities within Raleigh, Durham, Cary and Chapel Hill in cooperative and robust regional water portfolio planning. Prior analysis of this region through the year 2025 has identified significant regional vulnerabilities to volumetric shortfalls and financial losses. Moreover, efforts to maximize the individual robustness of any of the mentioned utilities also have the potential to strongly degrade the robustness of the others. This research advances a multi-stakeholder Many-Objective Robust Decision Making (MORDM) framework to better account for deeply uncertain factors when identifying cooperative management strategies. Results show that the sampling of deeply uncertain factors in the computational search phase of MORDM can aid in the discovery of management actions that substantially improve the robustness of individual utilities as well as the overall region to water scarcity. Cooperative water transfers, financial risk mitigation tools, and coordinated regional demand management must be explored jointly to decrease robustness conflicts between the utilities. The insights from this work have general merit for regions where adjacent municipalities can benefit from cooperative regional water portfolio planning.

  3. Reducing Uncertainty in the Daycent Model of Heterotrophic Respiration with a More Mechanistic Representation of Microbial Processes.

    Science.gov (United States)

    Berardi, D.; Gomez-Casanovas, N.; Hudiburg, T. W.

    2017-12-01

    Improving the certainty of ecosystem models is essential to ensuring their legitimacy, value, and ability to inform management and policy decisions. Despite more than a century of research exploring the variables controlling soil respiration, a high level of uncertainty remains in the ability of ecosystem models to accurately estimate respiration under changing climatic conditions. Refining model estimates of soil carbon fluxes is a high priority for climate change scientists seeking to determine whether soils will be carbon sources or sinks in the future. We found that DayCent underestimates heterotrophic respiration by several orders of magnitude for our temperate mixed-conifer forest site. While traditional ecosystem models simulate decomposition through first-order kinetics, recent research has found that including microbial mechanisms explains 20 percent more spatial heterogeneity. We modified the DayCent heterotrophic respiration model to include a more mechanistic representation of microbial dynamics and compared the new model with continuous and survey observations from our experimental forest site in the Northern Rockies ecoregion. We also calibrated the model's soil moisture and temperature sensitivities against our experimental data. We expect to improve the accuracy of the model by 20-30 percent. By using a more representative and calibrated model of soil carbon dynamics, we can better predict feedbacks between climate and soil carbon pools.
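
    The contrast between the two formulations can be sketched in a few lines: classic first-order decay, dC/dt = -kC, versus a microbially explicit pool in which decomposition saturates with microbial biomass (a reverse Michaelis-Menten form). The parameters below are invented, not calibrated DayCent values.

```python
import numpy as np

DT, YEARS = 1.0, 20            # daily time step, 20-year run
steps = int(365 * YEARS)

# First-order pool (traditional kinetics).
k = 0.0008                     # decay constant (1/day)
C1 = np.empty(steps); C1[0] = 2000.0   # soil carbon (g C m-2)
for t in range(1, steps):
    C1[t] = C1[t - 1] - k * C1[t - 1] * DT

# Microbially explicit pool: decomposition saturates in microbial biomass B.
C2 = np.empty(steps); C2[0] = 2000.0
B = 40.0                       # microbial biomass (g C m-2)
vmax, km, cue, death = 0.0005, 300.0, 0.4, 0.0005
for t in range(1, steps):
    decomp = vmax * C2[t - 1] * B / (km + B)   # reverse Michaelis-Menten
    B += (cue * decomp - death * B) * DT       # microbial growth minus turnover
    C2[t] = C2[t - 1] - decomp * DT            # (1 - cue)*decomp is respired

print(f"after {YEARS} y: first-order C={C1[-1]:.0f}, microbial C={C2[-1]:.0f}, B={B:.1f}")
```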

  4. Reducing the uncertainty in background marine aerosol radiative properties using CAM5 model results and CALIPSO-retrievals

    Science.gov (United States)

    Meskhidze, N.; Gantt, B.; Dawson, K.; Johnson, M. S.; Gasso, S.

    2012-12-01

    Abundance of natural aerosols in the atmosphere strongly affects global aerosol optical depth (AOD) and influences clouds and the hydrological cycle through its ability to act as cloud condensation nuclei (CCN). Because the anthropogenic contribution to climate forcing represents the difference between the total forcing and that from natural aerosols, understanding background aerosols is necessary to evaluate the influences of anthropogenic aerosols on cloud reflectivity and persistence (so-called indirect radiative forcing). The effects of marine aerosols are explored using remotely sensed data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) mission and the NCAR Community Atmosphere Model (CAM5.0) coupled with the PNNL Modal Aerosol Model. CALIPSO provides high-resolution vertical profiles of different aerosol subtypes (classified as clean continental, marine, desert dust, polluted continental, polluted dust, and biomass burning), particulate depolarization ratio (a measure of particle non-sphericity), aerosol color ratio (the ratio of aerosol backscatter at the two wavelengths), and lidar ratios; these quantities, over different parts of the oceans, are compared with model simulations to help evaluate the contribution of biogenic aerosol to the CCN budget in the marine boundary layer. Model simulations show that over biologically productive ocean waters, primary organic aerosols of marine origin can contribute up to a 20% increase in CCN (at a supersaturation of 0.2%) number concentrations. The corresponding changes in cloud properties (liquid water path and droplet number) can decrease the global annual mean indirect radiative forcing of anthropogenic aerosol (less cooling) by ~0.1 Wm-2 (7%). This study suggests that ignoring the complex chemical composition and size distribution of sea spray particles could result in considerable uncertainties in the predicted anthropogenic aerosol indirect effect.

  5. Predictive Modeling of a Paradigm Mechanical Cooling Tower Model: II. Optimal Best-Estimate Results with Reduced Predicted Uncertainties

    Directory of Open Access Journals (Sweden)

    Ruixian Fang

    2016-09-01

    Full Text Available This work uses the adjoint sensitivity model of the counter-flow cooling tower derived in the accompanying PART I to obtain the expressions and relative numerical rankings of the sensitivities, to all model parameters, of the following model responses: (i) outlet air temperature; (ii) outlet water temperature; (iii) outlet water mass flow rate; and (iv) air outlet relative humidity. These sensitivities are subsequently used within the “predictive modeling for coupled multi-physics systems” (PM_CMPS) methodology to obtain explicit formulas for the predicted optimal nominal values for the model responses and parameters, along with reduced predicted standard deviations for the predicted model parameters and responses. These explicit formulas embody the assimilation of experimental data and the “calibration” of the model’s parameters. The results presented in this work demonstrate that the PM_CMPS methodology reduces the predicted standard deviations to values that are smaller than either the computed or the experimentally measured ones, even for responses (e.g., the outlet water flow rate) for which no measurements are available. These improvements stem from the global characteristics of the PM_CMPS methodology, which combines all of the available information simultaneously in phase-space, as opposed to combining it sequentially, as in current data assimilation procedures.
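
    The variance-reduction mechanism can be seen in the simplest scalar analogue of such data assimilation: combining a computed and a measured response by inverse-variance weighting yields a best estimate whose standard deviation is smaller than either input. This is only a one-response illustration of the general principle, not the full PM_CMPS formalism, which combines all responses and parameters simultaneously in phase-space; the numbers are invented for illustration.

```python
def best_estimate(x_comp, var_comp, x_meas, var_meas):
    """Inverse-variance weighted combination of a computed and a measured
    response. The posterior variance is smaller than either input variance,
    mirroring the reduced predicted standard deviations reported above."""
    w = var_meas / (var_comp + var_meas)
    x_best = w * x_comp + (1.0 - w) * x_meas
    var_best = var_comp * var_meas / (var_comp + var_meas)
    return x_best, var_best

# Outlet air temperature (illustrative): computed 27.5 C +/- 0.8,
# measured 26.9 C +/- 0.5.
x, v = best_estimate(27.5, 0.8**2, 26.9, 0.5**2)
print(x, v**0.5)  # best estimate lies between inputs; std dev < 0.5
```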

  6. The fourth research co-ordination meeting (RCM) on 'Updated codes and methods to reduce the calculational uncertainties of liquid metal fast reactors reactivity effects'. Working material

    International Nuclear Information System (INIS)

    2003-01-01

    The fourth Research Co-ordination Meeting (RCM) of the Co-ordinated Research Project (CRP) on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effect' was held during 19-23 May, 2003 in Obninsk, Russian Federation. The general objective of the CRP is to validate, verify and improve methodologies and computer codes used for the calculation of reactivity coefficients in fast reactors, aiming at enhancing the utilization of plutonium and minor actinides. The first RCM took place in Vienna on 24-26 November 1999. The meeting was attended by 19 participants from 7 Member States (France, Germany, India, Japan, Rep. of Korea, Russian Federation, and the United Kingdom) and one from an international organization (IAEA). The participants from two Member States (China and the U.S.A.) provided their results and presentation materials despite being absent from the meeting. The results for several relevant reactivity parameters, obtained by the participants with their own state-of-the-art basic data and codes, were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The contributions of the participants to the benchmark analyses are shown. This report first addresses the benchmark definitions and specifications given for each Phase and briefly introduces the basic data, computer codes, and methodologies applied to the benchmark analyses by the various participants. Then, the results obtained by the participants are intercompared in terms of calculational uncertainty and their effect on the core transient behavior. Finally, it addresses some conclusions drawn from the benchmarks.

  7. Impact of Reducing Polarimetric SAR Input on the Uncertainty of Crop Classifications Based on the Random Forests Algorithm

    DEFF Research Database (Denmark)

    Loosvelt, Lien; Peters, Jan; Skriver, Henning

    2012-01-01

    Although the use of multidate polarimetric synthetic aperture radar (SAR) data for highly accurate land cover classification has been acknowledged in the literature, the high dimensionality of the data set remains a major issue. This study presents two different strategies to reduce the number...... acquired by the Danish EMISAR on four dates within the period April to July in 1998. The predictive capacity of each feature is analyzed by the importance score generated by random forests (RF). Results show that according to the variation in importance score over time, a distinction can be made between...... general and specific features for crop classification. Based on the importance ranking, features are gradually removed from the single-date data sets in order to construct several multidate data sets with decreasing dimensionality. In the accuracy-oriented and efficiency-oriented reduction, the input...
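
    A minimal sketch of importance-ranked dimensionality reduction with random forests follows, using scikit-learn; the synthetic feature matrix stands in for the multidate polarimetric SAR features and the labels for the crop classes, since the EMISAR data are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))    # stand-in: multidate polarimetric features
y = rng.integers(0, 5, size=500)  # stand-in: crop class labels

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)

# Rank features by importance, then gradually remove the least informative
# ones to build data sets of decreasing dimensionality.
order = np.argsort(rf.feature_importances_)[::-1]
for keep in (40, 30, 20, 10):
    subset = order[:keep]
    rf_k = RandomForestClassifier(n_estimators=500, oob_score=True,
                                  random_state=0)
    rf_k.fit(X[:, subset], y)
    print(keep, "features -> out-of-bag accuracy:", round(rf_k.oob_score_, 3))
```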

  8. Unconventional energy resources in a crowded subsurface: Reducing uncertainty and developing a separation zone concept for resource estimation and deep 3D subsurface planning using legacy mining data.

    Science.gov (United States)

    Monaghan, Alison A

    2017-12-01

    Over significant areas of the UK and western Europe, anthropogenic alteration of the subsurface by coal mining has occurred beneath highly populated areas which are now considering a multiplicity of 'low carbon' unconventional energy resources, including shale gas and oil, coal bed methane, geothermal energy and energy storage. Enabling decisions on the 3D planning, licensing and extraction of these resources requires reduced uncertainty around complex geology and hydrogeological and geomechanical processes. An exemplar from the Carboniferous of central Scotland, UK, illustrates how, in areas lacking hydrocarbon well production data and 3D seismic surveys, legacy coal mine plans and associated boreholes provide valuable data that can be used to reduce the uncertainty around the geometry and faulting of subsurface energy resources. However, legacy coal mines also limit unconventional resource volumes, since mines and associated shafts alter the stress and hydrogeochemical state of the subsurface, commonly forming pathways to the surface. To reduce the risk of subsurface connections between energy resources, an example of an adapted methodology is described for shale gas/oil resource estimation that includes a vertical separation or 'stand-off' zone relative to the deepest mine workings, to ensure the hydraulic fracturing required for shale resource production would not intersect legacy coal mines. Whilst the sizing of such separation zones requires further work, developing the concept of 3D spatial separation and planning is key to utilising the crowded subsurface energy system whilst mitigating against resource sterilisation and environmental impacts, and could play a role in positively informing public and policy debate. Copyright © 2017 British Geological Survey, a component institute of NERC. Published by Elsevier B.V. All rights reserved.

  9. TH-C-BRD-05: Reducing Proton Beam Range Uncertainty with Patient-Specific CT HU to RSP Calibrations Based On Single-Detector Proton Radiography

    Energy Technology Data Exchange (ETDEWEB)

    Doolan, P [University College London, London (United Kingdom); Massachusetts General Hospital, Boston, MA (United States); Sharp, G; Testa, M; Lu, H-M [Massachusetts General Hospital, Boston, MA (United States); Bentefour, E [Ion Beam Applications (IBA), Louvain la Neuve (Belgium); Royle, G [University College London, London (United Kingdom)

    2014-06-15

    Purpose: Beam range uncertainty in proton treatment comes primarily from converting the patient's X-ray CT (xCT) dataset to relative stopping power (RSP). Current practices use a single curve for this conversion, produced by a stoichiometric calibration based on tissue composition data for average, healthy, adult humans, but not for the individual in question. Proton radiographs produce water-equivalent path length (WEPL) maps, dependent on the RSP of tissues within the specific patient. This work investigates the use of such WEPL maps to optimize patient-specific calibration curves for reducing beam range uncertainty. Methods: The optimization procedure works on the principle of minimizing the difference between the known WEPL map, obtained from a proton radiograph, and a digitally-reconstructed WEPL map (DRWM) through an RSP dataset, by altering the calibration curve that is used to convert the xCT into an RSP dataset. DRWMs were produced with Plastimatch, an in-house-developed software package, and the optimization procedure was implemented in Matlab. Tests were made on a range of systems, including simulated datasets with computed WEPL maps and phantoms (anthropomorphic and real biological tissue) with WEPL maps measured by single-detector proton radiography. Results: For the simulated datasets, the optimizer showed excellent results. It was able to either completely eliminate or significantly reduce the root-mean-square error (RMSE) in the WEPL for the homogeneous phantoms (to zero for individual materials, or from 1.5% to 0.2% for the simultaneous optimization of multiple materials). For the heterogeneous phantom the RMSE was reduced from 1.9% to 0.3%. Conclusion: An optimization procedure has been designed to produce patient-specific calibration curves. Test results on a range of systems with different complexities and sizes have been promising for accurate beam range control in patients. This project was funded equally by the Engineering and Physical Sciences
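
    A sketch of the optimization principle, under simplifying assumptions: the HU-to-RSP calibration curve is parameterized by a few piecewise-linear knots, the digitally-reconstructed WEPL map is reduced to a straight-ray sum along one axis, and a generic optimizer (Nelder-Mead) minimizes the RMSE against the "measured" map. The knot positions, geometry, and optimizer are illustrative stand-ins, not the Plastimatch/Matlab implementation described above.

```python
import numpy as np
from scipy.optimize import minimize

hu_knots = np.array([-1000.0, 0.0, 1000.0, 2000.0])  # fixed HU anchor points

def wepl_map(ct, rsp_knots, voxel_cm=0.1):
    """Digitally reconstructed WEPL map: convert HU to RSP with a
    piecewise-linear curve, then integrate along the beam axis."""
    rsp = np.interp(ct, hu_knots, rsp_knots)
    return rsp.sum(axis=0) * voxel_cm

def objective(rsp_knots, ct, wepl_measured):
    """RMSE between the measured WEPL map (proton radiograph) and the map
    reconstructed through the candidate calibration curve."""
    diff = wepl_map(ct, rsp_knots) - wepl_measured
    return np.sqrt(np.mean(diff**2))

# Synthetic test: a 'true' curve generates the measured radiograph.
rng = np.random.default_rng(1)
ct = rng.uniform(-1000, 2000, size=(100, 32, 32))
true_knots = np.array([0.0, 1.0, 1.5, 2.0])
measured = wepl_map(ct, true_knots)

res = minimize(objective, x0=np.array([0.0, 1.0, 1.4, 1.9]),
               args=(ct, measured), method="Nelder-Mead")
print(res.x)  # recovered knots approach the 'true' calibration curve
```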

  10. Assessment of the Potential to Reduce Emissions from Road Transportation, Notably NOx, Through the Use of Alternative Vehicles and Fuels in the Great Smoky Mountains Region; TOPICAL

    International Nuclear Information System (INIS)

    Sheffield, J.

    2001-01-01

    Air pollution is a serious problem in the region of the Great Smoky Mountains. The U.S. Environmental Protection Agency (EPA) may designate ozone non-attainment areas by 2003. Pollutants include nitrogen oxides (NOx), sulfur dioxide (SO2), carbon monoxide (CO), volatile organic compounds (VOCs), lead, and particulate matter (PM), which are health hazards, damage the environment, and limit visibility. The main contributors to this pollution are industry, transportation, and utilities, and reductions from all of them are needed to correct the problem. While improvements are projected in each sector over the next decades, the May 2000 Interim Report issued by the Southern Appalachian Mountains Initiative (SAMI) suggests that the percentage of NOx emissions from transportation may increase. The conclusions are: (1) It is essential to consider the entire fuel cycle in assessing the benefits, or disadvantages, of an alternative fuel option, i.e., feedstock and fuel production in addition to vehicle operation; (2) Many improvements to the energy efficiency of a particular vehicle and engine combination will also reduce emissions by reducing fuel use, e.g., engine efficiency, reduced weight, drag and tire friction, and regenerative braking; (3) In reducing emissions it will be important to install the infrastructure to provide the improved fuels, support the maintenance of advanced vehicles, and provide emissions testing of both local vehicles and those from out of state; (4) Public transit systems using lower-emission vehicles can play an important role in reducing emissions per passenger mile by carrying passengers more efficiently, particularly in congested areas; however, analysis is required for each situation; (5) Any reduction in emissions will be welcome, but the problems of air pollution in our region will not be solved by a few modest improvements. Substantial reductions in emissions of key pollutants are required both in East Tennessee and in

  11. Reduced uncertainty of regional scale CLM predictions of net carbon fluxes and leaf area indices with estimated plant-specific parameters

    Science.gov (United States)

    Post, Hanna; Hendricks Franssen, Harrie-Jan; Han, Xujun; Baatz, Roland; Montzka, Carsten; Schmidt, Marius; Vereecken, Harry

    2016-04-01

    Reliable estimates of carbon fluxes and states at regional scales are required to reduce uncertainties in regional carbon balance estimates and to support decision making in environmental politics. In this work the Community Land Model version 4.5 (CLM4.5-BGC) was applied at a high spatial resolution (1 km2) for the Rur catchment in western Germany. In order to improve the model-data consistency of net ecosystem exchange (NEE) and leaf area index (LAI) for this study area, five plant functional type (PFT)-specific CLM4.5-BGC parameters were estimated with time series of half-hourly NEE data for one year in 2011/2012, using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, a Markov chain Monte Carlo (MCMC) approach. The parameters were estimated separately for four different plant functional types (needleleaf evergreen temperate tree, broadleaf deciduous temperate tree, C3-grass and C3-crop) at four different sites, located inside or close to the Rur catchment. We evaluated modeled NEE for one year in 2012/2013 against NEE measured at seven eddy covariance sites in the catchment, including the four parameter estimation sites. Modeled LAI was evaluated by means of LAI derived from remotely sensed RapidEye images acquired on about 18 dates in 2011/2012. Performance indices were based on a comparison between measurements and (i) a reference run with CLM default parameters, and (ii) a 60-instance CLM ensemble with parameters sampled from the DREAM posterior probability density functions (pdfs). The difference between the observed and simulated NEE sums was reduced by 23% when estimated parameters were used as input instead of default parameters. The mean absolute difference between modeled and measured LAI was reduced by 59% on average. Simulated LAI was improved not only in terms of the absolute value but in some cases also in terms of the timing (beginning of vegetation onset), which was directly related to a substantial improvement of the NEE estimates in
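
    The estimation step can be illustrated with a single-chain random-walk Metropolis sampler on a toy NEE model; DREAM itself is a multi-chain adaptive MCMC scheme and CLM4.5-BGC is far richer than the placeholder below, so everything here is a structural sketch under invented priors and data.

```python
import numpy as np

rng = np.random.default_rng(3)

def nee_model(theta, drivers):
    """Placeholder for CLM4.5-BGC: NEE as a toy function of two PFT
    parameters (e.g., a photosynthesis and a respiration scalar)."""
    return theta[0] * drivers - theta[1]

drivers = rng.uniform(0, 1, 200)
obs = nee_model(np.array([2.0, 0.5]), drivers) + rng.normal(0, 0.1, 200)

def log_post(theta, sigma=0.1):
    """Gaussian likelihood with uniform priors on both parameters."""
    if not (0 < theta[0] < 10 and 0 < theta[1] < 5):
        return -np.inf
    resid = obs - nee_model(theta, drivers)
    return -0.5 * np.sum((resid / sigma) ** 2)

theta, lp = np.array([1.0, 1.0]), log_post(np.array([1.0, 1.0]))
samples = []
for _ in range(20000):  # random-walk Metropolis
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

print(np.mean(samples[5000:], axis=0))  # posterior means near [2.0, 0.5]
```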

  12. Utility of population models to reduce uncertainty and increase value relevance in ecological risk assessments of pesticides: an example based on acute mortality data for daphnids.

    Science.gov (United States)

    Hanson, Niklas; Stark, John D

    2012-04-01

    Traditionally, ecological risk assessments (ERA) of pesticides have been based on risk ratios, where the predicted concentration of the chemical is compared to the concentration that causes biological effects. The concentration that causes biological effects is mostly determined from laboratory experiments using endpoints at the level of the individual (e.g., mortality and reproduction), whereas the protection goals are mostly defined at the population level. To deal with the uncertainty in the necessary extrapolations, safety factors are used. Major disadvantages of this simplified approach are that it is difficult to relate a risk ratio to the environmental protection goals, and that the use of fixed safety factors can result in over- as well as underprotective assessments. To reduce uncertainty and increase value relevance in ERA, it has been argued that population models should be used more frequently. In the present study, we have used matrix population models for 3 daphnid species (Ceriodaphnia dubia, Daphnia magna, and D. pulex) to reduce uncertainty and increase value relevance in the ERA of a pesticide (spinosad). The survival rates in the models were reduced in accordance with data from traditional acute mortality tests. As no data on reproductive effects were available, the conservative assumption was made that no reproduction occurred during the exposure period. The models were used to calculate the minimum population size and the time to recovery. These endpoints can be related to the European Union (EU) protection goals for aquatic ecosystems in the vicinity of agricultural fields, which state that reversible population-level effects are acceptable if there is recovery within an acceptable (undefined) time frame. The results of the population models were compared to the acceptable (according to EU documents) toxicity exposure ratio (TER) that was based on the same data. At the acceptable TER, which was based on the most sensitive species (C. dubia
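
    A matrix population model of this kind reduces to a short projection loop: depress the survival entries (and, conservatively, set reproduction to zero) during the exposure window, then project forward and read off the minimum population size and the time to recovery. The stage structure and vital rates below are illustrative, not the daphnid parameterizations used in the study.

```python
import numpy as np

# Illustrative 3-stage (neonate, juvenile, adult) projection matrix:
# top row = fecundities, lower entries = stage transition/survival rates.
A = np.array([[0.0, 2.0, 6.0],
              [0.6, 0.0, 0.0],
              [0.0, 0.7, 0.8]])

def project(A, n0, exposure_days=4, survival_reduction=0.5, horizon=60):
    """Project total abundance; during the exposure window, survival
    entries are reduced and reproduction is (conservatively) zeroed."""
    A_exposed = A.copy()
    A_exposed[0, :] = 0.0                      # no reproduction while exposed
    A_exposed[1:, :] *= (1.0 - survival_reduction)
    n, traj = n0.copy(), []
    for t in range(horizon):
        n = (A_exposed if t < exposure_days else A) @ n
        traj.append(n.sum())
    return np.array(traj)

traj = project(A, n0=np.array([50.0, 30.0, 20.0]))
print("minimum population size:", traj.min())
# First day the total abundance regains its pre-exposure level (100).
print("time to recovery (days):", int(np.argmax(traj >= 100.0)))
```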

  13. Moderate solar geoengineering greatly reduces the largest changes in climate whilst modestly increasing the changes in climate over a small fraction of the Earth

    Science.gov (United States)

    Irvine, P. J.; Keith, D.; He, J.; Vecchi, G.; Horowitz, L. W.

    2017-12-01

    Whilst solar geoengineering reduces global temperature, it cannot perfectly offset the climate effects of elevated CO2 concentrations. Solar geoengineering has been shown to have a greater effect on the global hydrological cycle than CO2, and substantial differences in regional precipitation relative to a scenario without elevated CO2 concentrations have been noted. In this study we evaluate a moderate scenario of solar geoengineering, one which offsets 50% of the forcing from elevated CO2 concentrations, using a 25 km resolution global climate model, and verify these results using the Geoengineering Model Intercomparison Project ensemble. We calculate the fraction of regions that would be better or worse off after solar geoengineering deployment, defining those which see greater absolute change as worse off and vice versa. We find that 51% of the land area would be statistically significantly better off for precipitation and 33% for precipitation minus evaporation (P-E), and that less than 3% would be worse off for precipitation and 1% for P-E. We find that the fraction of the land area experiencing the largest changes in climate, defined as the upper quartile of the CO2-minus-control anomaly, is greatly reduced for precipitation, P-E and 5-day maximum precipitation, and eliminated for mean and maximum annual temperature. The regions made worse off in precipitation or P-E by solar geoengineering typically saw relatively little to no CO2-induced climate change and see relatively little to moderate change in the solar geoengineering scenario. There is little overlap between the regions made worse off in terms of precipitation and P-E: whilst precipitation is reduced in almost all regions made worse off by solar geoengineering, P-E is increased in the majority of regions made worse off. Overall, we find that for each variable considered solar geoengineering greatly reduces the fraction of the world experiencing relatively large change, and that those
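
    The better-off/worse-off bookkeeping reduces to comparing absolute anomalies on a grid, as in the sketch below; the fields are synthetic stand-ins for the model output, and the statistical-significance screening applied in the study is omitted here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-ins for gridded precipitation anomalies relative to the control run.
anom_co2 = rng.normal(0.0, 1.0, size=(180, 360))              # CO2 - control
anom_geo = 0.5 * anom_co2 + rng.normal(0.0, 0.2, (180, 360))  # geoengineered

# Better off = smaller absolute change under geoengineering than under CO2.
better = np.abs(anom_geo) < np.abs(anom_co2)
print("fraction better off:", better.mean())
print("fraction worse off:", 1.0 - better.mean())

# Fraction of the grid still in the extreme-change class (upper quartile of
# the CO2-minus-control anomaly) after geoengineering.
q75 = np.quantile(np.abs(anom_co2), 0.75)
print("extreme-change fraction after geoengineering:",
      (np.abs(anom_geo) > q75).mean())
```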

  14. Reducing uncertainties associated with filter-based optical measurements of light absorbing carbon particles with chemical information

    Science.gov (United States)

    Engström, J. E.; Leck, C.

    2011-08-01

    The presented filter-based optical method for determination of soot (light absorbing carbon or Black Carbon, BC) can be implemented in the field under primitive conditions and at low cost. This enables researchers with small economic means to perform monitoring at remote locations, especially in Asia, where it is much needed. One concern when applying filter-based optical measurements of BC is that they suffer from systematic errors due to the light scattering of non-absorbing particles co-deposited on the filter, such as inorganic salts and mineral dust. In addition to an optical correction for the non-absorbing material, this study provides a protocol for correction of light scattering based on the chemical quantification of the material, which is a novelty. A newly designed photometer, which includes an additional sensor recording backscattered light, was implemented to measure light transmission on particle-accumulating filters. The choice of polycarbonate membrane filters avoided high chemical blank values and reduced errors associated with the length of the light path through the filter. Two correction protocols were applied to aerosol samples collected at the Maldives Climate Observatory Hanimaadhoo during episodes with either continentally influenced air from the Indian/Arabian subcontinents (winter season) or pristine air from the Southern Indian Ocean (summer monsoon). The two ways of correction (optical and chemical) lowered the particle light absorption of BC by 63 and 61 %, respectively, for data from the Arabian Sea sourced group, resulting in median BC absorption coefficients of 4.2 and 3.5 Mm-1. Corresponding values for the South Indian Ocean data were 69 and 97 % (0.38 and 0.02 Mm-1). A comparison with other studies in the area indicated an overestimation of their BC levels, by up to two orders of magnitude. This raises the necessity for chemical correction protocols on optical filter-based determinations of BC, before even the sign on the

  15. Reducing uncertainties associated with filter-based optical measurements of light absorbing carbon particles with chemical information

    Directory of Open Access Journals (Sweden)

    J. E. Engström

    2011-08-01

    Full Text Available The presented filter-based optical method for determination of soot (light absorbing carbon or Black Carbon, BC) can be implemented in the field under primitive conditions and at low cost. This enables researchers with small economic means to perform monitoring at remote locations, especially in Asia, where it is much needed.

    One concern when applying filter-based optical measurements of BC is that they suffer from systematic errors due to the light scattering of non-absorbing particles co-deposited on the filter, such as inorganic salts and mineral dust. In addition to an optical correction for the non-absorbing material, this study provides a protocol for correction of light scattering based on the chemical quantification of the material, which is a novelty. A newly designed photometer, which includes an additional sensor recording backscattered light, was implemented to measure light transmission on particle-accumulating filters. The choice of polycarbonate membrane filters avoided high chemical blank values and reduced errors associated with the length of the light path through the filter.

    Two correction protocols were applied to aerosol samples collected at the Maldives Climate Observatory Hanimaadhoo during episodes with either continentally influenced air from the Indian/Arabian subcontinents (winter season) or pristine air from the Southern Indian Ocean (summer monsoon). The two ways of correction (optical and chemical) lowered the particle light absorption of BC by 63 and 61 %, respectively, for data from the Arabian Sea sourced group, resulting in median BC absorption coefficients of 4.2 and 3.5 Mm−1. Corresponding values for the South Indian Ocean data were 69 and 97 % (0.38 and 0.02 Mm−1). A comparison with other studies in the area indicated an overestimation of their BC levels, by up to two orders of magnitude. This raises the necessity for chemical correction protocols on optical filter

  16. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  17. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  18. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  19. On the need for a time- and location-dependent estimation of the NDSI threshold value for reducing existing uncertainties in snow cover maps at different scales

    Science.gov (United States)

    Härer, Stefan; Bernhardt, Matthias; Siebers, Matthias; Schulz, Karsten

    2018-05-01

    Knowledge of the current snow cover extent is essential for characterizing energy and moisture fluxes at the Earth's surface. The snow-covered area (SCA) is often estimated using optical satellite information in combination with the normalized-difference snow index (NDSI). The NDSI approach applies a threshold to decide whether a satellite pixel is classified as snow covered or snow free. The spatiotemporal representativeness of the standard threshold of 0.4 is, however, questionable at the local scale. Here, we use local snow cover maps derived from ground-based photography to continuously calibrate the NDSI threshold values (NDSIthr) of Landsat satellite images at two European mountain sites over the period from 2010 to 2015. The Research Catchment Zugspitzplatt (RCZ, Germany) and the Vernagtferner area (VF, Austria) are both located within a single Landsat scene. Nevertheless, the long-term analysis demonstrated that the NDSIthr values at the two sites are not correlated (r = 0.17) and differ from the standard threshold of 0.4. For further comparison, a dynamic and locally optimized NDSI threshold was used, as well as another locally optimized literature threshold value (0.7). It was shown that large uncertainties in the prediction of the SCA, of up to 24.1 %, exist in satellite snow cover maps in cases where the standard threshold of 0.4 is used, but a newly developed calibrated quadratic polynomial model which accounts for seasonal threshold dynamics can reduce this error. The model minimizes the SCA uncertainties at the calibration site VF by 50 % in the evaluation period and was also able to improve the results at RCZ in a significant way. Additionally, a scaling experiment shows that the positive effect of a locally adapted threshold diminishes at pixel sizes of 500 m or larger, underlining the general applicability of the standard threshold at larger scales.
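
    The underlying index and its calibration are straightforward to sketch: NDSI is the normalized difference of green and shortwave-infrared reflectance, and a site-specific threshold can be chosen by maximizing pixel-wise agreement with a ground-based snow map. The synthetic reflectances below are assumptions for illustration, not the Landsat/RCZ/VF data, and the study's quadratic seasonal threshold model is not reproduced.

```python
import numpy as np

def ndsi(green, swir):
    """Normalized-difference snow index from green and shortwave-infrared
    reflectance (e.g., Landsat TM bands 2 and 5)."""
    return (green - swir) / (green + swir)

def calibrate_threshold(ndsi_img, snow_truth,
                        candidates=np.arange(-0.2, 0.9, 0.01)):
    """Pick the NDSI threshold maximizing pixel-wise agreement with a
    ground-based snow cover map (e.g., from terrestrial photography)."""
    best_thr, best_acc = None, -1.0
    for thr in candidates:
        acc = np.mean((ndsi_img > thr) == snow_truth)
        if acc > best_acc:
            best_thr, best_acc = thr, acc
    return best_thr, best_acc

# Synthetic illustration: snow pixels are bright in green, dark in SWIR.
rng = np.random.default_rng(5)
truth = rng.uniform(size=(100, 100)) < 0.4
green = np.where(truth, 0.7, 0.3) + rng.normal(0, 0.05, (100, 100))
swir = np.where(truth, 0.1, 0.25) + rng.normal(0, 0.05, (100, 100))

print(calibrate_threshold(ndsi(green, swir), truth))
```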

  20. Genome, transcriptome and methylome sequencing of a primitively eusocial wasp reveal a greatly reduced DNA methylation system in a social insect.

    Science.gov (United States)

    Standage, Daniel S; Berens, Ali J; Glastad, Karl M; Severin, Andrew J; Brendel, Volker P; Toth, Amy L

    2016-04-01

    Comparative genomics of social insects has been intensely pursued in recent years with the goal of providing insights into the evolution of social behaviour and its underlying genomic and epigenomic basis. However, the comparative approach has been hampered by a paucity of data on some of the most informative social forms (e.g. incipiently and primitively social) and taxa (especially members of the wasp family Vespidae) for studying social evolution. Here, we provide a draft genome of the primitively eusocial model insect Polistes dominula, accompanied by analysis of caste-related transcriptome and methylome sequence data for adult queens and workers. Polistes dominula possesses a fairly typical hymenopteran genome, but shows very low genomewide GC content and some evidence of reduced genome size. We found numerous caste-related differences in gene expression, with evidence that both conserved and novel genes are related to caste differences. Most strikingly, these -omics data reveal a major reduction in one of the major epigenetic mechanisms that has been previously suggested to be important for caste differences in social insects: DNA methylation. Along with a conspicuous loss of a key gene associated with environmentally responsive DNA methylation (the de novo DNA methyltransferase Dnmt3), these wasps have greatly reduced genomewide methylation to almost zero. In addition to providing a valuable resource for comparative analysis of social insect evolution, our integrative -omics data for this important behavioural and evolutionary model system call into question the general importance of DNA methylation in caste differences and evolution in social insects. © 2016 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.

  1. Great Apes

    Science.gov (United States)

    Sleeman, Jonathan M.; Cerveny, Shannon

    2014-01-01

    Anesthesia of great apes is often necessary to conduct diagnostic analysis, provide therapeutics, facilitate surgical procedures, and enable transport and translocation for conservation purposes. Due to the stress of remote delivery injection of anesthetic agents, recent studies have focused on oral delivery and/or transmucosal absorption of preanesthetic and anesthetic agents. Maintenance of the airway and provision of oxygen is an important aspect of anesthesia in great ape species. The provision of analgesia is an important aspect of the anesthesia protocol for any procedure involving painful stimuli. Opioids and nonsteroidal anti-inflammatory drugs (NSAIDs) are often administered alone, or in combination to provide multi-modal analgesia. There is increasing conservation management of in situ great ape populations, which has resulted in the development of field anesthesia techniques for free-living great apes for the purposes of translocation, reintroduction into the wild, and clinical interventions.

  2. Reducing the uncertainty of parameters controlling seasonal carbon and water fluxes in Chinese forests and its implication for simulated climate sensitivities

    Science.gov (United States)

    Li, Yue; Yang, Hui; Wang, Tao; MacBean, Natasha; Bacour, Cédric; Ciais, Philippe; Zhang, Yiping; Zhou, Guangsheng; Piao, Shilong

    2017-08-01

    Reducing parameter uncertainty of process-based terrestrial ecosystem models (TEMs) is one of the primary targets for accurately estimating carbon budgets and predicting ecosystem responses to climate change. However, parameters in TEMs are rarely constrained by observations from Chinese forest ecosystems, which are an important carbon sink over northern hemispheric land. In this study, eddy covariance data from six forest sites in China are used to optimize parameters of the ORganizing Carbon and Hydrology In Dynamic EcosystEms (ORCHIDEE) TEM. The model-data assimilation through parameter optimization largely reduces the prior model errors and improves the simulated seasonal cycle and summer diurnal cycle of net ecosystem exchange, latent heat fluxes, gross primary production, and ecosystem respiration. Climate change experiments based on the optimized model indicate that forest net primary production (NPP) is suppressed in response to warming in southern China but stimulated in northeastern China. Altered precipitation has an asymmetric impact on forest NPP at sites in water-limited regions, with the optimization-induced reduction in the response of NPP to precipitation decline being as large as 61% at a deciduous broadleaf forest site. We find that seasonal optimization alters forest carbon cycle responses to environmental change, with the parameter optimization consistently reducing the simulated positive response of heterotrophic respiration to warming. Evaluations against independent observations suggest that improving model structure still matters most for long-term carbon stocks and their changes, in particular nutrient- and age-related changes of photosynthetic rates, carbon allocation, and tree mortality.

  3. Reducing Parental Uncertainty Around Childhood Cancer: Implementation Decisions and Design Trade-Offs in Developing an Electronic Health Record-Linked Mobile App.

    Science.gov (United States)

    Marsolo, Keith; Shuman, William; Nix, Jeremy; Morrison, Caroline F; Mullins, Larry L; Pai, Ahna Lh

    2017-06-26

    Parents of children newly diagnosed with cancer are confronted with multiple stressors that place them at risk for significant psychological distress. One strategy that has been shown to help reduce uncertainty is the provision of basic information; however, families of newly diagnosed cancer patients are often bombarded with educational material. Technology has the potential to help families manage their informational needs and move towards normalization. The aim of this study was to create a mobile app that pulls together data from both the electronic health record (EHR) and vetted external information resources to provide tailored information to parents of newly diagnosed children as one method to reduce the uncertainty around their child's illness. This app was developed to be used by families in a National Institutes of Health (NIH)-funded randomized controlled trial (RCT) aimed at decreasing uncertainty and the subsequent psychological distress. A 2-phase qualitative study was conducted to elicit the features and content of the mobile app based on the needs and experience of parents of children newly diagnosed with cancer and their providers. Example functions include the ability to view laboratory results, look up appointments, and to access educational material. Educational material was obtained from databases maintained by the National Cancer Institute (NCI) as well as from groups like the Children's Oncology Group (COG) and care teams within Cincinnati Children's Hospital Medical Center (CCHMC). The use of EHR-based Web services was explored to allow data like laboratory results to be retrieved in real-time. The ethnographic design process resulted in a framework that divided the content of the mobile app into the following 4 sections: (1) information about the patient's current treatment and other data from the EHR; (2) educational background material; (3) a calendar to view upcoming appointments at their medical center; and (4) a section where
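
    The EHR-linked portion of such an app amounts to authenticated calls against a patient-data web service; since the paper does not publish its interface, the endpoint, field names, and response shape below are entirely hypothetical, sketched only to show the pattern of retrieving laboratory results in real time.

```python
import requests

# Hypothetical endpoint and field names -- the study does not publish its
# web-service interface, so everything below is illustrative only.
EHR_BASE = "https://ehr.example.org/api"

def fetch_recent_labs(patient_id: str, token: str) -> list[dict]:
    """Retrieve recent laboratory results for one patient in real time,
    as the app's EHR section would before rendering them for parents."""
    resp = requests.get(
        f"{EHR_BASE}/patients/{patient_id}/labs",
        headers={"Authorization": f"Bearer {token}"},
        params={"limit": 20},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["results"]

for lab in fetch_recent_labs("12345", token="..."):
    print(lab["name"], lab["value"], lab["unit"])
```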

  4. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  5. Reduced recanalization rates of the great saphenous vein after endovenous laser treatment with increased energy dosing: definition of a threshold for the endovenous fluence equivalent.

    Science.gov (United States)

    Proebstle, Thomas Michael; Moehler, Thomas; Herdemann, Sylvia

    2006-10-01

    Recent reports indicated a correlation between the amount of energy released during endovenous laser treatment (ELT) of the great saphenous vein (GSV) and the success and durability of the procedure. Our objective was to analyze the influence of increased energy dosing on immediate occlusion and recanalization rates after ELT of the GSV. GSVs were treated with either 15 or 30 W of laser power by using a 940-nm diode laser with continuous fiber pullback and tumescent local anesthesia. Patients were followed up prospectively with duplex ultrasonography at day 1 and at 1, 3, 6, and 12 months. A total of 114 GSVs were treated with 15 W, and 149 GSVs were treated with 30 W. The average endovenous fluence equivalents were 12.8 +/- 5.1 J/cm2 and 35.1 +/- 15.6 J/cm2, respectively. GSV occlusion rates according to the method of Kaplan and Meier for the 15- and 30-W groups were 95.6% and 100%, respectively, at day 1, 90.4% and 100% at 3 months, and 82.7% and 97.0% at 12 months after ELT (log-rank; P = .001). An endovenous fluence equivalent exceeding 20 J/cm2 was associated with durable GSV occlusion after 12 months' follow-up, thus suggesting a schedule for dosing of laser energy with respect to the vein diameter. Higher dosing of laser energy shows a 100% immediate success rate and a significantly reduced recanalization rate during 12 months' follow-up.

  6. Great Expectations

    NARCIS (Netherlands)

    Dickens, Charles

    2005-01-01

    One of Dickens's most renowned and enjoyable novels, Great Expectations tells the story of Pip, an orphan boy who wishes to transcend his humble origins and finds himself unexpectedly given the opportunity to live a life of wealth and respectability. Over the course of the tale, in which Pip

  7. Essential information: Uncertainty and optimal control of Ebola outbreaks.

    Science.gov (United States)

    Li, Shou-Li; Bjørnstad, Ottar N; Ferrari, Matthew J; Mummah, Riley; Runge, Michael C; Fonnesbeck, Christopher J; Tildesley, Michael J; Probert, William J M; Shea, Katriona

    2017-05-30

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.
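
    The value-of-information calculation has a compact form: compare the expected caseload under the action that is best on average across models with the expected caseload when the best action can be chosen per model. The caseload matrix below is randomly generated for illustration, not the published 37-model ensemble.

```python
import numpy as np

rng = np.random.default_rng(11)

# Stand-in for 37 models x 5 candidate interventions: projected caseload
# for each (model, action) pair.
caseload = rng.lognormal(mean=8.0, sigma=1.0, size=(37, 5))

# Without further information: pick the action best on average across models.
best_under_uncertainty = caseload.mean(axis=0).min()

# With perfect information about which model is correct: pick the best
# action per model, then average over models.
best_with_perfect_info = caseload.min(axis=1).mean()

evpi = best_under_uncertainty - best_with_perfect_info
print("expected caseload reduction from resolving model uncertainty:",
      round(100 * evpi / best_under_uncertainty, 1), "%")
```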

  8. Essential information: Uncertainty and optimal control of Ebola outbreaks

    Science.gov (United States)

    Li, Shou-Li; Bjornstad, Ottar; Ferrari, Matthew J.; Mummah, Riley; Runge, Michael C.; Fonnesbeck, Christopher J.; Tildesley, Michael J.; Probert, William J. M.; Shea, Katriona

    2017-01-01

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.

  9. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  10. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have explored particularly ingenious ways of reasoning in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. Those methods, while drawing inspiration from quantitative techniques, do not require the kind of complete numerical information that such techniques demand. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  11. Proton Arc Reduces Range Uncertainty Effects and Improves Conformality Compared With Photon Volumetric Modulated Arc Therapy in Stereotactic Body Radiation Therapy for Non-Small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Seco, Joao, E-mail: jseco@partners.org [Francis H. Burr Proton Therapy Center, Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States); Gu, Guan; Marcelos, Tiago; Kooy, Hanne; Willers, Henning [Francis H. Burr Proton Therapy Center, Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts (United States)

    2013-09-01

    Purpose: To describe, in a setting of non-small cell lung cancer (NSCLC), the theoretical dosimetric advantages of proton arc stereotactic body radiation therapy (SBRT) in which the beam penumbra of a rotating beam is used to reduce the impact of range uncertainties. Methods and Materials: Thirteen patients with early-stage NSCLC treated with proton SBRT underwent repeat planning with photon volumetric modulated arc therapy (Photon-VMAT) and an in-house-developed arc planning approach for both proton passive scattering (Passive-Arc) and intensity modulated proton therapy (IMPT-Arc). An arc was mimicked with a series of beams placed at 10° increments. Tumor and organ at risk doses were compared in the context of high- and low-dose regions, represented by volumes receiving >50% and <50% of the prescription dose, respectively. Results: In the high-dose region, conformality index values are 2.56, 1.91, 1.31, and 1.74, and homogeneity index values are 1.29, 1.22, 1.52, and 1.18, respectively, for 3 proton passive scattered beams, Passive-Arc, IMPT-Arc, and Photon-VMAT. Therefore, proton arc leads to a 30% reduction in the 95% isodose line volume relative to the 3-beam proton plan, sparing surrounding organs, such as lung and chest wall. For chest wall, V30 is reduced from 21 cm3 (3 proton beams) to 11.5 cm3, 12.9 cm3, and 8.63 cm3 (P=.005) for Passive-Arc, IMPT-Arc, and Photon-VMAT, respectively. In the low-dose region, the mean lung dose and V20 of the ipsilateral lung are 5.01 Gy(relative biological effectiveness [RBE]), 4.38 Gy(RBE), 4.91 Gy(RBE), and 5.99 Gy(RBE) and 9.5%, 7.5%, 9.0%, and 10.0%, respectively, for 3-beam, Passive-Arc, IMPT-Arc, and Photon-VMAT. Conclusions: Stereotactic body radiation therapy with proton arc and Photon-VMAT generate significantly more conformal high-dose volumes than standard proton SBRT, without loss of coverage of the tumor and with significant sparing of nearby organs, such as chest wall. In addition

  12. Great Lakes

    Science.gov (United States)

    Edsall, Thomas A.; Mac, Michael J.; Opler, Paul A.; Puckett Haecker, Catherine E.; Doran, Peter D.

    1998-01-01

    The Great Lakes region, as defined here, includes the Great Lakes and their drainage basins in Minnesota, Wisconsin, Illinois, Indiana, Ohio, Pennsylvania, and New York. The region also includes the portions of Minnesota, Wisconsin, and the 21 northernmost counties of Illinois that lie in the Mississippi River drainage basin, outside the floodplain of the river. The region spans about 9º of latitude and 20º of longitude and lies roughly halfway between the equator and the North Pole in a lowland corridor that extends from the Gulf of Mexico to the Arctic Ocean.The Great Lakes are the most prominent natural feature of the region (Fig. 1). They have a combined surface area of about 245,000 square kilometers and are among the largest, deepest lakes in the world. They are the largest single aggregation of fresh water on the planet (excluding the polar ice caps) and are the only glacial feature on Earth visible from the surface of the moon (The Nature Conservancy 1994a).The Great Lakes moderate the region’s climate, which presently ranges from subarctic in the north to humid continental warm in the south (Fig. 2), reflecting the movement of major weather masses from the north and south (U.S. Department of the Interior 1970; Eichenlaub 1979). The lakes act as heat sinks in summer and heat sources in winter and are major reservoirs that help humidify much of the region. They also create local precipitation belts in areas where air masses are pushed across the lakes by prevailing winds, pick up moisture from the lake surface, and then drop that moisture over land on the other side of the lake. The mean annual frost-free period—a general measure of the growing-season length for plants and some cold-blooded animals—varies from 60 days at higher elevations in the north to 160 days in lakeshore areas in the south. The climate influences the general distribution of wild plants and animals in the region and also influences the activities and distribution of the human

  13. Global warming likely reduces crop yield and water availability of the dryland cropping systems in the U.S. central Great Plains

    Science.gov (United States)

    We investigated impacts of GCM-projected climate change on dryland crop rotations of wheat-fallow and wheat-corn-fallow in the Central Great Plains (Akron in Colorado, USA) using the CERES 4.0 crop modules in RZWQM2. The climate change scenarios for CO2, temperature, and precipitation were produced ...

  14. Value-of-Information Analysis to Reduce Decision Uncertainty Associated with the Choice of Thromboprophylaxis after Total Hip Replacement in the Irish Healthcare Setting.

    LENUS (Irish Health Repository)

    McCullagh, Laura

    2012-06-05

    Background: The National Centre for Pharmacoeconomics, in collaboration with the Health Services Executive, considers the cost effectiveness of all new medicines introduced into Ireland. Health Technology Assessments (HTAs) are conducted in accordance with the existing agreed Irish HTA guidelines. These guidelines do not specify a formal analysis of value of information (VOI). Objective: The aim of this study was to demonstrate the benefits of using VOI analysis in decreasing decision uncertainty and to examine the viability of applying these techniques as part of the formal HTA process for reimbursement purposes within the Irish healthcare system. Method: The evaluation was conducted from the Irish health payer perspective. A lifetime model evaluated the cost effectiveness of rivaroxaban, dabigatran etexilate and enoxaparin sodium for the prophylaxis of venous thromboembolism after total hip replacement. The expected value of perfect information (EVPI) was determined directly from the probabilistic sensitivity analysis (PSA). Population-level EVPI (PEVPI) was determined by scaling up the EVPI according to the decision incidence. The expected value of perfect parameter information (EVPPI) was calculated for the three model parameter subsets: probabilities, preference weights and direct medical costs. Results: In the base-case analysis, rivaroxaban dominated both dabigatran etexilate and enoxaparin sodium. PSA indicated that rivaroxaban had the highest probability of being the most cost-effective strategy over a threshold range of €0-€100 000 per QALY. At a threshold of €45 000 per QALY, the probability that rivaroxaban was the most cost-effective strategy was 67%. At a threshold of €45 000 per QALY, assuming a 10-year decision time horizon, the PEVPI was €11.96 million and the direct medical costs subset had the highest EVPPI value (€9.00 million at a population level). In order to decrease uncertainty, a more detailed costing

  15. Reducing uncertainty in value-based pricing using evidence development agreements: the case of continuous intraduodenal infusion of levodopa/carbidopa (Duodopa®) in Sweden.

    Science.gov (United States)

    Willis, Michael; Persson, Ulf; Zoellner, York; Gradl, Birgit

    2010-01-01

    unaffected). The manufacturer continued to collect data and to improve the economic model and re-submitted in 2008. New data and the improved model resulted in reduced uncertainty and a lower cost-effectiveness ratio in the range of Swedish kronor (SEK)430,000 per QALY gained in the base-case analysis, ranging up to SEK900,000 in the most conservative sensitivity analysis, resulting in reimbursement being granted. The case of Duodopa® provides excellent insight into VBP reimbursement decision making in combination with CED and ex post review in actual practice. Publicly available decisions document the rigorous, time-consuming process (four iterations were required before a final decision could be reached). The data generated as part of the risk-sharing agreement proved correct the initial decision to grant limited coverage despite lack of economic data. Access was provided to 100 patients while evidence was generated. Economic appraisal differs from clinical assessment, and decision makers benefit from analysis of naturalistic, actual practice data. Despite reviewing the initial trial-based, 'piggy-back' economic analysis, TLV was uncertain of the cost effectiveness in actual practice and deferred a final decision until observational data from the DAPHNE study became available. Second, acceptance of economic modelling and use of temporary reimbursement conditional on additional evidence development provide a mechanism for risk sharing between TLV and manufacturers, which enabled patient access to a drug with proven clinical benefit while necessary evidence to support claims of cost effectiveness could be generated.

  16. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
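
    The assimilation engine can be sketched as a single stochastic ensemble Kalman analysis step on a generic state vector; the iterative framework above repeats such updates, augments the state with Reynolds-stress perturbation parameters, and uses a CFD solver as the forward model, none of which is reproduced in this toy version with a linear observation operator.

```python
import numpy as np

rng = np.random.default_rng(13)

def enkf_update(ensemble, H, y_obs, obs_var):
    """One stochastic ensemble Kalman analysis step.

    ensemble : (n_state, n_members) prior samples of the state
               (e.g., Reynolds-stress perturbation parameters).
    H        : (n_obs, n_state) linear(ized) observation operator.
    y_obs    : (n_obs,) sparse observations (e.g., velocities).
    """
    n_obs, n_members = len(y_obs), ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)
    # Kalman gain from the ensemble covariances.
    P_hh = HXp @ HXp.T / (n_members - 1) + obs_var * np.eye(n_obs)
    P_xh = X @ HXp.T / (n_members - 1)
    K = P_xh @ np.linalg.solve(P_hh, np.eye(n_obs))
    # Perturbed observations keep the posterior spread consistent.
    Y = y_obs[:, None] + rng.normal(0, np.sqrt(obs_var), (n_obs, n_members))
    return ensemble + K @ (Y - HX)

# Tiny synthetic check: 3 state variables, 2 observed combinations.
H = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
truth = np.array([1.0, -0.5, 2.0])
prior = rng.normal(0, 1, (3, 100))
post = enkf_update(prior, H, H @ truth, obs_var=0.01)
print(post.mean(axis=1))  # posterior mean moves toward the observed truth
```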

  17. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
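
    The core update in the framework described above, the iterative ensemble Kalman analysis step, can be sketched compactly. The following is a simplified, generic stand-in (not the authors' code): a linear toy forward model replaces the RANS solve, the dimensions and noise levels are invented, and a plain perturbed-observation update repeated a few times stands in for the full iterative scheme.

```python
import numpy as np

def enkf_update(X, y_obs, forward, r_obs):
    """One perturbed-observation ensemble Kalman analysis step.

    X      : (n_state, n_ens) ensemble of state vectors (in the paper's
             setting, coefficients of the Reynolds-stress perturbations).
    y_obs  : (n_obs,) observations (e.g., sparse velocity measurements).
    forward: callable mapping one state vector to predicted observations
             (a stand-in for the RANS solve).
    r_obs  : observation error variance (scalar, for simplicity).
    """
    n_ens = X.shape[1]
    HX = np.column_stack([forward(X[:, j]) for j in range(n_ens)])
    Xa = X - X.mean(axis=1, keepdims=True)      # state anomalies
    HXa = HX - HX.mean(axis=1, keepdims=True)   # predicted-obs anomalies
    Pxy = Xa @ HXa.T / (n_ens - 1)              # state-obs covariance
    Pyy = HXa @ HXa.T / (n_ens - 1) + r_obs * np.eye(len(y_obs))
    K = Pxy @ np.linalg.inv(Pyy)                # Kalman gain
    rng = np.random.default_rng(0)              # perturbed observations
    Y = y_obs[:, None] + rng.normal(0.0, np.sqrt(r_obs), HX.shape)
    return X + K @ (Y - HX)

# Toy demonstration: recover a 3-parameter state from 3 noisy observations.
H = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 1.0],
              [0.3, 0.0, 1.0]])                 # linear "forward model"
true_state = np.array([1.0, -0.5, 0.25])
rng = np.random.default_rng(1)
obs = H @ true_state + rng.normal(0.0, 0.01, 3)
ens = rng.normal(0.0, 1.0, (3, 100))            # prior ensemble
for _ in range(5):                              # crude iteration
    ens = enkf_update(ens, obs, lambda x: H @ x, r_obs=1e-4)
print(ens.mean(axis=1), "vs", true_state)       # posterior mean ~ truth
```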

  18. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  19. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...
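
    The record above is truncated, but the underlying problem, propagating the uncertainty of a fitted calibration curve into the result for an unknown sample, can be illustrated with the classical straight-line inverse-prediction formula. This is a generic textbook sketch, not the novel approach the authors propose, and the Pb/ICP-AES numbers are invented:

```python
import numpy as np

# Calibration standards: known Pb concentrations (x) and instrument
# responses (y). All numbers are invented for illustration.
x = np.array([0.0, 0.5, 1.0, 2.0, 4.0])            # mg/L
y = np.array([0.002, 0.101, 0.198, 0.402, 0.797])  # signal (arbitrary units)

n = len(x)
slope, intercept = np.polyfit(x, y, 1)
resid = y - (slope * x + intercept)
s_y = np.sqrt(np.sum(resid**2) / (n - 2))          # residual std deviation
Sxx = np.sum((x - x.mean())**2)

# Unknown sample measured with m replicate readings
y_unknown = np.array([0.30, 0.31, 0.29])
m = len(y_unknown)
x0 = (y_unknown.mean() - intercept) / slope        # inverse prediction

# Standard uncertainty of x0 (classical inverse-prediction formula)
u_x0 = (s_y / slope) * np.sqrt(
    1.0 / m + 1.0 / n
    + (y_unknown.mean() - y.mean()) ** 2 / (slope**2 * Sxx))
print(f"c = {x0:.3f} +/- {u_x0:.3f} mg/L (1 standard uncertainty)")
```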

  20. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  1. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  2. Maternal exposure to Great Lakes sport-caught fish and dichlorodiphenyl dichloroethylene, but not polychlorinated biphenyls, is associated with reduced birth weight

    International Nuclear Information System (INIS)

    Weisskopf, M.G.; Anderson, H.A.; Hanrahan, L.P.; Kanarek, M.S.; Falk, C.M.; Steenport, D.M.; Draheim, L.A.

    2005-01-01

    Fish consumption may be beneficial for a developing human fetus, but fish may also contain contaminants that could be detrimental. Great Lakes sport-caught fish (GLSCF) are contaminated with polychlorinated biphenyls (PCBs) and dichlorodiphenyl dichloroethylene (DDE), but the effects of these contaminants on birth outcome are not clear. To distinguish potential contaminant effects, we examined (1) whether the decrease over time in contaminant levels in GLSCF is paralleled by an increase in birth weight of children of GLSCF-consuming mothers and (2) the relation between maternal serum concentrations of these contaminants and birth weight. Mothers (n=511) were interviewed from 1993 to 1995, and maternal serum was collected from 1994 to 1995 (n=143). Potential confounders considered were child gender, maternal age at delivery, maternal prepregnancy body mass index, maternal cigarette and alcohol use during pregnancy, maternal education level, maternal parity, and maternal breastfeeding. Children born during 1970-1977, 1978-1984, and 1985-1993 to mothers who ate more than 116 meals of GLSCF before pregnancy were, on average, 164 g lighter, 46 g heavier, and 134 g heavier, respectively, than children of mothers who ate no GLSCF before pregnancy (P trend=0.05). GLSCF-consuming mothers had higher serum PCB and DDE concentrations, but only increased DDE was associated with lower birth weight. The data suggest that fetal DDE exposure (as indicated by maternal serum DDE concentration) may decrease birth weight and that decreased birth weight effects associated with GLSCF consumption have decreased over time

  3. The Effect of Reduced Water Availability in the Great Ruaha River on the Vulnerable Common Hippopotamus in the Ruaha National Park, Tanzania.

    Directory of Open Access Journals (Sweden)

    Claudia Stommel

    In semi-arid environments, 'permanent' rivers are essential sources of surface water for wildlife during 'dry' seasons when rainfall is limited or absent, particularly for species whose resilience to water scarcity is low. The hippopotamus (Hippopotamus amphibius) requires submersion in water to aid thermoregulation and prevent skin damage by solar radiation; the largest threat to its viability is human alteration of aquatic habitats. In the Ruaha National Park (NP), Tanzania, the Great Ruaha River (GRR) is the main source of surface water for wildlife during the dry season. Recent, large-scale water extraction from the GRR by people upstream of Ruaha NP is thought to be responsible for a profound decrease in dry-season water flow and the absence of surface water along large sections of the river inside the NP. We investigated the impact of decreased water flow on daytime hippo distribution using regular censuses at monitoring locations, transects and camera-trap records along a 104 km section of the GRR within the Ruaha NP during two dry seasons. The minimum number of hippos per monitoring location increased with the expanse of surface water as the dry seasons progressed, and was not affected by water quality. Hippo distribution changed significantly throughout the dry season, leading to the accumulation of large numbers in very few locations. If surface water loss from the GRR continues to increase in future years, this will have serious implications for the hippo population and other water-dependent species in Ruaha NP.

  4. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (candidates include Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: (1) Model uncertainties are estimated correctly only when computed with the full covariance matrix, because the parameters are highly correlated. (2) There is no evidence that any single parameter dominates in any model. (3) Model error and data error contribute comparably to the final correction error. (4) We tested the uncertainty module on simulated and real data sets, and find that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases. (5) The Lommel-Seeliger (L-S) model is more reliable than the others, perhaps because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov behaves unphysically on SOPIE 1 data. (6) L-S is therefore our default choice; this conclusion is based mainly on our tests on SOPIE data and IPDIF.
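
    Of the candidate models named above, the Lommel-Seeliger disk function is the simplest: it depends only on the cosines of the incidence and emission angles. A hedged sketch of a disk-function correction to a standard viewing geometry follows; the choice of standard geometry (i = 30 deg, e = 0 deg), the input numbers, and the omission of the phase-angle term are all illustrative simplifications, not the OSIRIS-REx pipeline:

```python
import numpy as np

def ls_disk(mu0, mu):
    """Lommel-Seeliger disk function mu0 / (mu0 + mu), with
    mu0 = cos(incidence angle) and mu = cos(emission angle)."""
    return mu0 / (mu0 + mu)

def correct_to_standard(radf, inc_deg, emi_deg, inc_std=30.0, emi_std=0.0):
    """Normalize an observed radiance factor to a standard geometry.
    Phase-angle dependence is deliberately ignored in this sketch."""
    mu0, mu = np.cos(np.radians(inc_deg)), np.cos(np.radians(emi_deg))
    mu0s, mus = np.cos(np.radians(inc_std)), np.cos(np.radians(emi_std))
    return radf * ls_disk(mu0s, mus) / ls_disk(mu0, mu)

# An observed radiance factor of 0.032 at i = 55 deg, e = 20 deg
print(correct_to_standard(0.032, inc_deg=55.0, emi_deg=20.0))
```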

  5. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    Science.gov (United States)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior: the amount of uncertainty first inflates, and subsequently decreases, with the growth of decoherence strength in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of the uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer insight into the dynamics and steering of entropic uncertainty in open systems.

  6. REDUCING UNCERTAINTIES IN MODEL PREDICTIONS VIA HISTORY MATCHING OF CO2 MIGRATION AND REACTIVE TRANSPORT MODELING OF CO2 FATE AT THE SLEIPNER PROJECT

    Energy Technology Data Exchange (ETDEWEB)

    Zhu, Chen

    2015-03-31

    An important question for the Carbon Capture, Storage, and Utility program is “can we adequately predict the CO2 plume migration?” For tracking CO2 plume development, the Sleipner project in the Norwegian North Sea provides more time-lapse seismic monitoring data than any other site, but significant uncertainties still exist for some of the reservoir parameters. In Part I, we assessed model uncertainties by applying two multi-phase compositional simulators to the Sleipner Benchmark model for the uppermost layer (Layer 9) of the Utsira Sand and calibrated our model against the time-lapse seismic monitoring data for the site from 1999 to 2010. An approximate match with the observed plume was achieved by introducing lateral permeability anisotropy, adding CH4 into the CO2 stream, and adjusting the reservoir temperatures. Model-predicted gas saturation, CO2 accumulation thickness, and CO2 solubility in brine—none of which were used as calibration metrics—were all comparable with the interpretations of the seismic data in the literature. In Parts II and III, we evaluated the uncertainties of the predicted long-term CO2 fate up to 10,000 years due to uncertain reaction kinetics. Under four scenarios of kinetic rate laws, the temporal and spatial evolution of CO2 partitioning into the four trapping mechanisms (hydrodynamic/structural, solubility, residual/capillary, and mineral) was simulated with ToughReact, taking into account the CO2-brine-rock reactions and the multi-phase reactive flow and mass transport. Modeling results show that different rate laws for mineral dissolution and precipitation reactions resulted in different predicted amounts of CO2 trapped by carbonate minerals, with scenarios using the conventional linear rate law for feldspar dissolution having twice as much mineral trapping (21% of the injected CO2) as scenarios with a Burch-type or Alekseyev et al.–type rate law for feldspar dissolution (11%). So far, most reactive transport modeling (RTM) studies for

  7. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
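
    The closing remark about linear models Ax = b can be made concrete. For a response R = c^T x, the adjoint solution lambda of A^T lambda = c gives every sensitivity dR/db_i in a single extra solve, with no adjoint problem formulated separately from the primal system. A minimal sketch (matrix and response vector are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # well-conditioned system
b = rng.normal(size=n)
c = rng.normal(size=n)                        # response weights: R = c @ x

# Forward problem and response
x = np.linalg.solve(A, b)
R = c @ x

# Adjoint problem: one solve gives dR/db_i for every i
lam = np.linalg.solve(A.T, c)                 # A^T lambda = c

# Verify against brute-force finite differences (n extra solves)
eps = 1e-6
fd = np.array([(c @ np.linalg.solve(A, b + eps * np.eye(n)[i]) - R) / eps
               for i in range(n)])
print(np.allclose(lam, fd, atol=1e-5))        # True
```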

  8. Investment and uncertainty

    DEFF Research Database (Denmark)

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty surrounding expected profits indicated by share price volatility, were the chief influences on investment levels, and that heightened share price volatility played the dominant role in the crucial investment collapse in 1930. Investment did not simply follow the downward course of income at the onset...

  9. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    International Nuclear Information System (INIS)

    Mishra, U; Jastrow, J D; Matamala, R; Fan, Z; Miller, R M; Hugelius, G; Kuhry, P; Koven, C D; Riley, W J; Harden, J W; Ping, C L; Michaelson, G J; McGuire, A D; Tarnocai, C; Schaefer, K; Schuur, E A G; Jorgenson, M T; Hinzman, L D

    2013-01-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges. (letter)

  10. Low-dose biplanar radiography can be used in children and adolescents to accurately assess femoral and tibial torsion and greatly reduce irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Meyrignac, Olivier; Baunin, Christiane; Vial, Julie; Sans, Nicolas [CHU Toulouse Purpan, Department of Radiology, Toulouse Cedex 9 (France); Moreno, Ramiro [ALARA Expertise, Oberhausbergen (France); Accadbled, Franck; Gauzy, Jerome Sales de [Hopital des Enfants, Department of Orthopedics, Toulouse Cedex 9 (France); Sommet, Agnes [Universite Paul Sabatier, Department of Fundamental Pharmaco-Clinical Pharmacology, Toulouse (France)

    2015-06-01

    The aim was to evaluate, in children, the agreement between femoral and tibial torsion measurements obtained with low-dose biplanar radiography (LDBR) and CT, and to study the dose reduction ratio between these two techniques both in vitro and in vivo. Thirty children with lower-limb torsion abnormalities were included in a prospective study. Biplanar radiographs and CTs were performed for measurements of lower-limb torsion on each patient. Values were compared using Bland-Altman plots. Interreader and intrareader agreements were evaluated by intraclass correlation coefficients. A comparative dosimetric study was performed using an ionization chamber in a tissue-equivalent phantom, and with thermoluminescent dosimeters in 5 patients. Average differences between CT and LDBR measurements were -0.1° ± 1.1° for femoral torsion and -0.7° ± 1.4° for tibial torsion. Interreader agreement for LDBR measurements was very good for both femoral torsion (FT) (0.81) and tibial torsion (TT) (0.87). Intrareader agreement was excellent for FT (0.97) and TT (0.89). The ratio between the CT scan dose and the LDBR dose was 22 in vitro (absorbed dose) and 32 in vivo (skin dose). Lower-limb torsion measurements obtained with LDBR are comparable to CT measurements in children and adolescents, with a considerably reduced radiation dose. (orig.)
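
    Agreement between the two modalities was assessed with Bland-Altman plots; the statistics behind such a plot, the mean difference (bias) and the 95% limits of agreement, are simple to compute. A sketch with invented paired torsion measurements:

```python
import numpy as np

# Paired torsion measurements (degrees); values are illustrative only.
ct   = np.array([18.2, 25.1, 30.4, 12.7, 22.3, 27.9, 15.5, 20.8])
ldbr = np.array([18.0, 25.9, 30.1, 12.5, 22.9, 28.3, 15.2, 21.1])

diff = ldbr - ct
bias = diff.mean()                          # mean difference (bias)
sd = diff.std(ddof=1)                       # SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias = {bias:.2f} deg, limits of agreement = "
      f"[{loa[0]:.2f}, {loa[1]:.2f}] deg")
```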

  11. Evaluation of alternative planting strategies to reduce wheat stem sawfly (Hymenoptera: Cephidae) damage to spring wheat in the northern Great Plains.

    Science.gov (United States)

    Beres, B L; Cárcamo, H A; Bremer, E

    2009-12-01

    Wheat, Triticum aestivum L., producers are often reluctant to use solid-stemmed wheat cultivars resistant to wheat stem sawfly, Cephus cinctus Norton (Hymenoptera: Cephidae), due to concerns regarding yield, efficacy or market opportunities. We evaluated the impact of several planting strategies on wheat yield and quality and wheat stem sawfly infestation at two locations over a three-year period. Experimental units consisted of large plots (50 by 200 m) located on commercial farms adjacent to wheat stem sawfly-infested fields. Compared with a monoculture of a hollow-stemmed cultivar ('AC Barrie'), planting a monoculture of a solid-stemmed cultivar ('AC Eatonia') increased yield by an average of 16% (0.4 Mg ha(-1)) and increased the grade of wheat by one unit at the two most heavily infested site-years. Planting a 1:1 blend of AC Eatonia and AC Barrie increased yield by an average of 11%, whereas planting 20- or 40-m plot margins to AC Eatonia increased yield by an average of 8%. High wheat stem sawfly pressure limited the effectiveness of using resistant cultivars in field margins because plants were often infested beyond the plot margin, with uniform infestation down the length of the plots at the two most heavily infested site-years. The effectiveness of AC Eatonia to reduce wheat stem sawfly survivorship was modest in this study, probably due to weather-related factors influencing pith expression and to the high abundance of wheat stem sawfly. Greater benefits from planting field margins to resistant cultivars or planting a blend of resistant and susceptible cultivars might be achievable under lower wheat stem sawfly pressure.

  12. Use of a molecular diagnostic test in AFB smear positive tuberculosis suspects greatly reduces time to detection of multidrug resistant tuberculosis.

    Directory of Open Access Journals (Sweden)

    Nestani Tukvadze

    The WHO has recommended the implementation of rapid diagnostic tests to detect and help combat M/XDR tuberculosis (TB). There are limited data on the performance and impact of these tests in field settings. The performance of the commercially available Genotype MTBDRplus molecular assay was compared to conventional methods including AFB smear, culture and drug susceptibility testing (DST) using both an absolute concentration method on Löwenstein-Jensen media and a broth-based method using the MGIT 960 system. Sputum specimens were obtained from TB suspects in the country of Georgia who received care through the National TB Program. Among 500 AFB smear-positive sputum specimens, 458 (91.6%) had both a positive sputum culture for Mycobacterium tuberculosis and a valid MTBDRplus assay result. The MTBDRplus assay detected isoniazid (INH) resistance directly from the sputum specimen in 159 (89.8%) of 177 specimens and MDR-TB in 109 (95.6%) of 114 specimens compared to conventional methods. There was high agreement between the MTBDRplus assay and conventional DST results in detecting MDR-TB (kappa = 0.95, p<0.01). The most prevalent INH resistance mutation was S315T (78%) in katG and the most common rifampicin resistance mutation was S531L (68%) in rpoB. Among 13 specimens from TB suspects with negative sputum cultures, 7 had a positive MTBDRplus assay (3 with MDR-TB). The time to detection of MDR-TB was significantly less using the MTBDRplus assay (4.2 days) compared to the use of standard phenotypic tests (67.3 days with solid media and 21.6 days with broth-based media). Compared to conventional methods, the MTBDRplus assay had high accuracy and significantly reduced the time to detection of MDR-TB in an area with high MDR-TB prevalence. The use of rapid molecular diagnostic tests for TB and drug resistance should increase the proportion of patients promptly placed on appropriate therapy.
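
    The reported agreement statistic (kappa = 0.95) is Cohen's kappa, computed from the 2x2 table of MDR-TB calls by the two methods. A sketch follows; the cell counts are invented for illustration and chosen only to give a kappa near the reported value:

```python
import numpy as np

def cohens_kappa(table):
    """Cohen's kappa for a square contingency table
    (rows: method A, columns: method B)."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    p_observed = np.trace(table) / n
    # Expected agreement if the two methods were independent
    p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Illustrative 2x2 table: MDR-TB calls by MTBDRplus (rows) vs
# conventional DST (columns); counts are invented, not the study's.
table = [[109, 5],     # assay positive: concordant / discordant
         [5, 339]]     # assay negative
print(round(cohens_kappa(table), 2))   # ~0.94
```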

  13. 'Nature is unknowable'. The idea of uncertainty

    International Nuclear Information System (INIS)

    Crozon, M.

    2000-01-01

    This paper deals with one of the great ideas of the twentieth century, Heisenberg's uncertainty principle. Taking a philosophical approach, the author explains this principle and presents its cultural impact on thought. (A.L.B.)

  14. Incorporating uncertainty analysis into life cycle estimates of greenhouse gas emissions from biomass production

    International Nuclear Information System (INIS)

    Johnson, David R.; Willis, Henry H.; Curtright, Aimee E.; Samaras, Constantine; Skone, Timothy

    2011-01-01

    Before further investments are made in utilizing biomass as a source of renewable energy, both policy makers and the energy industry need estimates of the net greenhouse gas (GHG) reductions expected from substituting biobased fuels for fossil fuels. Such GHG reductions depend greatly on how the biomass is cultivated, transported, processed, and converted into fuel or electricity. Any policy aiming to reduce GHGs with biomass-based energy must account for uncertainties in emissions at each stage of production, or else it risks yielding marginal reductions, if any, while potentially imposing great costs. This paper provides a framework for incorporating uncertainty analysis specifically into estimates of the life cycle GHG emissions from the production of biomass. We outline the sources of uncertainty, discuss the implications of uncertainty and variability on the limits of life cycle assessment (LCA) models, and provide a guide for practitioners to best practices in modeling these uncertainties. The suite of techniques described herein can be used to improve the understanding and the representation of the uncertainties associated with emissions estimates, thus enabling improved decision making with respect to the use of biomass for energy and fuel production. -- Highlights: → We describe key model, scenario and data uncertainties in LCAs of biobased fuels. → System boundaries and allocation choices should be consistent with study goals. → Scenarios should be designed around policy levers that can be controlled. → We describe a new way to analyze the importance of covariance between inputs.
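
    The Monte Carlo propagation the paper recommends follows a standard pattern: draw each life-cycle input from its distribution, include any covariance between inputs, sum the stages, and report percentiles rather than a point value. A toy sketch in which every stage, distribution, and coupling is invented:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Per-stage emission factors, kg CO2e per GJ of fuel (illustrative only)
cultivation = rng.lognormal(mean=np.log(15.0), sigma=0.30, size=n)
transport   = rng.lognormal(mean=np.log(3.0),  sigma=0.20, size=n)
conversion  = rng.lognormal(mean=np.log(10.0), sigma=0.25, size=n)

# Crude example of covariance between inputs: fertilizer-intensive
# cultivation also raises N2O-driven field emissions.
field_n2o = 5.0 + 0.2 * (cultivation - 15.0) + rng.normal(0, 1.0, n)

total = cultivation + transport + conversion + field_n2o
lo, med, hi = np.percentile(total, [5, 50, 95])
print(f"life-cycle GHG: median {med:.1f}, 90% interval "
      f"[{lo:.1f}, {hi:.1f}] kg CO2e/GJ")
```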

  15. Reducing Transaction Costs for Energy Efficiency Investments and Analysis of Economic Risk Associated With Building Performance Uncertainties: Small Buildings and Small Portfolios Program

    Energy Technology Data Exchange (ETDEWEB)

    Langner, R.; Hendron, B.; Bonnema, E.

    2014-08-01

    The small buildings and small portfolios (SBSP) sector faces a number of barriers that inhibit SBSP owners from adopting energy efficiency solutions. This pilot project focused on overcoming two of the largest barriers to financing energy efficiency in small buildings: disproportionately high transaction costs and unknown or unacceptable risk. Solutions to these barriers can often be at odds, because inexpensive turnkey solutions are often not sufficiently tailored to the unique circumstances of each building, reducing confidence that the expected energy savings will be achieved. To address these barriers, NREL worked with two innovative, forward-thinking lead partners, Michigan Saves and Energi, to develop technical solutions that provide a quick and easy process to encourage energy efficiency investments while managing risk. The pilot project was broken into two stages: the first stage focused on reducing transaction costs, and the second stage focused on reducing performance risk. In the first stage, NREL worked with the non-profit organization Michigan Saves to analyze the effects of 8 energy efficiency measures (EEMs) on 81 different baseline small office building models in Holland, Michigan (climate zone 5A). The results of this analysis (totaling over 30,000 cases) are summarized in a simple spreadsheet tool that enables users to easily sort through the results and find appropriate small office EEM packages that meet a particular energy savings threshold and are likely to be cost-effective.

  16. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called 'conservative' assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the 'reasonable assurance' approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  17. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    International Nuclear Information System (INIS)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic; Mollicone, Danilo; Federici, Sandro

    2008-01-01

    A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation

  18. Applying the conservativeness principle to REDD to deal with the uncertainties of the estimates

    Energy Technology Data Exchange (ETDEWEB)

    Grassi, Giacomo; Monni, Suvi; Achard, Frederic [Institute for Environment and Sustainability, Joint Research Centre of the European Commission, I-21020 Ispra (Italy); Mollicone, Danilo [Department of Geography, University of Alcala de Henares, Madrid (Spain); Federici, Sandro

    2008-07-15

    A common paradigm when the reduction of emissions from deforestation is estimated for the purpose of promoting it as a mitigation option in the context of the United Nations Framework Convention on Climate Change (UNFCCC) is that high uncertainties in input data (i.e., area change and C stock change per unit area) may seriously undermine the credibility of the estimates and therefore of reduced deforestation as a mitigation option. In this paper, we show how a series of concepts and methodological tools, already existing in UNFCCC decisions and IPCC guidance documents, may greatly help to deal with the uncertainties of the estimates of reduced emissions from deforestation.
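
    A schematic of how such a conservativeness adjustment works in practice: the estimated emission reduction is discounted by a factor that grows with the relative uncertainty of the inputs, so that credited reductions are unlikely to be overstated. The bracket table below is purely illustrative and is not taken from the paper or from IPCC guidance:

```python
def conservative_estimate(reduction, rel_uncertainty):
    """Discount an estimated emission reduction according to its relative
    uncertainty (half-width of the 95% CI divided by the mean).
    The uncertainty-to-discount brackets are illustrative only."""
    brackets = [(0.10, 0.00),   # <=10% uncertainty: no discount
                (0.30, 0.06),   # <=30%: 6% discount
                (0.50, 0.11),   # <=50%: 11% discount
                (1.00, 0.25)]   # <=100%: 25% discount
    for limit, discount in brackets:
        if rel_uncertainty <= limit:
            return reduction * (1 - discount)
    return 0.0   # too uncertain to credit at all

# 1.2 MtCO2 estimated reduction, known to within +/-40% (95% CI)
print(conservative_estimate(1.2e6, 0.40), "tCO2 creditable")
```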

  19. The Great Recession was not so Great

    NARCIS (Netherlands)

    van Ours, J.C.

    2015-01-01

    The Great Recession is characterized by a GDP-decline that was unprecedented in the past decades. This paper discusses the implications of the Great Recession analyzing labor market data from 20 OECD countries. Comparing the Great Recession with the 1980s recession it is concluded that there is a

  20. The Uncertainty Multiplier and Business Cycles

    OpenAIRE

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  1. Neutronic characteristics simulation of LMFBR of great size

    International Nuclear Information System (INIS)

    Kim, Y.C.

    1987-09-01

    The CONRAD experimental program will be executed on the MASURCA critical mockup in Cadarache and will use the entire European plutonium stock. The objectives of this program are to reduce the uncertainties on important design parameters, such as the reactivity worth of control rods and the flux distribution, to validate the calculation methods and data to be used in the design of new LMFBR concepts (for example, axially heterogeneous cores), and to resolve the neutronic control problems of a large LMFBR. The present study has made it possible to define this program and its physical characteristics [fr]

  2. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 01: On the use of proton radiography to reduce beam range uncertainties and improve patient positioning accuracy in proton therapy

    Energy Technology Data Exchange (ETDEWEB)

    Collins-Fekete, Charles-Antoine; Beaulieu, Luc; Seco, Joao [Université Laval/ CHU de Québec, Université Laval/CHU de Québec, Massachussetts General Hospital/ Harvard Medical School (United States)

    2016-08-15

    To present two related developments of proton radiography (pRad) to minimize range uncertainty in proton therapy. The first combines a pRad with an X-ray CT to produce a patient-specific relative stopping power (RSP) map. The second aims to improve the pRad spatial resolution for accurate registration prior to the first. The enhanced pRad can also be used in a novel proton-CT (pCT) reconstruction algorithm. Monte Carlo pRad images were computed for three phantoms: the Gammex, the Catphan and an anthropomorphic head. An optimized cubic-spline estimator derives the most likely path. The length crossed by the protons voxel by voxel was calculated by combining their estimated paths with the CT. The difference between the theoretical (length × RSP) and measured energy loss was minimized through a least-squares optimization (LSO) algorithm, yielding the RSP map. To increase the pRad spatial resolution for registration with the CT, the phantom was discretized into voxel columns, and the average RSP of each column was optimized to maximize the proton energy loss likelihood (MLE). Simulations showed precise RSP (<0.75%) for Gammex materials except low-density lung (<1.2%). For the head, accurate RSP values were obtained (µ = −0.10%, 1.5σ = 1.12%) and the range precision was improved (ΔR80 of −0.20 ± 0.35%). Spatial resolution was increased in pRad (2.75 to 6.71 lp/cm) and in pCT reconstructed from MLE-enhanced pRad (2.83 to 5.86 lp/cm). The LSO decreases the range uncertainty (R80 σ < 1.0%) while the MLE enhances the pRad spatial resolution (+244%), making it a great candidate for pCT reconstruction.
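
    The least-squares optimization (LSO) step described above is, at its core, a linear inverse problem: each proton's measured water-equivalent energy loss is the sum of the lengths it crosses in each voxel times the unknown voxel RSPs. A toy sketch with invented geometry and noise:

```python
import numpy as np

rng = np.random.default_rng(7)
n_voxels, n_protons = 20, 400

# L[i, j]: length (mm) crossed by proton i in voxel j, obtained in practice
# from the estimated most likely path combined with the CT grid
# (random here purely for illustration).
L = rng.uniform(0.0, 2.0, size=(n_protons, n_voxels))

rsp_true = rng.uniform(0.9, 1.6, size=n_voxels)        # unknown RSP map
wepl = L @ rsp_true + rng.normal(0.0, 0.5, n_protons)  # measured WEPL + noise

# Least-squares estimate of the RSP map from all proton histories
rsp_est, *_ = np.linalg.lstsq(L, wepl, rcond=None)
print(f"max relative RSP error: "
      f"{np.max(np.abs(rsp_est - rsp_true) / rsp_true):.2%}")
```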

  3. Bio-fuel - millions to be invested despite great uncertainty

    International Nuclear Information System (INIS)

    Beer, G.

    2005-01-01

    A directive passed in Brussels directing European Union (EU) members to replace traditional fuels has created problems for many countries, as they are not yet ready for bio-fuels. The directive anticipates that most EU citizens will no longer use pure petrol or diesel as of next year. Most refineries and petrol stations will have to sell a mixture of petrol and alcohol, or diesel and MERO. From 2007, bio-components should comprise up to 5.75% of the energy content of diesel and petrol. The content of the bio-components should be gradually increased to reach this figure; by the end of this year the required level will be 2%. For EU members, bio-fuels will create major problems and few advantages. Their share of car fuels will still be too low to have a major environmental effect or to decrease dependency on oil imports. Reaching the prescribed percentage of bio-components in fuels will be expensive for the state. Exact figures are not yet available, but according to the National Program of Bio-Fuel Development this process will cost Slovakia over 500 mil. Slovak crowns (Sk) (13.158 mil. Eur) in 2007, and by 2010 total state budget contributions will double. EC Directive 2003/30/EC creates business opportunities for certain business groups, but to benefit from this development they will have to act fast. In 2010, 29,000 ha of maize and a greater acreage of grain will be needed for the production of the required volumes of bio-ethanol, so farmers have a chance to benefit from this situation. But farmers still do not have a clear view of what their cooperation with refineries will be like. In Slovakia, bio-alcohol will be produced from maize or grain. Its price is currently around 100 euro (4 000 Sk) per ton. To produce 1 ton of alcohol, 3 tons of grain are needed. A faster solution for Slovakia could be mixing diesel with MERO, as sufficient production capacity already exists in this area; currently a part of production is exported to Germany, according to the head of Palma-Tumys, I. Beblavy. In his opinion, the current production volume of MERO is sufficient to cover Slovakia's bio-fuel requirements.

  4. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  5. Ecosystem Services Mapping Uncertainty Assessment: A Case Study in the Fitzroy Basin Mining Region

    Directory of Open Access Journals (Sweden)

    Zhenyu Wang

    2018-01-01

    Ecosystem services mapping is becoming increasingly popular through the use of various readily available mapping tools; however, uncertainties in assessment outputs are commonly ignored. Uncertainties from different sources have the potential to lower the accuracy of mapping outputs and reduce their reliability for decision-making. Using a case study in an Australian mining region, this paper assessed the impact of uncertainties on the modelling of the hydrological ecosystem service, water provision. Three types of uncertainty were modelled using multiple uncertainty scenarios: (1) spatial data sources; (2) modelling scales (temporal and spatial); and (3) parameterization and model selection. We found that the mapping scales can induce significant changes to the spatial pattern of outputs and annual totals of water provision. In addition, differences in parameterization using differing sources from the literature also led to obvious differences in base flow. However, the impact of the uncertainties associated with differences in spatial data sources was not as great. The results of this study demonstrate the importance of uncertainty assessment and highlight that any conclusions drawn from ecosystem services mapping, such as the impacts of mining, are likely to also be a property of the uncertainty in ecosystem services mapping methods.

  6. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
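
    The ingredient that makes DUA cheap, reuse of derivatives to propagate input distributions, is easiest to see in the first-order (delta-method) approximation. A sketch using a trivially simple stand-in for the borehole-flow model (the function, derivatives and numbers are invented), with a Monte Carlo check:

```python
import numpy as np

# A response y = f(x) and its analytic partial derivatives; think of f as
# the borehole flow-rate model (this surrogate is purely illustrative).
def f(k, h, mu):
    return k * h / mu

def grad_f(k, h, mu):
    return np.array([h / mu, k / mu, -k * h / mu**2])

# Input means and standard deviations (independent inputs assumed)
mean = np.array([2.0, 5.0, 1.3])
sd   = np.array([0.2, 0.5, 0.05])

g = grad_f(*mean)
y0 = f(*mean)
var_y = np.sum((g * sd)**2)          # first-order variance propagation
print(f"y = {y0:.3f} +/- {np.sqrt(var_y):.3f} (1 s.d., first order)")

# Compare with a brute-force Monte Carlo estimate
rng = np.random.default_rng(0)
samples = rng.normal(mean, sd, size=(200_000, 3))
y_mc = f(samples[:, 0], samples[:, 1], samples[:, 2])
print(f"Monte Carlo: {y_mc.mean():.3f} +/- {y_mc.std():.3f}")
```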

  7. Reducing Uncertainties in the Production of the Gamma-emitting Nuclei ²⁶Al, ⁴⁴Ti, and ⁶⁰Fe in Core-collapse Supernovae by Using Effective Helium Burning Rates

    Energy Technology Data Exchange (ETDEWEB)

    Austin, Sam M. [National Superconducting Cyclotron Laboratory, Michigan State University, 640 South Shaw Lane, East Lansing, MI 48824-1321 (United States); West, Christopher; Heger, Alexander, E-mail: austin@nscl.msu.edu, E-mail: christopher.west@metrostate.edu, E-mail: Alexander.Heger@Monash.edu [Joint Institute for Nuclear Astrophysics—Center for the Evolution of the Elements, Michigan State University, East Lansing, MI 48824-1321 (United States)

    2017-04-10

    We have used effective reaction rates (ERRs) for the helium burning reactions to predict the yield of the gamma-emitting nuclei ²⁶Al, ⁴⁴Ti, and ⁶⁰Fe in core-collapse supernovae (SNe). The variations in the predicted yields for values of the reaction rates allowed by the ERR are much smaller than obtained previously, and smaller than other uncertainties. A “filter” for SN nucleosynthesis yields based on pre-SN structure was used to estimate the effect of failed SNe on the initial mass function averaged yields; this substantially reduced the yields of all these isotopes, but the predicted yield ratio ⁶⁰Fe/²⁶Al was little affected. The robustness of this ratio is promising for comparison with data, but it is larger than observed in nature; possible causes for this discrepancy are discussed.

  8. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, the fracture toughness (K_Ic) at various temperatures is necessary. In a best-estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, which is lack of perfect knowledge about the subject under study, as a matter of practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is related to uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. Distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', single value of the parameters), but the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with K_Ic has been provided. (authors)
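
    The separation described above is commonly implemented as a two-loop (nested) Monte Carlo: an outer loop samples the epistemic parameters of the K_Ic model, and an inner loop samples the aleatory material scatter given those parameters, so the output is a distribution of conditional failure probabilities rather than a single number. A schematic sketch with invented distributions:

```python
import numpy as np

rng = np.random.default_rng(3)
n_epistemic, n_aleatory = 200, 2000

failures = []
for _ in range(n_epistemic):
    # Epistemic loop: uncertain median toughness and scatter of a K_Ic
    # model (a lognormal stand-in; a real curve is temperature dependent).
    median_kic = rng.normal(120.0, 10.0)       # MPa*sqrt(m)
    sigma_log = rng.uniform(0.10, 0.20)

    # Aleatory loop: material scatter given those parameters, compared
    # with a randomly varying applied stress intensity factor.
    kic = rng.lognormal(np.log(median_kic), sigma_log, n_aleatory)
    k_applied = rng.normal(70.0, 8.0, n_aleatory)
    failures.append(np.mean(k_applied > kic))  # conditional failure prob

failures = np.array(failures)
print(f"failure probability: median {np.median(failures):.2e}, "
      f"95th percentile {np.percentile(failures, 95):.2e}")
```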

  9. BN-600 MOX Core Benchmark Analysis. Results from Phases 4 and 6 of a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects

    International Nuclear Information System (INIS)

    2013-12-01

    For those Member States that have or have had significant fast reactor development programmes, it is of utmost importance that they have validated up to date codes and methods for fast reactor physics analysis in support of R and D and core design activities in the area of actinide utilization and incineration. In particular, some Member States have recently focused on fast reactor systems for minor actinide transmutation and on cores optimized for consuming rather than breeding plutonium; the physics of the breeder reactor cycle having already been widely investigated. Plutonium burning systems may have an important role in managing plutonium stocks until the time when major programmes of self-sufficient fast breeder reactors are established. For assessing the safety of these systems, it is important to determine the prediction accuracy of transient simulations and their associated reactivity coefficients. In response to Member States' expressed interest, the IAEA sponsored a coordinated research project (CRP) on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. The CRP started in November 1999 and, at the first meeting, the members of the CRP endorsed a benchmark on the BN-600 hybrid core for consideration in its first studies. Benchmark analyses of the BN-600 hybrid core were performed during the first three phases of the CRP, investigating different nuclear data and levels of approximation in the calculation of safety related reactivity effects and their influence on uncertainties in transient analysis prediction. In an additional phase of the benchmark studies, experimental data were used for the verification and validation of nuclear data libraries and methods in support of the previous three phases. The results of phases 1, 2, 3 and 5 of the CRP are reported in IAEA-TECDOC-1623, BN-600 Hybrid Core Benchmark Analyses, Results from a Coordinated Research Project on Updated Codes and Methods to Reduce the

  10. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss-of-mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete since sensitivity to uncertainty changes is not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  11. Great Lakes Science Center

    Data.gov (United States)

    Federal Laboratory Consortium — Since 1927, Great Lakes Science Center (GLSC) research has provided critical information for the sound management of Great Lakes fish populations and other important...

  12. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratories) can know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards; it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself.
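
    For uncorrelated steps, the combined standard uncertainty of a traceability chain is obtained by adding the individual standard uncertainties in quadrature, and the expanded uncertainty applies a coverage factor. A minimal sketch with invented relative uncertainties for each step:

```python
import math

# Standard uncertainties contributed by each step of a traceability chain,
# expressed as relative values (all numbers are illustrative only).
steps = {
    "primary reference material":    0.004,
    "reference measurement proc.":   0.008,
    "manufacturer working calib.":   0.010,
    "end-user measuring system":     0.015,
}

# Quadrature sum assumes the steps are uncorrelated
u_combined = math.sqrt(sum(u**2 for u in steps.values()))
U_expanded = 2 * u_combined   # coverage factor k = 2 (~95% coverage)
print(f"combined relative standard uncertainty: {u_combined:.3%}")
print(f"expanded uncertainty (k=2): {U_expanded:.3%}")
```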

  13. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself...

  14. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns; they are estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties.

  15. Proceedings of a workshop on dealing with uncertainties in the hydroelectric energy business. CD-ROM ed.

    International Nuclear Information System (INIS)

    2004-01-01

    This workshop was attended by experts in Canadian and international hydroelectric utilities to exchange information on current practices and opportunities for improvement or future cooperation. The discussions focused on reducing the uncertainties associated with hydroelectric power production. Although significant improvements have been made in the efficiency, reliability and safety of hydroelectric power production, the sector is still challenged by the uncertainty of water supply which depends greatly on weather conditions. Energy markets pose another challenge to power producers in terms of energy supply, energy demand and energy prices. The workshop focused on 3 themes: (1) weather and hydrologic uncertainty, (2) market uncertainty, and (3) decision making models using uncertainty principles surrounding water resource planning and operation. The workshop featured 22 presentations of which 11 have been indexed separately for inclusion in this database. refs., tabs., figs

  16. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) have the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of the uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamics and kinematics models are built. The different factors that introduce uncertainties during model building are then analyzed and summarized. After that, the model uncertainties are expressed according to the additive uncertainty model, choosing the maximum singular value of the uncertainty matrix as the boundary model and selecting the norm of the uncertainty matrix to quantify how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).
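
    Using the maximum singular value of the additive uncertainty matrix as the boundary model amounts to computing the induced 2-norm of Delta = A_real - A_nominal. A quick sketch (both matrices are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
A_nominal = rng.normal(size=(6, 6))
# Perturbed "real" dynamics: nominal plus unknown modelling errors
A_real = A_nominal + 0.05 * rng.normal(size=(6, 6))

delta = A_real - A_nominal                  # additive uncertainty model
# Largest singular value = induced 2-norm of the perturbation
sigma_max = np.linalg.svd(delta, compute_uv=False)[0]
print(f"uncertainty bound ||Delta||_2 = {sigma_max:.4f}")
# A robust controller would then be designed to tolerate any perturbation
# with induced 2-norm up to sigma_max (small-gain style reasoning).
```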

  17. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  18. Reducing the uncertainties in particle therapy

    Energy Technology Data Exchange (ETDEWEB)

    Oancea, C., E-mail: oancea@jinr.ru [Laboratory of Nuclear Problems, Joint Institute for Nuclear Research, 6 Joliot-Curie Street, 141980, Dubna, Moscow Region, Russia and Faculty of Physics, University of Bucharest, Atomistilor Street 405, 077125 Bucharest-Magurele, Ilfov Region (Romania); Shipulin, K. N.; Mytsin, G. V.; Luchin, Y. I. [Laboratory of Nuclear Problems, Joint Institute for Nuclear Research, 6 Joliot-Curie Street, 141980, Dubna, Moscow Region (Russian Federation)

    2015-02-24

    The use of fundamental Nuclear Physics in Nuclear Medicine has a significant impact in the fight against cancer. Hadrontherapy is an innovative cancer radiotherapy method using nuclear particles (protons, neutrons and ions) for the treatment of early and advanced tumors. The main goal of proton therapy is to deliver high radiation doses to the tumor volume with minimal damage to healthy tissues and organs. The purpose of this work was to investigate the dosimetric errors in clinical proton therapy dose calculation due to the presence of metallic implants in the treatment plan, and to determine the impact of the errors. The results indicate that the errors introduced by the treatment planning systems are higher than 10% in the prediction of the dose at isocenter when the proton beam is passing directly through a metallic titanium alloy implant. In conclusion, we recommend that pencil-beam algorithms not be used when planning treatment for patients with titanium alloy implants, and to consider implementing methods to mitigate the effects of the implants.

  19. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  20. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte-Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: uncertainty in HETT is relatively small for early times and increases with transit times; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model

  1. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question of how to adopt an overall alternative attitude to uncertainty, one which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring of early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales, from adaptive management at the local scale to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  2. An advanced joint inversion system for CO2 storage modeling with large data sets for characterization and real-time monitoring - enhancing storage performance and reducing failure risks under uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kitanidis, Peter [Stanford Univ., CA (United States)

    2016-04-30

    As large-scale, commercial storage projects become operational, the problem of utilizing information from diverse sources becomes more critically important. In this project, we developed, tested, and applied an advanced joint data inversion system for CO2 storage modeling with large data sets for use in site characterization and real-time monitoring. Emphasis was on the development of advanced and efficient computational algorithms for joint inversion of hydro-geophysical data, coupled with state-of-the-art forward process simulations. The developed system consists of (1) inversion tools using characterization data, such as 3D seismic survey (amplitude images), borehole log and core data, as well as hydraulic, tracer and thermal tests before CO2 injection, (2) joint inversion tools for updating the geologic model with the distribution of rock properties, thus reducing uncertainty, using hydro-geophysical monitoring data, and (3) highly efficient algorithms for directly solving the dense or sparse linear algebra systems derived from the joint inversion. The system combines methods from stochastic analysis, fast linear algebra, and high performance computing. The developed joint inversion tools have been tested through synthetic CO2 storage examples.
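
    The kind of linear(ized) Bayesian update underlying such joint inversion can be sketched compactly in Python; the forward operator, covariances, and dimensions below are an illustrative toy, not the project's actual system.

      import numpy as np

      rng = np.random.default_rng(1)

      # Illustrative linear forward model d = H m + noise for a 50-cell
      # rock-property field observed by 10 monitoring data.
      n_cells, n_data = 50, 10
      H = rng.standard_normal((n_data, n_cells)) / np.sqrt(n_cells)
      dist = np.abs(np.subtract.outer(np.arange(n_cells), np.arange(n_cells)))
      C_prior = np.exp(-dist / 10.0)        # exponential prior covariance
      C_noise = 0.01 * np.eye(n_data)

      m_true = rng.multivariate_normal(np.zeros(n_cells), C_prior)
      d_obs = H @ m_true + rng.multivariate_normal(np.zeros(n_data), C_noise)

      # Linear Bayesian (kriging-like) update: posterior mean and covariance.
      K = C_prior @ H.T @ np.linalg.inv(H @ C_prior @ H.T + C_noise)
      m_post = K @ d_obs
      C_post = C_prior - K @ H @ C_prior

      # Monitoring data reduce uncertainty: posterior variance < prior variance.
      print(f"mean prior var {np.trace(C_prior) / n_cells:.3f} -> "
            f"mean posterior var {np.trace(C_post) / n_cells:.3f}")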

  3. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents' preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  4. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in phenomena understanding. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pre-test and post-test uncertainty.

  5. Pandemic influenza: certain uncertainties

    Science.gov (United States)

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  6. Visualizing Summary Statistics and Uncertainty

    KAUST Repository

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  8. Section summary: Uncertainty and design considerations

    Science.gov (United States)

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  9. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  10. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
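
    Two of the propagation options named in the guide, Monte Carlo sampling and a first-order series approximation, can be contrasted in a small Python sketch; the model function and input distributions are illustrative assumptions.

      import numpy as np

      def model(x, y):
          # Toy computed quantity standing in for a program output.
          return x**2 * np.exp(0.1 * y)

      # Input uncertainties expressed as distributions (illustrative).
      rng = np.random.default_rng(2)
      x = rng.normal(2.0, 0.1, 100_000)
      y = rng.normal(5.0, 0.5, 100_000)

      # Monte Carlo propagation of random input uncertainty.
      z = model(x, y)
      print(f"Monte Carlo: mean={z.mean():.3f}, std={z.std():.3f}")

      # First-order series approximation (delta method) for comparison.
      dz_dx = 2 * 2.0 * np.exp(0.1 * 5.0)        # df/dx at the input means
      dz_dy = 0.1 * 2.0**2 * np.exp(0.1 * 5.0)   # df/dy at the input means
      u_series = np.hypot(dz_dx * 0.1, dz_dy * 0.5)
      print(f"series approximation: std ~ {u_series:.3f}")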

  12. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  13. Best Practices of Uncertainty Estimation for the National Solar Radiation Database (NSRDB 1998-2015): Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-12-19

    It is essential to apply a traceable and standard approach to determine the uncertainty of solar resource data. Solar resource data are used for all phases of solar energy conversion projects, from the conceptual phase to routine solar power plant operation, and to determine performance guarantees of solar energy conversion systems. These guarantees are based on the available solar resource derived from a measurement station or a modeled data set such as the National Solar Radiation Database (NSRDB). Therefore, quantifying the uncertainty of these data sets provides confidence to financiers, developers, and site operators of solar energy conversion systems and ultimately reduces deployment costs. In this study, we implemented the Guide to the Expression of Uncertainty in Measurement (GUM) to quantify the overall uncertainty of the NSRDB data. First, we start with quantifying measurement uncertainty; then we determine each uncertainty statistic of the NSRDB data, and we combine them using the root-sum-of-the-squares method. The statistics were derived by comparing the NSRDB data to the seven measurement stations from the National Oceanic and Atmospheric Administration's Surface Radiation Budget Network, the National Renewable Energy Laboratory's Solar Radiation Research Laboratory, and the Atmospheric Radiation Measurement program's Southern Great Plains Central Facility in Billings, Oklahoma. The evaluation was conducted for hourly values, daily totals, monthly mean daily totals, and annual mean monthly mean daily totals. Varying time averages help capture the temporal uncertainty of the specific modeled solar resource data required for each phase of a solar energy project; some phases require higher temporal resolution than others. Overall, by including the uncertainty of measurements of solar radiation made at ground stations, bias, and root mean square error, the NSRDB data demonstrated expanded uncertainty of 17 percent - 29 percent on hourly

  14. Time-Varying Uncertainty in Shock and Vibration Applications Using the Impulse Response

    Directory of Open Access Journals (Sweden)

    J.B. Weathers

    2012-01-01

    Design of mechanical systems often necessitates the use of dynamic simulations to calculate the displacements (and their derivatives) of the bodies in a system as a function of time in response to dynamic inputs. These types of simulations are especially prevalent in the shock and vibration community, where simulations associated with models having complex inputs are routine. If the forcing functions as well as the parameters used in these simulations are subject to uncertainties, then these uncertainties will propagate through the models, resulting in uncertainties in the outputs of interest. The uncertainty analysis procedure for these kinds of time-varying problems can be challenging, and in many instances explicit data reduction equations (DREs), i.e., analytical formulas, are not available because the outputs of interest are obtained from complex simulation software, e.g. FEA programs. Moreover, uncertainty propagation in systems modeled using nonlinear differential equations can prove difficult to analyze. However, if (1) the uncertainties propagate through the models in a linear manner, obeying the principle of superposition, then the complexity of the problem can be significantly simplified. If, in addition, (2) the uncertainty in the model parameters does not change during the simulation and the manner in which the outputs of interest respond to small perturbations in the external input forces is not dependent on when the perturbations are applied, then the number of calculations required can be greatly reduced. Conditions (1) and (2) characterize a Linear Time Invariant (LTI) uncertainty model. This paper seeks to explain one possible approach to obtain the uncertainty results based on these assumptions.
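
    A minimal Python sketch of the LTI reasoning described above: because superposition holds, the response to a perturbed force equals the nominal response plus the response to the perturbation alone, so the output uncertainty history follows from one extra convolution with the impulse response. The oscillator and force error below are illustrative assumptions.

      import numpy as np

      # Illustrative damped oscillator impulse response h(t).
      dt = 0.01
      t = np.arange(0.0, 10.0, dt)
      wn, zeta = 2 * np.pi, 0.05                 # natural frequency, damping
      wd = wn * np.sqrt(1 - zeta**2)
      h = np.exp(-zeta * wn * t) * np.sin(wd * t) / wd

      # Nominal forcing and a small perturbation representing input uncertainty.
      f_nom = np.sin(0.5 * t)
      df = 0.02 * np.ones_like(t)                # hypothetical force error

      # Superposition: response(f + df) = response(f) + response(df), so the
      # output uncertainty needs only one additional convolution.
      y_nom = np.convolve(h, f_nom)[: t.size] * dt
      dy = np.convolve(h, df)[: t.size] * dt     # output uncertainty history
      print(f"peak output uncertainty: {np.abs(dy).max():.4f}")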

  15. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    uncertainties in CL and EX estimates were found to be efficiently mitigated by reducing data uncertainty in the critical limit of the chemical criteria - the BC/Al ratio. The distributed CL and EX assessment on local level in Sweden was found to be efficiently improved by enhancing the resolution of the underlying vegetation map 68 refs, 15 figs, 3 tabs

  16. Great Lakes Literacy Principles

    Science.gov (United States)

    Fortner, Rosanne W.; Manzo, Lyndsey

    2011-03-01

    Lakes Superior, Huron, Michigan, Ontario, and Erie together form North America's Great Lakes, a region that contains 20% of the world's fresh surface water and is home to roughly one quarter of the U.S. population (Figure 1). Supporting a $4 billion sport fishing industry, plus $16 billion annually in boating, 1.5 million U.S. jobs, and $62 billion in annual wages directly, the Great Lakes form the backbone of a regional economy that is vital to the United States as a whole (see http://www.miseagrant.umich.edu/downloads/economy/11-708-Great-Lakes-Jobs.pdf). Yet the grandeur and importance of this freshwater resource are little understood, not only by people in the rest of the country but also by many in the region itself. To help address this lack of knowledge, the Centers for Ocean Sciences Education Excellence (COSEE) Great Lakes, supported by the U.S. National Science Foundation and the National Oceanic and Atmospheric Administration, developed literacy principles for the Great Lakes to serve as a guide for education of students and the public. These “Great Lakes Literacy Principles” represent an understanding of the Great Lakes' influences on society and society's influences on the Great Lakes.

  17. The Next Great Generation?

    Science.gov (United States)

    Brownstein, Andrew

    2000-01-01

    Discusses ideas from a new book, "Millennials Rising: The Next Great Generation," (by Neil Howe and William Strauss) suggesting that youth culture is on the cusp of a radical shift with the generation beginning with this year's college freshmen who are typically team oriented, optimistic, and poised for greatness on a global scale. Includes a…

  18. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  19. Uncertainty Quantification with Applications to Engineering Problems

    DEFF Research Database (Denmark)

    Bigoni, Daniele

    in measurements, predictions and manufacturing, and we can say that any dynamical system used in engineering is subject to some of these uncertainties. The first part of this work presents an overview of the mathematical framework used in Uncertainty Quantification (UQ) analysis and introduces the spectral tensor...... and thus the UQ analysis of the associated systems will benefit greatly from the application of methods which require few function evaluations. We first consider the propagation of the uncertainty and the sensitivity analysis of the non-linear dynamics of railway vehicles with suspension components whose......-scale problems, where efficient methods are necessary with today’s computational resources. The outcome of this work was also the creation of several freely available Python modules for Uncertainty Quantification, which are listed and described in the appendix....

  20. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  1. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  2. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty

  3. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation demanded by the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express with probabilistic distributions. In order to reduce the computation time and quantify the uncertainties of top events when there are basic events whose uncertainties are difficult to express with probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in relatively short time and cover the results obtained by probabilistic uncertainty propagation.
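
    One common way to implement such fuzzy propagation is interval arithmetic on alpha-cuts of triangular fuzzy probabilities pushed through AND/OR gates, as in the Python sketch below; the gate structure and fuzzy numbers are illustrative, not the paper's fault tree.

      def alpha_cut(tri, alpha):
          # Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha.
          a, m, b = tri
          return a + alpha * (m - a), b - alpha * (b - m)

      def and_gate(p, q):
          # AND gate: probabilities multiply (monotone, so endpoints suffice).
          return p[0] * q[0], p[1] * q[1]

      def or_gate(p, q):
          # OR gate: 1 - (1-p)(1-q), also monotone in each argument.
          return 1 - (1 - p[0]) * (1 - q[0]), 1 - (1 - p[1]) * (1 - q[1])

      # Illustrative basic-event fuzzy probabilities (low, mode, high).
      e1, e2, e3 = (1e-3, 2e-3, 4e-3), (5e-4, 1e-3, 2e-3), (1e-2, 2e-2, 3e-2)

      # Top event = (e1 OR e2) AND e3, evaluated at several membership levels.
      for alpha in (0.0, 0.5, 1.0):
          top = and_gate(or_gate(alpha_cut(e1, alpha), alpha_cut(e2, alpha)),
                         alpha_cut(e3, alpha))
          print(f"alpha={alpha:.1f}: top event in [{top[0]:.2e}, {top[1]:.2e}]")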

  4. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  5. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation, considerable efforts and resources are spent to reduce uncertainties as a part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this industry practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a regime could easily lead to conservative management that encourages the use of existing methods and tools, since new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but having no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes which extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability.

  6. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  7. [Dealing with diagnostic uncertainty in general practice].

    Science.gov (United States)

    Wübken, Magdalena; Oswald, Jana; Schneider, Antonius

    2013-01-01

    In general, the prevalence of diseases is low in primary care. Therefore, the positive predictive value of diagnostic tests is lower than in hospitals, where patients are highly selected. In addition, the patients present with milder forms of disease, and many diseases might hide behind the initial symptom(s). These facts lead to a diagnostic uncertainty which is somewhat inherent to general practice. This narrative review discusses different sources of and reasons for uncertainty, and strategies to deal with it in the context of the current literature. Fear of uncertainty correlates with higher diagnostic activity. The attitude towards uncertainty correlates with the choice of medical speciality by vocational trainees or medical students. An intolerance of uncertainty, which increases even as medicine makes steady progress, might partly explain the growing shortage of general practitioners. The bio-psycho-social context appears to be important to diagnostic decision-making. The effects of intuition and heuristics are investigated by cognitive psychologists. It is still unclear whether these aspects are prone to bias or useful, which might depend on the context of medical decisions. Good communication is of great importance to share uncertainty with the patients in a transparent way and to facilitate shared decision-making. Dealing with uncertainty should be seen as an important core component of general practice and needs to be investigated in more detail to improve the respective medical decisions. Copyright © 2013. Published by Elsevier GmbH.

  8. Great Indoors Awards 2007

    Index Scriptorium Estoniae

    2007-01-01

    In Maastricht, the Netherlands, the international prize The Great Indoors Award was presented for the first time on 17 November. Wonderwall, founded by Masamichi Katayama, was named interior design firm of the year. Awards also went to Zaha Hadid, Heatherwick Studio, Ryui Nakamura Architects and Item Idem.

  9. Great Lakes Bathymetry

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Bathymetry of Lakes Michigan, Erie, Saint Clair, Ontario and Huron has been compiled as a component of a NOAA project to rescue Great Lakes lake floor geological and...

  10. Prototyping and Testing a New Volumetric Curvature Tool for Modeling Reservoir Compartments and Leakage Pathways in the Arbuckle Saline Aquifer: Reducing Uncertainty in CO2 Storage and Permanence

    Energy Technology Data Exchange (ETDEWEB)

    Rush, Jason [Univ. of Kansas and Kansas Geological Survey, Lawrence, KS (United States); Holubnyak, Yevhen [Univ. of Kansas and Kansas Geological Survey, Lawrence, KS (United States); Watney, Willard [Univ. of Kansas and Kansas Geological Survey, Lawrence, KS (United States)

    2016-12-09

    This DOE-funded project evaluates the utility of seismic volumetric curvature (VC) for predicting stratal and structural architecture diagnostic of paleokarst reservoirs. Of special interest are applications geared toward carbon capture, utilization, and storage (CCUS). VC has been championed for identifying faults (offset <¼ λ) that cannot be imaged by conventional 3-D seismic attributes such as coherence. The objective of this research was to evaluate VC techniques for reducing uncertainties in reservoir compartmentalization studies and seal risk assessments, especially for saline aquifers. A 2000-ft horizontal lateral was purposefully drilled across VC-imaged lineaments (interpreted to record a fractured and a fault-bounded doline) to physically confirm their presence. The 15-mi² study area is located in southeastern Bemis-Shutts Field, which is situated along the crest of the Central Kansas Uplift (CKU) in Ellis County, Kansas. The uppermost Arbuckle (200+ ft) has extensive paleokarst, including collapsed paleocaverns and dolines related to exceedingly prolonged pre-Simpson (Sauk–Tippecanoe) and/or pre-Pennsylvanian subaerial exposure. A lateral borehole was successfully drilled across the full extent (~1100 ft) of a VC-inferred paleokarst doline. Triple combo (GR-neutron/density-resistivity), full-wave sonic, and borehole micro-imager logs were successfully run to TD on drill-pipe. Results from the formation evaluation reveal breccias (e.g., crackle, mosaic, chaotic), fractures, faults, vugs (1-6"), and unaffected host strata consistent with the pre-spud interpretation. Well-rounded pebbles were also observed on the image log. VC-inferred lineaments coincide with 20–80-ft wide intervals of high GR values (100+ API), matrix-rich breccias, and faults. To further demonstrate their utility, VC attributes are integrated into a geocellular modeling workflow: 1) to constrain the structural model; 2) to generate facies probability grids; and 3) to collocate

  11. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  12. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  13. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  14. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  15. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  16. Uncertainty vs. Information (Invited)

    Science.gov (United States)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty; instead it should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  17. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  18. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  19. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  20. Potential effects of organizational uncertainty on safety

    International Nuclear Information System (INIS)

    Durbin, N.E.; Lekberg, A.; Melber, B.D.

    2001-12-01

    When organizations face significant change - reorganization, mergers, acquisitions, down sizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in the organization - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations, particularly those using high risk technologies, which are facing significant change need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change - technical, economic, emotional, and productivity related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals

  2. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    Science.gov (United States)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the Numpy, Scipy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates. We will discuss
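
    In the spirit of the Numpy/Scipy.stats/pyDOE workflow named above, the following is a minimal sketch of Latin Hypercube propagation of water-budget uncertainty into the SWGW residual; all distributions and magnitudes are illustrative assumptions, not values from the Floral City watershed.

      import numpy as np
      from pyDOE import lhs
      from scipy import stats

      n = 5000
      # Latin Hypercube design in [0, 1]^4, one column per uncertain term.
      design = lhs(4, samples=n)

      # Map uniform scores to illustrative distributions (units: mm/month).
      rain = stats.norm(120, 10).ppf(design[:, 0])   # rainfall + gauge error
      et = stats.norm(90, 20).ppf(design[:, 1])      # land-cover-based ET
      canal = stats.norm(15, 3).ppf(design[:, 2])    # canal discharge
      dstore = stats.norm(5, 8).ppf(design[:, 3])    # storage change

      # SWGW exchange as the budget residual; its spread is the uncertainty.
      swgw = rain - et - canal - dstore
      lo, hi = np.percentile(swgw, [2.5, 97.5])
      print(f"SWGW exchange: median {np.median(swgw):.1f} mm/month, "
            f"95% interval [{lo:.1f}, {hi:.1f}]")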

  3. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using....... The comparison of model prediction uncertainties with reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column......, the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce most critical uncertainties....

  4. The GREAT3 challenge

    International Nuclear Information System (INIS)

    Miyatake, H; Mandelbaum, R; Rowe, B

    2014-01-01

The GRavitational lEnsing Accuracy Testing 3 (GREAT3) challenge is an image analysis competition that aims to test algorithms to measure weak gravitational lensing from astronomical images. The challenge started in October 2013 and ends 30 April 2014. The challenge focuses on testing the impact on weak lensing measurements of realistically complex galaxy morphologies, realistic point spread functions, and the combination of multiple different exposures. It includes simulated ground- and space-based data. The details of the challenge are described in [1], and the challenge website and its leader board can be found at http://great3challenge.info and http://great3.projects.phys.ucl.ac.uk/leaderboard/, respectively.

  5. Accounting for uncertainty in marine reserve design.

    Science.gov (United States)

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds spacing rules similar to those proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.

  6. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
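    As a rough illustration of the propagation step described above, the Python uncertainties package can carry a fit's parameter covariance through subsequent arithmetic. The modified-tanh profile shape and all parameter values below are invented for the sketch, not OMFIT's actual fit output:

```python
# A minimal sketch of correlated-uncertainty propagation with the Python
# "uncertainties" package; values and covariance are illustrative only.
import numpy as np
from uncertainties import correlated_values, unumpy

# Hypothetical fit parameters (height, width) with a covariance matrix,
# as a least-squares fit of a modified-tanh profile might return.
params = [1.0, 0.1]
cov = [[0.010, 0.002],
       [0.002, 0.0004]]
height, width = correlated_values(params, cov)

# Evaluate the profile; uncertainties propagate automatically,
# including the correlation between the two fit parameters.
x = np.linspace(0.8, 1.0, 5)
profile = height * unumpy.tanh((1.0 - x) / width)

for xi, pi in zip(x, profile):
    print(f"x = {xi:.2f}: value = {pi:.3f}")
```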

  7. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....

  8. Robust nonlinear control of nuclear reactors under model uncertainty

    International Nuclear Information System (INIS)

    Park, Moon Ghu

    1993-02-01

A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law which covers a wide operating range. Robustness is a crucial factor for the fully automatic control of reactor power because of time-varying, uncertain parameters, state estimation error, and unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design which has inherent robustness to parameter variations or external disturbances using the known uncertainty bounds, and it requires very low computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to the neglected actuator time delay or sampling effects. The problem was partially remedied by replacing chattering control by a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But for nuclear reactor systems, which have a very fast dynamic response, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Because of the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of the tracking error dynamics is guaranteed by the second method of Lyapunov using the two-level uncertainty bounds that are obtained from the knowledge of the uncertainty bound and the estimated

  9. Nothing Great Is Easy

    OpenAIRE

    Stansbie, Lisa

    2014-01-01

A solo exhibition of 13 pieces of art work. Nothing Great is Easy is an exhibition of sculpture, film, drawing and photography that proposes reconstructed narratives using the sport of swimming and in particular the collective interaction and identity of the channel swimmer. The work utilises the processes, rituals/rules, language and the apparatus of sport. "Nothing great is easy" are the words on the memorial to Captain Matthew Webb who was the first man to swim the English ch...

  10. Statistical analysis of the uncertainty related to flood hazard appraisal

    Science.gov (United States)

    Notaro, Vincenza; Freni, Gabriele

    2015-12-01

The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice. It provides the evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency can cause flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose to identify a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and to assess the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis; therefore, hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be related to the hazard evaluation, due to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.

  11. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    , and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing...... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between...

  12. An Innovative Approach to Effective Climate Science Application through Stakeholder Participation in Great Plains Grasslands

    Science.gov (United States)

    Athearn, N.; Broska, J.

    2015-12-01

For natural resource managers and other Great Plains stakeholders, climate uncertainties further confound decision-making on a highly altered landscape. Partner organizations comprising the Great Plains Landscape Conservation Cooperative (GPLCC) acknowledge climate change as a high-priority threat to grasslands and associated habitats, affecting water availability, species composition, and other factors. Despite its importance, incorporation of climate change impacts into planning is hindered by high uncertainty and lack of translation to a tangible outcome: effects on species and their habitats. In 2014, the GPLCC initiated a Landscape Conservation Design (LCD) process to ultimately improve the size and connectivity of grasslands - informing land managers of the landscape-scale impacts of local decisions about where to restore, enhance, protect, and develop lands. Defining this goal helped stakeholders envision a tangible product. High-resolution land cover data recently completed for Texas and Oklahoma represent current grassland locations. By using climate change models to project changes in these land cover datasets, the resulting land cover projections can be directly incorporated into LCD-based models to focus restoration where future climates will support grasslands. Broad organizational cooperation has been critical for this USGS-led project, which uses downscaled climate data and other support from the South Central Climate Science Center Consortium and builds on existing work including LCD efforts of the Playa Lakes Joint Venture and the Bureau of Land Management's Southern Great Plains Rapid Ecological Assessment. Ongoing stakeholder guidance through an advisory team ensures effective application of a product that will be both relevant to and understood by decision makers, for whom the primary role of research is to reduce uncertainties and clear the path for more efficient decision-making in the face of climatic uncertainty.

  13. The Great Mathematician Project

    Science.gov (United States)

    Goldberg, Sabrina R.

    2013-01-01

    The Great Mathematician Project (GMP) introduces both mathematically sophisticated and struggling students to the history of mathematics. The rationale for the GMP is twofold: first, mathematics is a uniquely people-centered discipline that is used to make sense of the world; and second, students often express curiosity about the history of…

  14. Uncertainties associated with inertial-fusion ignition

    International Nuclear Information System (INIS)

    McCall, G.H.

    1981-01-01

An estimate is made of a worst-case driving energy, derived from analytic and computer calculations. It will be shown that the uncertainty can be reduced by a factor of 10 to 100 if certain physical effects are understood. That is not to say that the energy requirement can necessarily be reduced below that of the worst case, but it is possible to reduce the uncertainty associated with ignition energy. With laser costs in the $0.5 to 1 billion per MJ range, it can be seen that such an exercise is worthwhile.

  15. What great managers do.

    Science.gov (United States)

    Buckingham, Marcus

    2005-03-01

    Much has been written about the qualities that make a great manager, but most of the literature overlooks a fundamental question: What does a great manager actually do? While there are countless management styles, one thing underpins the behavior of all great managers. Above all, an exceptional manager comes to know and value the particular quirks and abilities of her employees. She figures out how to capitalize on her staffers' strengths and tweaks her environment to meet her larger goals. Such a specialized approach may seem like a lot of work. But in fact, capitalizing on each person's uniqueness can save time. Rather than encourage employees to conform to strict job descriptions that may include tasks they don't enjoy and aren't good at, a manager who develops positions for his staff members based on their unique abilities will be rewarded with behaviors that are far more efficient and effective than they would be otherwise. This focus on individuals also makes employees more accountable. Because staffers are evaluated on their particular strengths and weaknesses, they are challenged to take responsibility for their abilities and to hone them. Capitalizing on a person's uniqueness also builds a stronger sense of team. By taking the time to understand what makes each employee tick, a great manager shows that he sees his people for who they are. This personal investment not only motivates individuals but also galvanizes the entire team. Finally, this approach shakes up existing hierarchies, which leads to more creative thinking. To take great managing from theory to practice, the author says, you must know three things about a person: her strengths, the triggers that activate those strengths, and how she learns. By asking the right questions, squeezing the right triggers, and becoming aware of your employees' learning styles, you will discover what motivates each person to excel.

  16. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
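    A variance-ratio importance indicator of the kind described can be sketched without any linearity assumption. The toy model and the simple binning estimator below are illustrative, not McKay's exact replicated-LHS procedure:

```python
# A minimal sketch: estimate the importance indicator
# R_i = Var(E[Y | X_i]) / Var(Y) by binning Monte Carlo samples on each
# input. Model and input distributions are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.uniform(-1.0, 1.0, n)

# Nonlinear test model: no linearity assumption is needed.
y = np.sin(x1) + 0.3 * x2**2

def variance_ratio(x, y, bins=50):
    """Fraction of Var(y) explained by E[y | x], estimated by binning x."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    return np.sum(counts * (cond_means - y.mean())**2) / (y.size * y.var())

print("R(x1) =", round(variance_ratio(x1, y), 3))
print("R(x2) =", round(variance_ratio(x2, y), 3))
```

    Replicated Latin hypercube designs refine this idea by re-using the same input strata across replicates, which helps separate sampling noise from genuine input importance.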

  17. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs

  18. Uncertainties and climatic change

    International Nuclear Information System (INIS)

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

Which processes in the climate system are misunderstood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published synthesis report of the IPCC? These and other questions were answered during the meeting 'Uncertainties and climate change' that was held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions resulting from the discussions during this meeting. [mk] [nl]

  19. Mechanics and uncertainty

    CERN Document Server

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  20. Uncertainty: lotteries and risk

    OpenAIRE

    Ávalos, Eloy

    2011-01-01

In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences and their representation through the von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, measures of risk aversion with monetary lotteries.

  1. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

The results are presented of a neutron cross-section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy-integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy-integrated flux at the beginning of the cryostat in the no-gap geometry, compared to an uncertainty of only 5% in the gap geometry. Therefore, it is essential to take the exact geometry into account in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  2. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

Highlights: Two types of uncertainty methods for k_eff Monte Carlo computations are examined. The sampling method places the fewest restrictions on the perturbation but demands the most computing resources. The analytical method is limited to small perturbations of material properties. Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing this margin has made uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
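    To make the sampling-based idea concrete, a toy example can propagate nuclear-data uncertainty into a multiplication factor by repeated evaluation; a real analysis would rerun a Monte Carlo transport code for each sampled nuclear-data set. All values below are illustrative one-group numbers, not benchmark data:

```python
# A toy sketch of sampling-based k_eff uncertainty analysis: sample the
# uncertain nuclear data, evaluate k for each sample, and summarize the
# spread. The infinite-medium expression stands in for a transport run.
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Hypothetical one-group data with assumed standard deviations
nu = rng.normal(2.43, 0.01, n)        # neutrons per fission
sigma_f = rng.normal(1.20, 0.02, n)   # fission cross section (barns)
sigma_a = rng.normal(2.10, 0.03, n)   # absorption cross section (barns)

# k for an infinite homogeneous medium
k_inf = nu * sigma_f / sigma_a

print(f"k_inf = {k_inf.mean():.4f} +/- {k_inf.std():.4f}")
```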

  3. Justification for recommended uncertainties

    International Nuclear Information System (INIS)

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized only by percentage uncertainties or variances. Covariances between the evaluated value at 0.2 MeV and other points obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6Li(n,t) TEST1 data and the correlation coefficients are presented, and covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix) as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the 7Li system are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6Li(n,t) reaction and for the 235U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6Li(n,t), 6Li(n,n) and 6Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and EDA and RAC R-matrix results, respectively. Uncertainties of absolute and 252Cf fission-spectrum-averaged cross-section measurements, and deviations between measured and evaluated values for 235U(n,f) cross sections in the neutron energy range 1

  4. Great magnetic storms

    International Nuclear Information System (INIS)

    Tsurutani, B.T.; Yen Te Lee; Tang, F.; Gonzalez, W.D.

    1992-01-01

The five largest magnetic storms that occurred between 1971 and 1986 are studied to determine their solar and interplanetary causes. All of the events are found to be associated with high speed solar wind streams led by collisionless shocks. The high speed streams are clearly related to identifiable solar flares. It is found that (1) it is the extreme values of the southward interplanetary magnetic fields rather than solar wind speeds that are the primary causes of great magnetic storms, (2) shocked and draped sheath fields preceding the driver gas (magnetic cloud) are at least as effective in causing the onset of great magnetic storms (3 of 5 events) as the strong fields within the driver gas itself, and (3) precursor southward fields ahead of the high speed streams allow the shock compression mechanism (item 2) to be particularly geoeffective

  5. The great intimidators.

    Science.gov (United States)

    Kramer, Roderick M

    2006-02-01

After Disney's Michael Eisner, Miramax's Harvey Weinstein, and Hewlett-Packard's Carly Fiorina fell from their heights of power, the business media quickly proclaimed that the reign of abrasive, intimidating leaders was over. However, it's premature to proclaim their extinction. Many great intimidators have done fine for a long time and continue to thrive. Their modus operandi runs counter to a lot of preconceptions about what it takes to be a good leader. They're rough, loud, and in your face. Their tactics include invading others' personal space, staging tantrums, keeping people guessing, and possessing an indisputable command of facts. But make no mistake--great intimidators are not your typical bullies. They're driven by vision, not by sheer ego or malice. Beneath their tough exteriors and sharp edges are some genuine, deep insights into human motivation and organizational behavior. Indeed, these leaders possess political intelligence, which can make the difference between paralysis and successful--if sometimes wrenching--organizational change. Like socially intelligent leaders, politically intelligent leaders are adept at sizing up others, but they notice different things. Those with social intelligence assess people's strengths and figure out how to leverage them; those with political intelligence exploit people's weaknesses and insecurities. Despite all the obvious drawbacks of working under them, great intimidators often attract the best and brightest. And their appeal goes beyond their ability to inspire high performance. Many accomplished professionals who gravitate toward these leaders want to cultivate a little "inner intimidator" of their own. In the author's research, quite a few individuals reported having positive relationships with intimidating leaders. In fact, some described these relationships as profoundly educational and even transformational. So before we throw out all the great intimidators, the author argues, we should stop to consider what

  6. Great Lakes Energy Institute

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, J. Iwan [Case Western Reserve Univ., Cleveland, OH (United States)

    2012-11-18

    The vision of the Great Lakes Energy Institute is to enable the transition to advanced, sustainable energy generation, storage, distribution and utilization through coordinated research, development, and education. The Institute will place emphasis on translating leading edge research into next generation energy technology. The Institute’s research thrusts focus on coordinated research in decentralized power generation devices (e.g. fuel cells, wind turbines, solar photovoltaic devices), management of electrical power transmission and distribution, energy storage, and energy efficiency.

  7. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.
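    A drastically simplified sketch of a hierarchical trend model (two levels rather than the paper's four, and fully parametric rather than semi-parametric) can be written in PyMC; the data are simulated and all priors are invented for illustration:

```python
# A highly simplified sketch, not the authors' model: reef-level linear
# trends in coral cover are partially pooled toward a regional trend.
# Requires PyMC >= 5; data below are simulated.
import numpy as np
import pymc as pm

rng = np.random.default_rng(7)
n_reefs, n_years = 8, 10
years = np.tile(np.arange(n_years), n_reefs)
reef = np.repeat(np.arange(n_reefs), n_years)
true_slopes = rng.normal(-0.5, 0.3, n_reefs)
cover = 40 + true_slopes[reef] * years + rng.normal(0, 2, n_reefs * n_years)

with pm.Model():
    mu_slope = pm.Normal("mu_slope", 0, 5)      # regional trend
    sd_slope = pm.HalfNormal("sd_slope", 2)     # reef-to-reef variability
    slope = pm.Normal("slope", mu_slope, sd_slope, shape=n_reefs)
    intercept = pm.Normal("intercept", 40, 10)
    sigma = pm.HalfNormal("sigma", 5)           # residual variability
    pm.Normal("obs", intercept + slope[reef] * years, sigma, observed=cover)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=0,
                      progressbar=False)

print("regional slope:", float(idata.posterior["mu_slope"].mean()))
```

    Partial pooling of the reef-level slopes toward a regional mean is what lets a model of this kind report uncertainty at both scales simultaneously.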

  8. Medical Need, Equality, and Uncertainty.

    Science.gov (United States)

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. © 2016 John Wiley & Sons Ltd.

  9. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions - whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessment using decision analytic techniques as a foundation is key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of the timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  10. Decisions under uncertainty using Bayesian analysis

    Directory of Open Access Journals (Sweden)

    Stelian STANCU

    2006-01-01

The present paper gives a short presentation of the Bayesian decision method, in which extra information lends great support to the decision-making process but also incurs new costs. In this situation, obtaining new information, generally experimentally based, helps diminish the degree of uncertainty that influences the decision-making process. In conclusion, in a large number of decision problems, decision makers may revisit decisions already taken because of the possibilities offered by obtaining extra information.

  11. Projected uranium measurement uncertainties for the Gas Centrifuge Enrichment Plant

    International Nuclear Information System (INIS)

    Younkin, J.M.

    1979-02-01

An analysis was made of the uncertainties associated with the measurements of the declared uranium streams in the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). The total uncertainty for the GCEP is projected to be from 54 to 108 kg 235U/year out of a measured total of 200,000 kg 235U/year. The systematic component of uncertainty of the UF6 streams is the largest and the dominant contributor to the total uncertainty. A possible scheme for reducing the total uncertainty is given
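    The roll-up of stream-level measurement errors into a plant-level uncertainty can be sketched as a quadrature sum of random and systematic components per stream. The throughputs and relative errors below are invented so that the total lands near the reported range; they are not the GCEP figures:

```python
# A minimal sketch of standard error propagation for declared material
# streams; all numbers are illustrative placeholders.
import numpy as np

# stream: (net 235U throughput [kg/yr], random rel. error, systematic rel. error)
streams = {
    "UF6 feed": (200_000.0, 0.0002, 0.0003),
    "product":  (-60_000.0, 0.0003, 0.0004),
    "tails":    (-140_000.0, 0.0003, 0.0004),
}

var_random = 0.0
var_systematic = 0.0
for mass, rel_rand, rel_sys in streams.values():
    var_random += (mass * rel_rand) ** 2     # random errors add in quadrature
    var_systematic += (mass * rel_sys) ** 2  # systematic errors bias whole streams

total = np.sqrt(var_random + var_systematic)
print(f"projected total uncertainty ~ {total:.0f} kg 235U/yr (1 sigma)")
```

    A sketch like this also makes the abstract's point visible: shrinking the systematic terms does far more for the total than adding repeat measurements, which only reduces the random terms.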

  12. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  13. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity market. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the investment uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties. This includes the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol relaxed during drought could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  14. Idiopathic great saphenous phlebosclerosis.

    Directory of Open Access Journals (Sweden)

    Ahmadreza Jodati

    2013-06-01

Arterial sclerosis has been extensively described, but reports on venous sclerosis are very sparse. Phlebosclerosis refers to the thickening and hardening of the venous wall. Despite its morphological similarities with arteriosclerosis and potential morbid consequences, phlebosclerosis has gained only little attention. We report a 72-year-old male with paralysis and atrophy of the right leg due to childhood poliomyelitis who was referred for coronary artery bypass surgery. The great saphenous vein, harvested from the left leg, was a hardened, cord-like, obliterated vessel. Surprisingly, veins harvested from the atrophic limb were normal and were successfully used for grafting.

  15. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  16. Making Psychotherapy Great Again?

    Science.gov (United States)

    Plakun, Eric M

    2017-05-01

    Psychotherapy never stopped being as "great" as other treatments. This column explores the evidence base for both psychotherapy and medications, using depression as a specific example. The limitations are comparable for psychotherapy and medication, with much of the evidence based on small degrees of "statistically significant" rather than "clinically meaningful" change. Our field's biomedical emphasis leads to a false assumption that most patients present with single disorders, when comorbidity is the rule rather than the exception. This false assumption contributes to limitations in the evidence base and in our ability to treat patients optimally.

  17. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  18. Uncertainty in artificial intelligence

    CERN Document Server

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i

  19. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  20. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    Science.gov (United States)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate
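    The Bayesian-decision-analysis leg of the framework, where epistemic uncertainty shrinks as data arrive, can be illustrated with a conjugate normal update; the supply figures, measurement error, and decision threshold below are hypothetical, not the Riyadh case-study values:

```python
# A minimal illustration of updating an epistemic belief with new data
# before an investment decision. All numbers are invented.
import numpy as np
from scipy import stats

# Prior belief about economically recoverable groundwater (years of supply)
prior_mu, prior_sd = 20.0, 8.0

# A new monitoring campaign yields one noisy estimate
obs, obs_sd = 14.0, 4.0

# Conjugate normal-normal update (known observation variance)
post_var = 1.0 / (1.0 / prior_sd**2 + 1.0 / obs_sd**2)
post_mu = post_var * (prior_mu / prior_sd**2 + obs / obs_sd**2)

# Toy decision rule: invest in new supply if P(supply < 15 years) is high
p_short = stats.norm(post_mu, np.sqrt(post_var)).cdf(15.0)
print(f"posterior: {post_mu:.1f} +/- {np.sqrt(post_var):.1f} years; "
      f"P(shortfall) = {p_short:.2f}; invest = {p_short > 0.5}")
```

    Repeating the update as further campaigns report in is what distinguishes reducible epistemic uncertainty, as in the groundwater case, from irreducible stochastic variation, as in rainfall.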

  1. How risk and uncertainty is used in Supply Chain Management: a literature study

    DEFF Research Database (Denmark)

    Bøge Sørensen, Lars

    2004-01-01

Keywords: Supply Chain Management, Risk Management, Supply Chain Risk Management. Abstract: To comply with Supply Chain Management dogma, companies have cut their inventories to a minimum, lead times have been shortened, new suppliers have been chosen and the customer portfolio has been reduced. All...... of these activities impose a great deal of risk on the firms, jeopardizing the survival of entire supply chains. In this article the author intends to investigate and document the use and meaning of Risk and Uncertainty within journals publishing material on Supply Chain Management and Logistics. Subsequently...... suggestions for further research are proposed - the integration of Risk Management into the discipline of Supply Chain Design....

  2. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives, and helps set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change and impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by the model structure choice, rather than by parameter uncertainties.

  3. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified given the uncertainties in dating, calibration, and modeling.

  4. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified given the uncertainties in dating, calibration, and modeling.

  5. Risks, uncertainty, vagueness

    International Nuclear Information System (INIS)

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability, especially in the context of political decision-making. (DG) [de]

  6. Great Britain at CERN

    CERN Multimedia

    2006-01-01

    From 14 to 16 November 2006 Administration Building, Bldg. 60/61 - ground and 1st floor 09.30 - 17.30 Fifteen companies will present their latest technologies at the 'Great Britain at CERN' exhibition. British industry will exhibit products and technologies related to the field of particle physics. The main fields represented will be computing technologies, electrical engineering, electronics, mechanical engineering, vacuum & low temperature technologies and particle detectors. The exhibition is organised by BEAMA Exhibitions (the British Electrotechnical and Allied Manufacturers Association). Below you will find: a list of the exhibitors. A detailed programme will be available in due course: from your Departmental secretariat, from the Reception information desk, Building 33, at the exhibition itself. A detailed list of the companies is available at the following FI link: http://fi-dep.web.cern.ch/fi-dep/structure/memberstates/exhibitions_visits.htm LIST OF EXHIBITORS 3D Metrics Almat...

  7. Great Britain at CERN

    CERN Multimedia

    2006-01-01

    From 14 to 16 November 2006 Administration Building, Bldg. 60/61 - ground and 1st floor 09.30 - 17.30 Fifteen companies will present their latest technologies at the 'Great Britain at CERN' exhibition. British industry will exhibit products and technologies related to the field of particle physics. The main fields represented will be computing technologies, electrical engineering, electronics, mechanical engineering, vacuum & low temperature technologies and particle detectors. The exhibition is organised by BEAMA Exhibitions (the British Electrotechnical and Allied Manufacturers Association). Below you will find: a list of the exhibitors. A detailed programme will be available in due course: from your Departmental secretariat, from the Reception information desk, Building 33, at the exhibition itself. A detailed list of the companies is available at the following FI link: http://fi-dep.web.cern.ch/fi-dep/structure/memberstates/exhibitions_visits.htm LIST OF EXHIBITORS 3D Metrics Alma...

  8. Strategy under uncertainty.

    Science.gov (United States)

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  9. Asphere cross testing: an exercise in uncertainty estimation

    Science.gov (United States)

    Murphy, Paul E.

    2017-10-01

    Aspheric surfaces can provide substantial improvements to optical designs, but they can also be difficult to manufacture cost-effectively. Asphere metrology contributes significantly to this difficulty, especially for high-precision aspheric surfaces. With the advent of computer-controlled fabrication machinery, optical surface quality is chiefly limited by the ability to measure it. Consequently, understanding the uncertainty of surface measurements is of great importance for determining what optical surface quality can be achieved. We measured sample aspheres using multiple techniques: profilometry, null interferometry, and subaperture stitching. We also obtained repeatability and reproducibility (R&R) measurement data by retesting the same aspheres under various conditions. We highlight some of the details associated with the different measurement techniques, especially efforts to reduce bias in the null tests via calibration. We compare and contrast the measurement results, and obtain an empirical view of the measurement uncertainty of the different techniques. We found fair agreement in overall surface form among the methods, but meaningful differences in reproducibility and mid-spatial frequency performance.
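    A minimal variance-components sketch shows how repeatability (within-setup scatter) and reproducibility (between-setup scatter) can be separated from repeated measurements; the simulated surface-RMS data below stand in for real asphere tests and are not the paper's results:

```python
# A minimal method-of-moments R&R sketch: repeated surface-form
# measurements under several setups, split into variance components.
import numpy as np

rng = np.random.default_rng(1)
# rows: 4 measurement setups; cols: 5 repeats of surface RMS error (nm)
data = 12.0 + rng.normal(0, 0.5, (4, 1)) + rng.normal(0, 0.2, (4, 5))

n_setups, n_repeats = data.shape
setup_means = data.mean(axis=1)

# Within-setup variance estimates repeatability
repeatability = data.var(axis=1, ddof=1).mean()

# Between-setup variance, corrected for the repeatability it contains
reproducibility = setup_means.var(ddof=1) - repeatability / n_repeats
reproducibility = max(reproducibility, 0.0)  # clip a negative estimate

print(f"repeatability sigma   = {np.sqrt(repeatability):.3f} nm")
print(f"reproducibility sigma = {np.sqrt(reproducibility):.3f} nm")
```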

  10. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them separately, item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this combined uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the combined uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the combined uncertainty of the geological model; this posterior distribution represents the combined impact of all the uncertain factors on the spatial structure of the model. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
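    Under strong simplifying assumptions, the workflow reduces to a maximum-entropy prior updated by Bayes' rule: for a single uncertain quantity constrained only by a mean and variance, the maximum-entropy prior is Gaussian. The horizon depth, borehole picks, and error levels below are hypothetical:

```python
# A minimal sketch of the described workflow for one uncertain quantity:
# a max-entropy (Gaussian) prior on a horizon depth, updated with noisy
# borehole picks via Bayes' rule on a discrete grid. Values are invented.
import numpy as np

depths = np.linspace(80.0, 120.0, 401)  # candidate horizon depth (m)

# Max-entropy prior given mean 100 m and sd 5 m is the normal distribution
post = np.exp(-0.5 * ((depths - 100.0) / 5.0) ** 2)
post /= post.sum()

# Borehole picks with 2 m measurement error update the belief in turn;
# each update simulates one more uncertainty source being integrated.
for pick in (96.0, 97.5):
    likelihood = np.exp(-0.5 * ((depths - pick) / 2.0) ** 2)
    post *= likelihood
    post /= post.sum()

mean = np.sum(depths * post)
sd = np.sqrt(np.sum((depths - mean) ** 2 * post))
print(f"posterior depth = {mean:.1f} +/- {sd:.1f} m")
```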

  11. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated by measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, non-zero MUF is always occurred because of measurement uncertainty even though the facility is under normal operation condition. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by errors of those measurements. Evaluating MUF uncertainty is essentially required to develop safeguards system including nuclear measurement system in pyroprocessing, which is being developed for reducing radioactive waste from spent fuel in Korea Atomic Energy Research Institute (KAERI). The evaluation code for analyzing MUF uncertainty has been developed and it was verified using sample problem from the IAEA reference. MUF uncertainty can be simply and quickly calculated by using this evaluation code which is made based on graphical user interface for user friendly. It is also expected that the code will make the sensitivity analysis on the MUF uncertainty for the various safeguards systems easy and more systematic. It is suitable for users who want to evaluate the conventional safeguards system as well as to develop a new system for developing facilities

  12. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated by measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, non-zero MUF is always occurred because of measurement uncertainty even though the facility is under normal operation condition. Furthermore, there are many measurements using different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by errors of those measurements. Evaluating MUF uncertainty is essentially required to develop safeguards system including nuclear measurement system in pyroprocessing, which is being developed for reducing radioactive waste from spent fuel in Korea Atomic Energy Research Institute (KAERI). The evaluation code for analyzing MUF uncertainty has been developed and it was verified using sample problem from the IAEA reference. MUF uncertainty can be simply and quickly calculated by using this evaluation code which is made based on graphical user interface for user friendly. It is also expected that the code will make the sensitivity analysis on the MUF uncertainty for the various safeguards systems easy and more systematic. It is suitable for users who want to evaluate the conventional safeguards system as well as to develop a new system for developing facilities.

  13. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-08-03

    Offsetting turbines' yaw orientations from incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, using ten rotor diameter spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine. The OUU solution generally prefers less steering. We then do this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examined how different levels of uncertainty in the inflow direction effect the ratio of the expected values of deterministic and OUU solutions for steering strategies in the large wind farm, assuming the directional uncertainty used to reach said OUU solution (this ratio is defined as the value of the stochastic solution or VSS).

  14. Pacific salmonines in the Great Lakes Basin

    Science.gov (United States)

    Claramunt, Randall M.; Madenjian, Charles P.; Clapp, David; Taylor, William W.; Lynch, Abigail J.; Léonard, Nancy J.

    2012-01-01

    Pacific salmon (genus Oncorhynchus) are a valuable resource, both within their native range in the North Pacific rim and in the Great Lakes basin. Understanding their value from a biological and economic perspective in the Great Lakes, however, requires an understanding of changes in the ecosystem and of management actions that have been taken to promote system stability, integrity, and sustainable fisheries. Pacific salmonine introductions to the Great Lakes are comprised mainly of Chinook salmon, coho salmon, and steelhead and have accounted for 421, 177, and 247 million fish, respectively, stocked during 1966-2007. Stocking of Pacific salmonines has been effective in substantially reducing exotic prey fish abundances in several of the Great Lakes (e.g., lakes Michigan, Huron, and Ontario). The goal of our evaluation was to highlight differences in management strategies and perspectives across the basin, and to evaluate policies for Pacific salmonine management in the Great Lakes. Currently, a potential conflict exists between Pacific salmonine management and native fish rehabilitation goals because of the desire to sustain recreational fisheries and to develop self-sustaining populations of stocked Pacific salmonines in the Great Lakes. We provide evidence that suggests Pacific salmonines have not only become naturalized to the food webs of the Great Lakes, but that their populations (specifically Chinook salmon) may be fluctuating in concert with specific prey (i.e., alewives) whose populations are changing relative to environmental conditions and ecosystem disturbances. Remaining questions, however, are whether or not “natural” fluctuations in predator and prey provide enough “stability” in the Great Lakes food webs, and even more importantly, would a choice by managers to attempt to reduce the severity of predator-prey oscillations be antagonistic to native fish restoration efforts. We argue that, on each of the Great Lakes, managers are pursuing

  15. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying of the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.

  16. Time-Resolved Particle Image Velocimetry Measurements with Wall Shear Stress and Uncertainty Quantification for the FDA Nozzle Model.

    Science.gov (United States)

    Raben, Jaime S; Hariharan, Prasanna; Robinson, Ronald; Malinauskas, Richard; Vlachos, Pavlos P

    2016-03-01

    We present advanced particle image velocimetry (PIV) processing, post-processing, and uncertainty estimation techniques to support the validation of computational fluid dynamics analyses of medical devices. This work is an extension of a previous FDA-sponsored multi-laboratory study, which used a medical device mimicking geometry referred to as the FDA benchmark nozzle model. Experimental measurements were performed using time-resolved PIV at five overlapping regions of the model for Reynolds numbers in the nozzle throat of 500, 2000, 5000, and 8000. Images included a twofold increase in spatial resolution in comparison to the previous study. Data was processed using ensemble correlation, dynamic range enhancement, and phase correlations to increase signal-to-noise ratios and measurement accuracy, and to resolve flow regions with large velocity ranges and gradients, which is typical of many blood-contacting medical devices. Parameters relevant to device safety, including shear stress at the wall and in bulk flow, were computed using radial basis functions. In addition, in-field spatially resolved pressure distributions, Reynolds stresses, and energy dissipation rates were computed from PIV measurements. Velocity measurement uncertainty was estimated directly from the PIV correlation plane, and uncertainty analysis for wall shear stress at each measurement location was performed using a Monte Carlo model. Local velocity uncertainty varied greatly and depended largely on local conditions such as particle seeding, velocity gradients, and particle displacements. Uncertainty in low velocity regions in the sudden expansion section of the nozzle was greatly reduced by over an order of magnitude when dynamic range enhancement was applied. Wall shear stress uncertainty was dominated by uncertainty contributions from velocity estimations, which were shown to account for 90-99% of the total uncertainty. This study provides advancements in the PIV processing methodologies over

  17. Uncertainty in adaptive capacity

    International Nuclear Information System (INIS)

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, and literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  18. Uncertainties about climate

    International Nuclear Information System (INIS)

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents to the global warming theory to put into question the estimations of its future consequences. Is it legitimate to predict the future using the past climate data (well documented up to 100000 years BP) or the climates of other planets, taking into account the impreciseness of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model a so huge and interwoven system for which any exact description has become impossible? Why water and precipitations play such an important role in local and global forecasts, and how should they be treated? This book written by two physicists answers with simpleness these delicate questions in order to give anyone the possibility to build his own opinion about global warming and the need to act rapidly

  19. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    Science.gov (United States)

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty con-nected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncer-tainty of point information originating from spatial generalization. Modified designs of the error ellipse show the po-tential of quantitative and qualitative symbolization and simultaneous point based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the ex-tents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore it is shown how applicable an adapted design of the er-ror ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.

  20. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    Science.gov (United States)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of evidences mostly existing in pineapple disease identification system, a reasoning model based on evidence credibility factor was established. The uncertainty reasoning method is discussed,including: uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multi-evidences and update of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.

  1. The uncertainty principle

    International Nuclear Information System (INIS)

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of differences between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr. Besides the discussion the work is also analyzed. For the discussion of the different aspects of the UP the formalism of Davies and Ludwig is used instead of the more commonly used formalism of Neumann and Dirac. (author). 214 refs.; 23 figs

  2. Uncertainty in artificial intelligence

    CERN Document Server

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty.A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  3. Decision Making Under Uncertainty

    Science.gov (United States)

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  4. Economic uncertainty principle?

    OpenAIRE

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.; Principe d'incertitude économique? Le principe économique d'incertitude (cachée) est présenté. De nouvelles formules de chances sont offertes. Les exemples de solutions des trois types de problèmes fondamentaux sont reconsidérés.

  5. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  6. Calibration Under Uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the %22true%22 deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

  7. Participation under Uncertainty

    International Nuclear Information System (INIS)

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  8. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as their propagation to dose and risk results is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution function. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median as well as their ratios. The report concludes that provisionally, due to its better robustness, such estimation as the 90th percentile may be substituted to the arithmetic mean for comparison of the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site

  9. State-independent uncertainty relations and entanglement detection

    Science.gov (United States)

    Qian, Chen; Li, Jun-Li; Qiao, Cong-Feng

    2018-04-01

    The uncertainty relation is one of the key ingredients of quantum theory. Despite the great efforts devoted to this subject, most of the variance-based uncertainty relations are state-dependent and suffering from the triviality problem of zero lower bounds. Here we develop a method to get uncertainty relations with state-independent lower bounds. The method works by exploring the eigenvalues of a Hermitian matrix composed by Bloch vectors of incompatible observables and is applicable for both pure and mixed states and for arbitrary number of N-dimensional observables. The uncertainty relation for the incompatible observables can be explained by geometric relations related to the parallel postulate and the inequalities in Horn's conjecture on Hermitian matrix sum. Practical entanglement criteria are also presented based on the derived uncertainty relations.

  10. Review: The Great Gatsby

    Directory of Open Access Journals (Sweden)

    Antonia de Jesus Sales

    2016-08-01

    Full Text Available A presente resenha busca discutir a tradução de The Great Gatsby para o contexto brasileiro. Diversas traduções foram feitas, em diversas épocas e com repercussão positiva no contexto brasileiro. Para o presente estudo, foi observada a tradução de Vanessa Bárbara, de 2011. Nesse sentido, o aspecto biográficos do autor e a forma como se apresentam os personagens na obra são fatores de cotejamento na obra original e na tradução brasileira. Francis Scott Key Fitzgerald (1896 – 1940 é famoso por ter em suas obras traços biográficos, algo que certamente influencia o leitor que adentra a sua obra. Quanto à recepção de O Grande Gatsby no contexto brasileiro, há que se considerar que O Grande Gatsby teve diversas traduções no Brasil. Depois dessa tradução de Vanessa Bárbara, em 2011, outras três vieram em 2013, juntamente com o filme. Há que considerar os aspectos comerciais embutidos nessas traduções e que muito corroboram para o resultado final. Prova disso são as capas, que são sempre diferenciadas em cada edição lançada. O tradutor nem sempre pode opinar sobre questões como estas. A tradução, a meu ver, é uma obra de qualidade, visto que a tradutora buscou ser fiel, sem dificultar a interpretação da obra para o leitor.

  11. Review: The Great Gatsby

    Directory of Open Access Journals (Sweden)

    Antonia de Jesus Sales

    2016-05-01

    Full Text Available A presente resenha busca discutir a tradução de The Great Gatsby para o contexto brasileiro. Diversas traduções foram feitas, em diversas épocas e com repercussão positiva no contexto brasileiro. Para o presente estudo, foi observada a tradução de Vanessa Bárbara, de 2011. Nesse sentido, o aspecto biográficos do autor e a forma como se apresentam os personagens na obra são fatores de cotejamento na obra original e na tradução brasileira. Francis Scott Key Fitzgerald (1896 – 1940 é famoso por ter em suas obras traços biográficos, algo que certamente influencia o leitor que adentra a sua obra. Quanto à recepção de O Grande Gatsby no contexto brasileiro, há que se considerar que O Grande Gatsby teve diversas traduções no Brasil. Depois dessa tradução de Vanessa Bárbara, em 2011, outras três vieram em 2013, juntamente com o filme. Há que considerar os aspectos comerciais embutidos nessas traduções e que muito corroboram para o resultado final. Prova disso são as capas, que são sempre diferenciadas em cada edição lançada. O tradutor nem sempre pode opinar sobre questões como estas. A tradução, a meu ver, é uma obra de qualidade, visto que a tradutora buscou ser fiel, sem dificultar a interpretação da obra para o leitor.

  12. Towards minimizing measurement uncertainty in total petroleum hydrocarbon determination by GC-FID

    Energy Technology Data Exchange (ETDEWEB)

    Saari, E.

    2009-07-01

    Despite tightened environmental legislation, spillages of petroleum products remain a serious problem worldwide. The environmental impacts of these spillages are always severe and reliable methods for the identification and quantitative determination of petroleum hydrocarbons in environmental samples are therefore needed. Great improvements in the definition and analysis of total petroleum hydrocarbons (TPH) were finally introduced by international organizations for standardization in 2004. This brought some coherence to the determination and, nowadays, most laboratories seem to employ ISO/DIS 16703:2004, ISO 9377-2:2000 and CEN prEN 14039:2004:E draft international standards for analysing TPH in soil. The implementation of these methods, however, usually fails because the reliability of petroleum hydrocarbon determination has proved to be poor.This thesis describes the assessment of measurement uncertainty for TPH determination in soil. Chemometric methods were used to both estimate the main uncertainty sources and identify the most significant factors affecting these uncertainty sources. The method used for the determinations was based on gas chromatography utilizing flame ionization detection (GC-FID).Chemometric methodology applied in estimating measurement uncertainty for TPH determination showed that the measurement uncertainty is in actual fact dominated by the analytical uncertainty. Within the specific concentration range studied, the analytical uncertainty accounted for as much as 68-80% of the measurement uncertainty. The robustness of the analytical method used for petroleum hydrocarbon determination was then studied in more detail. A two-level Plackett-Burman design and a D-optimal design were utilized to assess the main analytical uncertainty sources of the sample treatment and GC determination procedures. It was also found that the matrix-induced systematic error may also significantly reduce the reliability of petroleum hydrocarbon determination

  13. Great Lakes Environmental Database (GLENDA)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Great Lakes Environmental Database (GLENDA) houses environmental data on a wide variety of constituents in water, biota, sediment, and air in the Great Lakes area.

  14. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: - variability: uncertainty due to heterogeneity, - lack of knowledge: uncertainty due to ignorance. It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty by the probability theory and the lack of knowledge uncertainty by the fuzzy theory. He cautioned, however, against the systematic use of probability theory which may lead to unjustifiable and illegitimate precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem was getting more and more complex in terms of number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory

  15. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  16. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  17. On Commitments and Other Uncertainty Reduction Tools in Joint Action

    Directory of Open Access Journals (Sweden)

    Michael John

    2015-01-01

    Full Text Available In this paper, we evaluate the proposal that a central function of commitments within joint action is to reduce various kinds of uncertainty, and that this accounts for the prevalence of commitments in joint action. While this idea is prima facie attractive, we argue that it faces two serious problems. First, commitments can only reduce uncertainty if they are credible, and accounting for the credibility of commitments proves not to be straightforward. Second, there are many other ways in which uncertainty is commonly reduced within joint actions, which raises the possibility that commitments may be superfluous. Nevertheless, we argue that the existence of these alternative uncertainty reduction processes does not make commitments superfluous after all but, rather, helps to explain how commitments may contribute in various ways to uncertainty reduction.

  18. Attitudes, beliefs, uncertainty and risk

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, Geoffrey [Down Park Place, Crawley Down (United Kingdom)

    2001-07-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course chosen will be more favourable

  19. Attitudes, beliefs, uncertainty and risk

    Energy Technology Data Exchange (ETDEWEB)

    Greenhalgh, Geoffrey [Down Park Place, Crawley Down (United Kingdom)

    2001-07-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course

  20. Attitudes, beliefs, uncertainty and risk

    International Nuclear Information System (INIS)

    Greenhalgh, Geoffrey

    2001-01-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course chosen will be more favourable

  1. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    NARCIS (Netherlands)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  2. Paradoxical effects of compulsive perseveration : Sentence repetition causes semantic uncertainty

    NARCIS (Netherlands)

    Giele, Catharina L.; van den Hout, Marcel A.; Engelhard, Iris M.; Dek, Eliane C P

    2014-01-01

    Many patients with obsessive compulsive disorder (OCD) perform perseverative checking behavior to reduce uncertainty, but studies have shown that this ironically increases uncertainty. Some patients also tend to perseveratively repeat sentences. The aim of this study was to examine whether sentence

  3. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    International Nuclear Information System (INIS)

    Price, Oliver R.; Munday, Dawn K.; Whelan, Mick J.; Holt, Martin S.; Fox, Katharine K.; Morris, Gerard; Young, Andrew R.

    2009-01-01

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important these models are evaluated and their sensitivities to input variables understood. This study had two primary objectives: evaluate GREAT-ER model performance, comparing simulated modelled predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations, for four rivers in the UK, and investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However it is insensitive to the form of distributions used to describe chemical usage and removal rate in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.

  4. Data requirements of GREAT-ER: Modelling and validation using LAS in four UK catchments

    Energy Technology Data Exchange (ETDEWEB)

    Price, Oliver R., E-mail: oliver.price@unilever.co [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Munday, Dawn K. [Safety and Environmental Assurance Centre, Unilever, Colworth Science Park, Sharnbrook, Bedfordshire MK44 1LQ (United Kingdom); Whelan, Mick J. [Department of Natural Resources, School of Applied Sciences, Cranfield University, College Road, Cranfield, Bedfordshire MK43 0AL (United Kingdom); Holt, Martin S. [ECETOC, Ave van Nieuwenhuyse 4, Box 6, B-1160 Brussels (Belgium); Fox, Katharine K. [85 Park Road West, Birkenhead, Merseyside CH43 8SQ (United Kingdom); Morris, Gerard [Environment Agency, Phoenix House, Global Avenue, Leeds LS11 8PG (United Kingdom); Young, Andrew R. [Wallingford HydroSolutions Ltd, Maclean building, Crowmarsh Gifford, Wallingford, Oxon OX10 8BB (United Kingdom)

    2009-10-15

    Higher-tier environmental risk assessments on 'down-the-drain' chemicals in river networks can be conducted using models such as GREAT-ER (Geography-referenced Regional Exposure Assessment Tool for European Rivers). It is important these models are evaluated and their sensitivities to input variables understood. This study had two primary objectives: evaluate GREAT-ER model performance, comparing simulated modelled predictions for LAS (linear alkylbenzene sulphonate) with measured concentrations, for four rivers in the UK, and investigate model sensitivity to input variables. We demonstrate that the GREAT-ER model is very sensitive to variability in river discharges. However it is insensitive to the form of distributions used to describe chemical usage and removal rate in sewage treatment plants (STPs). It is concluded that more effort should be directed towards improving empirical estimates of effluent load and reducing uncertainty associated with usage and removal rates in STPs. Simulations could be improved by incorporating the effect of river depth on dissipation rates. - Validation of GREAT-ER.

  5. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995. According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with double glazing window that were analyzed by a Round Robin Test (RRT, conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy. The single number quantities and their uncertainties were evaluated in both narrow and enlarged range and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with single glazing window. The results obtained in these RRTs were compared with other results from literature, which confirm the increase of the uncertainty of single number quantities due to the low frequencies extension. Having stated the measurement uncertainty for a single measurement, in building acoustics, it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial type building consisting of 47 building units. It was found that the greatest variability is observed in the façade and it depends on both the great variability of window’s typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single

  6. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    geophysical, future energy demand, and mitigation technology uncertainties. This information provides central information for policy making, since it helps to understand the relationship between mitigation costs and their potential to reduce the risk of exceeding 2°C, or other temperature limits like 3°C or 1.5°C, under a wide range of scenarios.

  7. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

    Full Text Available Providing forecasts for flow rates and water levels during floods have to be associated with uncertainty estimates. The forecast sources of uncertainty are plural. For hydrological forecasts (rainfall-runoff performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service within a range between a lowest and a highest predicted discharge. These two values define an uncertainty interval for the rainfall variable provided on a given watershed. The second source of uncertainty is related to the complexity of the modeled system (the catchment impacted by the hydro-meteorological phenomenon, the number of variables that may describe the problem and their spatial and time variability. The model simplifies the system by reducing the number of variables to a few parameters. Thus it contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and levels of water making a forecast based on possible rainfalls provided by forcing and uncertainty model. The model uncertainty is here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic. This method allows to evaluate the prediction uncertainty range. The Flood Forecasting Service of Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed’s area is 310 km2. Its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method presents the advantage to be easily implemented. Moreover, it permits to be carried out

  8. Optimization under Uncertainty

    KAUST Repository

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight about robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so called the reliability based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After that the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.

  9. Optimizing production under uncertainty

    DEFF Research Database (Denmark)

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses po-tential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept...... of state-contingent production functions and a definition of inputs including both sort of input, activity and alloca-tion technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses...

  10. Commonplaces and social uncertainty

    DEFF Research Database (Denmark)

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer...... an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating...

  11. Principles of Uncertainty

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  12. Mathematical Analysis of Uncertainty

    Directory of Open Access Journals (Sweden)

    Angel GARRIDO

    2016-01-01

    Full Text Available Classical Logic showed early its insufficiencies for solving AI problems. The introduction of Fuzzy Logic aims at this problem. There have been research in the conventional Rough direction alone or in the Fuzzy direction alone, and more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of Uncertainty, as the Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  13. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

    Our climate is changing. This is certain. Less certain, however, is the timing and magnitude of climate change, and the cost of transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, where capital stock is intensive and long-lived, those risks rank among the biggest and can create an incentive to delay investment. Our analysis results show that the risk premiums of climate change uncertainty can add 40% of construction costs of the plant for power investors, and 10% of price surcharges for the electricity end-users. This publication tells what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, this publication also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments to improve climate policy mechanisms and create more certainty for power investors.

  14. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

    Full Text Available According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories—people easily discriminate between certain categories (e.g., materials such as plastic and paper but not between others (e.g., packaging and “non-packaging”. Thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Challenged habits—source separation in everyday life is habitual, but when a habit is challenged, by a particular element or feature of the waste system, uncertainty can arise. Lacking fractions—some kinds of items cannot be left for recycling and this makes waste collection incomplete from the user’s point of view and in turn lowers the credibility of the system. Missing or contradictory rules of thumb—the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  15. Investment, regulation, and uncertainty

    Science.gov (United States)

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straight forward, certain conditions need to exist such that investments can be facilitated. The principle requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases.   This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  16. Probabilistic Mass Growth Uncertainties

    Science.gov (United States)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of a NASA on-going effort to publish mass growth datasheet for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper will also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  17. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
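
    The two-loop structure described above is easy to sketch. The following is a minimal illustration, not the authors' code: the vital rate, its standard error, the environmental variance, and the quasi-extinction threshold are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical mean growth rate and its estimation error (parametric uncertainty)
GROWTH_MEAN, GROWTH_SE = 0.98, 0.04
SIGMA_ENV = 0.10                  # temporal (environmental) variance, assumed known
N0, YEARS, REPS = 200, 50, 10_000
QUASI_EXTINCTION = 20             # hypothetical quasi-extinction threshold

extinct = 0
for _ in range(REPS):                              # replication loop:
    growth = rng.normal(GROWTH_MEAN, GROWTH_SE)    # draw ONE parameter value per replicate
    n = N0
    for _ in range(YEARS):                         # time-step loop:
        annual = rng.lognormal(np.log(growth), SIGMA_ENV)  # temporal variance each year
        n *= annual
        if n < QUASI_EXTINCTION:
            extinct += 1
            break

print(f"Quasi-extinction probability: {extinct / REPS:.3f}")
```

    Dropping the outer parameter draw (fixing the growth rate at its mean) reproduces the common practice the abstract criticizes and typically yields a lower, overconfident extinction risk.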

  18. Assessing concentration uncertainty estimates from passive microwave sea ice products

    Science.gov (United States)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  19. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    International Nuclear Information System (INIS)

    Wagner, Ryan; Raman, Arvind; Moon, Robert; Pratt, Jon; Shaw, Gordon

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7–20 GPa. A key result is that multiple replicates of force–distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.

  20. Embracing uncertainty in applied ecology.

    Science.gov (United States)

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  1. Oil price uncertainty in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  2. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  3. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be tested and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding the model residuals. Statistical transformations, such as the use of the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to a wrong uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation for environmental water quality models. To this end, five cases were considered, one of which was the “real” residuals distribution (i.e., drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), which is an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the residuals distribution.

  4. Intimate Partner Violence in the Great Recession.

    Science.gov (United States)

    Schneider, Daniel; Harknett, Kristen; McLanahan, Sara

    2016-04-01

    In the United States, the Great Recession was marked by severe negative shocks to labor market conditions. In this study, we combine longitudinal data from the Fragile Families and Child Wellbeing Study with U.S. Bureau of Labor Statistics data on local area unemployment rates to examine the relationship between adverse labor market conditions and mothers' experiences of abusive behavior between 2001 and 2010. Unemployment and economic hardship at the household level were positively related to abusive behavior. Further, rapid increases in the unemployment rate increased men's controlling behavior toward romantic partners even after we adjust for unemployment and economic distress at the household level. We interpret these findings as demonstrating that the uncertainty and anticipatory anxiety that go along with sudden macroeconomic downturns have negative effects on relationship quality, above and beyond the effects of job loss and material hardship.

  5. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  6. Uncertainty as Certainty

    Science.gov (United States)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  7. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered not very trustworthy. This was generally attributable to the interpretation of the events being tied to attitudes on the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: There is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information and prefer to draw their information from various sources representing different positions. (orig.) [de

  8. DOD ELAP Lab Uncertainties

    Science.gov (United States)

    2012-03-01

    Laboratories – ISO/IEC 17025; Inspection Bodies – ISO/IEC 17020; Reference Material Producers – ISO Guide 34; certification to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US Automotive), etc. DoD QSM 4.2 standard, based on ISO/IEC 17025:2005; each has uncertainty requirements. Training programs: IPV6, NLLAP, NEFAP. Certification Bodies – ISO/IEC 17021 (accreditation for management systems).

  9. Traceability and Measurement Uncertainty

    DEFF Research Database (Denmark)

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project ‘Metro-E-Learn: European e-Learning in Manufacturing Metrology’, an EU project under the programme SOCRATES MINERVA (ODL and ICT in Education), Contract No: 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen. The project partnership (composed of 7 partners in 5 countries, thus covering a real European spread in high-tech production technology) aims to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach. Topics covered include: machine tool testing; the role of manufacturing metrology for QM; inspection planning; quality management of measurements, incl. documentation; and advanced manufacturing measurement technology. The present report represents section 2 – Traceability and Measurement Uncertainty – of the e-learning system.

  10. Decision making under uncertainty

    International Nuclear Information System (INIS)

    Cyert, R.M.

    1989-01-01

    This paper reports on ways of improving the reliability of products and systems, which this country must do if we are to survive as a first-rate industrial power. The use of statistical techniques has, since the 1920s, been viewed as one of the methods for testing quality and estimating the level of quality in a universe of output. Statistical quality control is not relevant, generally, to improving systems in an industry like yours, but certainly the use of probability concepts is of significance. In addition, when it is recognized that part of the problem involves making decisions under uncertainty, it becomes clear that techniques such as sequential decision making and Bayesian analysis become major methodological approaches that must be utilized

  11. Sustainability and uncertainty

    DEFF Research Database (Denmark)

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement. Another line (top-down) takes an economical interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations as its starting point. It then measures sustainability at the level of society. Even given a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable must be taken under uncertainty.

  12. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    Comparative neutron activation analysis is a multielemental primary analytical technique that may be used in a rather broad spectrum of matrices with minimal to no sample pre-processing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out, and the concentration of that element can be determined simply from the activities and masses of the comparator and the sample, the concentration of the chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms in the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)

  13. Uncertainty analysis in comparative NAA applied to geological and biological matrices

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Ticianelli, Regina B.; Lange, Camila N.; Favaro, Deborah I.T.; Figueiredo, Ana M.G.

    2015-01-01

    Comparative neutron activation analysis is a multielemental primary analytical technique that may be used in a rather broad spectrum of matrices with minimal to no sample pre-processing. Although the total activation of a chemical element in a sample depends on a rather large set of parameters, when the sample is irradiated together with a well-known comparator, most of these parameters cancel out, and the concentration of that element can be determined simply from the activities and masses of the comparator and the sample, the concentration of the chemical element in the comparator, the half-life of the formed radionuclide, and the time between counting the sample and the comparator. This simplification greatly reduces not only the calculations required, but also the uncertainty associated with the measurement; nevertheless, a cautious analysis must be carried out in order to make sure all relevant uncertainties are properly treated, so that the final result can be as representative of the measurement as possible. In this work, this analysis was performed for geological matrices, where concentrations of the nuclides of interest are rather high, but so are the density and average atomic number of the sample, as well as for a biological matrix, in order to allow for a comparison. The results show that the largest part of the uncertainty comes from the activity measurements and from the concentration of the comparator, and that while the influence of time-related terms in the final uncertainty can be safely neglected, the uncertainty in the masses may be relevant under specific circumstances. (author)
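
    The comparator relationship summarized in the two records above can be written out as a short function. This is a sketch under assumed inputs, not code from the paper: the decay correction uses the half-life of the formed radionuclide and the delay between the two counts, and independent inputs are assumed so that relative uncertainties combine in quadrature.

```python
import numpy as np

def naa_concentration(A_s, u_A_s, m_s, u_m_s,
                      A_c, u_A_c, m_c, u_m_c,
                      c_c, u_c_c, half_life, dt):
    """Sample concentration by comparative NAA, with first-order
    uncertainty propagation (independent inputs, quadrature)."""
    lam = np.log(2.0) / half_life
    decay = np.exp(lam * dt)      # corrects the sample count back to the comparator's counting time
    c_s = c_c * (A_s / m_s) / (A_c / m_c) * decay
    # All factors are multiplicative, so relative uncertainties add in quadrature.
    rel = np.sqrt((u_A_s / A_s) ** 2 + (u_m_s / m_s) ** 2 +
                  (u_A_c / A_c) ** 2 + (u_m_c / m_c) ** 2 +
                  (u_c_c / c_c) ** 2)
    return c_s, c_s * rel

# Hypothetical numbers: activities in counts/s, masses in g,
# comparator concentration 50 mg/kg, half-life and delay in hours.
c, u = naa_concentration(1200, 35, 0.100, 0.001,
                         980, 30, 0.050, 0.001,
                         50.0, 1.5, half_life=26.0, dt=2.0)
print(f"c = {c:.1f} +/- {u:.1f} mg/kg")
```

    Consistent with the records' conclusion, the time terms enter only through the (well-known) decay factor, while the activity and comparator-concentration terms dominate the combined uncertainty.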

  14. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Directory of Open Access Journals (Sweden)

    A. E. Sikorska

    2012-04-01

    Full Text Available Urbanization and the resulting land-use change strongly affect the water cycle and runoff processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute most to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
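
    For evenly spaced observations, a continuous-time autoregressive error model reduces to a discrete AR(1) process on the residuals, whose likelihood can be sketched as follows. This is an illustration, not the authors' implementation; the rainfall-runoff model is replaced by synthetic residuals and all parameter values are hypothetical.

```python
import numpy as np

def ar1_loglik(residuals, phi, sigma):
    """Log-likelihood of residuals under a stationary AR(1) error model:
    e_t = phi * e_{t-1} + eta_t,  eta_t ~ N(0, sigma^2),  |phi| < 1."""
    e = np.asarray(residuals)
    var0 = sigma**2 / (1.0 - phi**2)          # stationary variance for the first residual
    ll = -0.5 * (np.log(2 * np.pi * var0) + e[0]**2 / var0)
    innov = e[1:] - phi * e[:-1]              # one-step-ahead innovations
    ll += np.sum(-0.5 * (np.log(2 * np.pi * sigma**2) + innov**2 / sigma**2))
    return ll

# residuals = observed flow - simulated flow from the (placeholder) runoff model
rng = np.random.default_rng(0)
resid = rng.normal(0, 0.5, size=200)
print(ar1_loglik(resid, phi=0.8, sigma=0.5))
```

    In a full Bayesian scheme a likelihood of this kind replaces the usual independent-error likelihood inside the sampler, which is what allows the prediction intervals to reflect autocorrelated model-structure deficits.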

  15. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  16. The Great London Smog of 1952.

    Science.gov (United States)

    Polivka, Barbara J

    2018-04-01

    The Great London Smog of December 1952 lasted five days and killed up to 12,000 people. The smog developed primarily because of extensive burning of high-sulfur coal. The health effects were both immediate and long lasting, with a recent study revealing an increased likelihood of childhood asthma development in those exposed to the Great Smog while in utero or during their first year of life. Subsequent pollution legislation, including the U.S. Clean Air Act and its amendments, has demonstrably reduced air pollution and positively impacted health outcomes. With poor air quality events like the Great Smog continuing to occur today, nurses need to be aware of the impact such environmental disasters can have on human health.

  17. ["Great jobs"-also in psychiatry?].

    Science.gov (United States)

    Spiessl, H; Hübner-Liebermann, B

    2003-09-01

    Against the background of an emerging shortage of psychiatrists, results from interviews with 112 employees of an automotive company on the topic "Great Job" are presented to discuss their relevance to psychiatry. The interviews were analysed by means of a qualitative content analysis. Most employees assigned importance to great pay, constructive collaboration with colleagues, and work appealing to personal interests. Further statements particularly relevant to psychiatry were: successful career, flexible working hours, manageable job, work-life balance, well-founded training, no bureaucracy within the company, and personal status in society. The well-known economic restrictions in health care and the still negative attitude towards psychiatry currently reduce the attraction of psychiatry as a profession. From the viewpoint of personnel management, the attractors of a great job revealed in this study are proposed as important clues for the recruitment of medical students for psychiatry and the development of psychiatric staff.

  18. Essays on model uncertainty in financial models

    NARCIS (Netherlands)

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  19. Uncertainty-embedded dynamic life cycle sustainability assessment framework: An ex-ante perspective on the impacts of alternative vehicle options

    International Nuclear Information System (INIS)

    Onat, Nuri Cihat; Kucukvar, Murat; Tatari, Omer

    2016-01-01

    Alternative vehicle technologies have great potential to minimize transportation-related environmental impacts, reduce the reliance of the U.S. on imported petroleum, and increase energy security. However, they introduce new uncertainties related to their environmental, economic, and social impacts, and certain challenges for widespread adoption. In this study, a novel method, the uncertainty-embedded dynamic life cycle sustainability assessment framework, is developed to address both methodological challenges and uncertainties in transportation sustainability research. The proposed approach provides a more comprehensive, system-based sustainability assessment framework by capturing the dynamic relations among the parameters within the U.S. transportation system as a whole with respect to its environmental, social, and economic impacts. Using multivariate uncertainty analysis, the likelihood of the impact reduction potentials of different vehicle types, as well as the behavioral limits of the sustainability potentials of each vehicle type, are analyzed. Seven sustainability impact categories are dynamically quantified for four different vehicle types (internal combustion, hybrid, plug-in hybrid, and battery electric vehicles) from 2015 to 2050. Although the impacts of electric vehicles have the largest uncertainty, they are expected (90% confidence) to be the best alternative in the long term for reducing human health impacts and air pollution from transportation. While results based on deterministic (average) values indicate that electric vehicles have the greater potential for reducing greenhouse gas emissions, plug-in hybrid vehicles have the largest potential according to the results with a 90% confidence interval. - Highlights: • An uncertainty-embedded dynamic sustainability assessment framework is developed. • Methodological challenges and uncertainties are addressed. • Seven impact categories are quantified for four different vehicle types.

  20. Technical note: Design flood under hydrological uncertainty

    Science.gov (United States)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis and typically neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.
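
    The driver of the correction term, namely that the spread of the design estimate shrinks with record length, can be reproduced with a stylized bootstrap. This sketch is not the UNCODE procedure itself: a Gumbel flood frequency curve is fitted by the method of moments to synthetic records of two lengths, and the spread of the 100-year quantile is compared; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 100                                  # return period (years)
p = 1 - 1 / T
true_loc, true_scale = 100.0, 30.0       # hypothetical Gumbel parameters

def gumbel_quantile(loc, scale, p):
    return loc - scale * np.log(-np.log(p))

for n in (20, 80):                       # short vs long flood record
    estimates = []
    for _ in range(5000):
        sample = rng.gumbel(true_loc, true_scale, size=n)
        scale = np.std(sample, ddof=1) * np.sqrt(6) / np.pi   # method-of-moments fit
        loc = np.mean(sample) - 0.5772 * scale
        estimates.append(gumbel_quantile(loc, scale, p))
    q = np.percentile(estimates, [5, 95])
    print(f"n={n:3d}: 100-yr flood 5-95% range = [{q[0]:.0f}, {q[1]:.0f}]")
```

    A correction coefficient in the spirit of the paper would inflate the uncertainty-free quantile more for the short record than for the long one, mirroring the narrowing of this interval.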

  1. A new uncertainty importance measure

    International Nuclear Information System (INIS)

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures and a moment independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures
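
    The moment-independent idea can be illustrated with a crude estimator: compare the unconditional output density with the density conditional on an input falling in a narrow slice, and average the resulting L1 shifts over slices. A toy sketch with histogram-based densities and a made-up model, not the paper's estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
N, BINS, SLICES = 200_000, 60, 20

x1 = rng.normal(0, 1, N)
x2 = rng.normal(0, 1, N)
y = x1 + 0.3 * x2**2                 # toy model: x1 matters more than x2

edges = np.histogram_bin_edges(y, bins=BINS)
f_y, _ = np.histogram(y, bins=edges, density=True)

def delta(x, y):
    """delta ~ 0.5 * E_x[ integral |f_Y - f_{Y|x}| dy ], slice-averaged."""
    order = np.argsort(x)
    shifts = []
    for chunk in np.array_split(order, SLICES):   # condition on x falling in a slice
        f_cond, _ = np.histogram(y[chunk], bins=edges, density=True)
        shifts.append(0.5 * np.sum(np.abs(f_y - f_cond) * np.diff(edges)))
    return np.mean(shifts)

print(f"delta(x1) ~ {delta(x1, y):.2f}, delta(x2) ~ {delta(x2, y):.2f}")
```

    Because the indicator compares whole distributions, it remains meaningful for skewed or multimodal outputs where a single variance-based measure can mislead.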

  2. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows us to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects.

  3. Additivity of entropic uncertainty relations

    Directory of Open Access Journals (Sweden)

    René Schwonnek

    2018-03-01

    Full Text Available We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
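
    For two projective measurements, the Maassen and Uffink bound reads H(P) + H(Q) >= -2 log2 max_{i,j} |<a_i|b_j>|, and it can be checked numerically for a single qubit. A minimal sketch assuming base-2 logarithms, a pure state, and the computational and Hadamard bases:

```python
import numpy as np

def shannon(p):
    p = p[p > 1e-12]
    return -np.sum(p * np.log2(p))

# Two measurement bases for a qubit: computational and Hadamard (mutually unbiased)
A = np.eye(2, dtype=complex)
B = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

c = max(abs(np.vdot(A[:, i], B[:, j])) for i in range(2) for j in range(2))
bound = -2 * np.log2(c)                       # Maassen-Uffink lower bound = 1 bit here

psi = np.array([np.cos(0.3), np.sin(0.3)], dtype=complex)   # arbitrary pure state
P = np.abs(A.conj().T @ psi) ** 2             # outcome probabilities in basis A
Q = np.abs(B.conj().T @ psi) ** 2             # outcome probabilities in basis B

print(f"H(P)+H(Q) = {shannon(P) + shannon(Q):.3f} >= bound = {bound:.3f}")
```

    The additivity result of the paper implies that, for product measurements on a multipartite system, the global bound is simply the sum of such local bounds.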

  4. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  5. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  6. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  7. The Uncertainty of Measurement Results

    Energy Technology Data Exchange (ETDEWEB)

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)
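
    A worked example of the kind of component evaluation described here, with all numbers hypothetical: the Type A (repeatability) standard uncertainty of the mean of n replicates is s/sqrt(n), and it combines in quadrature with a Type B component such as a calibration term.

```python
import numpy as np

# Five replicate determinations of an analyte content (g/kg), hypothetical
replicates = np.array([10.12, 10.08, 10.15, 10.05, 10.10])

mean = replicates.mean()
u_rep = replicates.std(ddof=1) / np.sqrt(len(replicates))  # Type A: repeatability of the mean
u_cal = 0.03                                               # Type B: calibration, from certificate

u_combined = np.sqrt(u_rep**2 + u_cal**2)
U_expanded = 2 * u_combined                                # coverage factor k = 2 (~95 %)

print(f"result = {mean:.3f} +/- {U_expanded:.3f} g/kg (k=2)")
```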

  8. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137 Cs and 131 I in the human food chain are carried out on the basis of a statistical analysis of data reported by the literature. The uncertainty analysis offers the oppotunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non radioactive substances in the environment mainly in relation to the dry and wet deposition

  9. What Caused the Great Depression?

    Science.gov (United States)

    Caldwell, Jean; O'Driscoll, Timothy G.

    2007-01-01

    Economists and historians have struggled for almost 80 years to account for the American Great Depression, which began in 1929 and lasted until the early years of World War II. In this article, the authors discuss three major schools of thought on the causes of the Great Depression and the long failure of the American economy to return to full…

  10. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
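
    The sampling side of such a study can be illustrated with a stylized calculation. Everything below is a placeholder, not the paper's data or model: three made-up resonances, an assumed 5% width uncertainty mapped onto a bounded beta distribution, and a lumped constant standing in for the number density and moderating power.

```python
import numpy as np

rng = np.random.default_rng(3)

# Placeholder resonance data: (E_r [eV], Gamma_n [meV], Gamma_gamma [meV])
resonances = [(21.8, 2.1, 24.5), (23.5, 3.9, 26.6), (59.5, 3.5, 23.4)]
REL_UNC = 0.05            # assumed 5 % relative width uncertainty
C = 50.0                  # lumped constant (number density / moderating power), placeholder

def sample_width(nominal):
    # beta(2, 2) scaled onto nominal * (1 +/- 3*REL_UNC): symmetric, bounded
    return nominal * (1 + 3 * REL_UNC * (2 * rng.beta(2, 2) - 1))

p_samples = []
for _ in range(20_000):
    integral = 0.0
    for e_r, g_n, g_g in resonances:
        gn, gg = sample_width(g_n), sample_width(g_g)
        integral += gn * gg / (gn + gg) / e_r**2   # narrow-resonance-style contribution
    p_samples.append(np.exp(-C * integral))        # stylized escape probability

p = np.array(p_samples)
print(f"escape probability: mean {p.mean():.4f}, std {p.std():.4f}")
```

    A polynomial chaos expansion, as used in the paper, would replace this brute-force loop with a spectral representation of the same input distributions, converging with far fewer model evaluations.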

  11. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
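
    The auxiliary-variable idea can be sketched in a few lines: a uniform auxiliary variable is mapped through the probability integral transform to the epistemic distribution of a distribution parameter, after which aleatory sampling proceeds in the same flat loop. A minimal sketch with hypothetical numbers, not the paper's examples:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
N = 500_000

# Limit state g = R - S: failure when capacity R < load S.
# Aleatory: R ~ Normal(mu_R, 10). Epistemic: mu_R itself uncertain.
u = rng.uniform(size=N)                          # auxiliary variable (epistemic dimension)
mu_R = stats.norm.ppf(u, loc=100.0, scale=5.0)   # epistemic distribution of the mean capacity
R = rng.normal(mu_R, 10.0)                       # aleatory draw, conditional on mu_R
S = rng.normal(70.0, 8.0, size=N)                # load, aleatory

print(f"failure probability (both uncertainties mixed): {np.mean(R < S):.4f}")
```

    Binning the results on the auxiliary variable recovers a family of conditional failure probabilities, so the epistemic and aleatory contributions can still be separated after the single run.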

  12. Simplified propagation of standard uncertainties

    International Nuclear Information System (INIS)

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper
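
    The subgrouping shortcut amounts to: combine absolute standard uncertainties in quadrature across additive steps, and relative ones across multiplicative steps. A sketch for a standard prepared by dissolving a weighed mass in a measured volume (all values hypothetical):

```python
import numpy as np

# Standard prepared as: concentration = purity * (mass_net / volume)
# mass_net = gross - tare  -> additive step: combine ABSOLUTE uncertainties
gross, u_gross = 5.2480, 0.0002   # g
tare,  u_tare  = 1.0210, 0.0002   # g
mass = gross - tare
u_mass = np.hypot(u_gross, u_tare)            # quadrature of absolute terms

# concentration -> multiplicative step: combine RELATIVE uncertainties
purity, u_purity = 0.9990, 0.0005
volume, u_volume = 1.0000, 0.0008             # L
conc = purity * mass / volume
rel = np.sqrt((u_mass / mass)**2 + (u_purity / purity)**2 + (u_volume / volume)**2)

print(f"c = {conc:.4f} +/- {conc * rel:.4f} g/L")
```

    For these two structures the quadrature rules reproduce exactly what partial-derivative propagation would give, which is why the shortcut needs only a calculator when the stated assumptions hold.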

  13. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results

  14. Resolving structural uncertainty in natural resources management using POMDP approaches

    Science.gov (United States)

    Williams, B.K.

    2011-01-01

    In recent years there has been a growing focus on the uncertainties of natural resources management, and the importance of accounting for uncertainty in assessing management effectiveness. This paper focuses on uncertainty in resource management in terms of discrete-state Markov decision processes (MDP) under structural uncertainty and partial observability. It describes the treatment of structural uncertainty with approaches developed for partially observable resource systems. In particular, I show how value iteration for partially observable MDPs (POMDP) can be extended to structurally uncertain MDPs. A key difference between these process classes is that structurally uncertain MDPs require the tracking of system state as well as a probability structure for the structural uncertainty, whereas POMDPs require only a probability structure for the observation uncertainty. The added complexity of the optimization problem under structural uncertainty is compensated by reduced dimensionality in the search for the optimal strategy. A solution algorithm for structurally uncertain processes is outlined for a simple example in conservation biology. By building on the conceptual framework developed for POMDPs, natural resource analysts and decision makers who confront structural uncertainties in natural resources can take advantage of the rapid growth in POMDP methods and approaches, and thereby produce better conservation strategies over a larger class of resource problems. © 2011.
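
    The extra bookkeeping described above, a probability distribution over competing model structures that is updated as the system is observed, can be sketched independently of the full value iteration. A hypothetical two-model example with a Gaussian observation likelihood:

```python
import numpy as np

# Two candidate population models (structural hypotheses) predicting
# next year's abundance from this year's; all values are hypothetical.
models = {"density_dependent":   lambda n: n * np.exp(0.3 * (1 - n / 500)),
          "density_independent": lambda n: 1.1 * n}
belief = {"density_dependent": 0.5, "density_independent": 0.5}
SIGMA = 25.0  # assumed observation/process noise s.d.

def update_belief(belief, n_prev, n_obs):
    """Bayes update of the model weights after observing n_obs."""
    like = {m: np.exp(-0.5 * ((n_obs - f(n_prev)) / SIGMA) ** 2)
            for m, f in models.items()}
    post = {m: belief[m] * like[m] for m in models}
    z = sum(post.values())
    return {m: v / z for m, v in post.items()}

belief = update_belief(belief, n_prev=400.0, n_obs=420.0)
print(belief)   # weight shifts toward the model that predicted the observation better
```

    In the structurally uncertain MDP this belief vector, together with the observed system state, becomes the argument of the value function, the reduced-dimension analogue of the POMDP belief over hidden states.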

  15. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  16. Key uncertainties in climate change policy: Results from ICAM-2

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analyses of these uncertainties serve to: inform decision makers about the likely outcome of policy initiatives; and help set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.0. This model includes demographics, economic activities, emissions, atmospheric chemistry, climate change, sea level rise and other impact modules and the numerous associated feedbacks. The model has over 700 objects, of which over 1/3 are uncertain. These have been grouped into seven different classes of uncertain items. The impact of uncertainties in each of these items can be considered individually or in combination with the others. In this paper we demonstrate the relative contribution of various sources of uncertainty to different outcomes in the model. The analysis shows that climatic uncertainties are most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. Extreme uncertainties in indirect aerosol forcing and behavioral response to climate change (adaptation) were characterized by using bounding analyses; the results suggest that these extreme uncertainties can dominate the choice of policy outcomes.

  17. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described, as are methods taking into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical methods.

  18. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively setup complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  19. Big data uncertainties.

    Science.gov (United States)

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is being constantly recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
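
    The second pitfall, weak but pervasive dependence, is easy to demonstrate: under equicorrelation the variance of a sample mean is sigma^2/n * (1 + (n-1)*rho), which does not vanish as n grows. A sketch with hypothetical values:

```python
import numpy as np

rng = np.random.default_rng(5)
rho, sigma, n, reps = 0.01, 1.0, 10_000, 2000

means = []
for _ in range(reps):
    common = rng.normal()                      # shared component -> equicorrelation rho
    x = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.normal(size=n)
    means.append(x.mean())

print(f"sd of mean, correlated sample: {np.std(means):.4f}")
print(f"sd of mean, iid theory:        {sigma / np.sqrt(n):.4f}")
# analytic: sqrt(sigma^2/n * (1 + (n-1)*rho)) ~ sqrt(rho) for large n
print(f"analytic, correlated:          {np.sqrt(sigma**2 / n * (1 + (n - 1) * rho)):.4f}")
```

    Even a correlation of 0.01 leaves the standard error an order of magnitude above the i.i.d. value here, which is exactly the inflated randomness the abstract warns about.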

  20. Uncertainty enabled Sensor Observation Services

    Science.gov (United States)

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad-hoc manner, however if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation and usage within existing standards and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  1. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.
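
    The dominant-subspace idea can be sketched with a truncated SVD of a response-sensitivity matrix: the leading right singular vectors span the directions in nuclear-data space that drive most of the response variation. This is a generic illustration under that assumption, not the authors' IS/UQ algorithm.

```python
import numpy as np

def dominant_uncertainty_subspace(S, tol=0.99):
    """S: responses x parameters sensitivity matrix. Returns an orthonormal
    basis whose span captures a fraction `tol` of the response variation,
    drastically reducing the degrees of freedom of the search."""
    U, s, Vt = np.linalg.svd(S, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, tol)) + 1
    return Vt[:r].T  # columns span the dominant parameter subspace

rng = np.random.default_rng(0)
S = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 1000))  # rank-2 toy
print(dominant_uncertainty_subspace(S).shape)  # (1000, 2)
```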

  2. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed.

  3. Mama Software Features: Uncertainty Testing

    Energy Technology Data Exchange (ETDEWEB)

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  4. Designing for Uncertainty: Three Approaches

    Science.gov (United States)

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  5. Costs of travel time uncertainty and benefits of travel time information: Conceptual model and numerical examples

    NARCIS (Netherlands)

    Ettema, D.F.; Timmermans, H.J.P.

    2006-01-01

    A negative effect of congestion that tends to be overlooked is travel time uncertainty. Travel time uncertainty causes scheduling costs due to early or late arrival. The negative effects of travel time uncertainty can be reduced by providing travellers with travel time information, which improves

  6. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations, so it is very important to determine the uncertainty distributions appropriately before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. A first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered the user's effect in CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, lie inside the limits of the calculated uncertainty bounds. A confirmation step evaluates the quality of the information for the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experimental tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed
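
    CIRCÉ itself is an expectation-maximization method; as a greatly simplified stand-in for the idea of quantifying a physical-model uncertainty from responses and derivatives, the sketch below fits the mean and standard deviation of a single model multiplier by weighted least squares under a linear-Gaussian assumption. All names, numbers, and the model form are assumptions for illustration.

```python
import numpy as np

def linear_gaussian_update(residuals, derivs, sigma_exp):
    """Estimate mean/std of a model-parameter multiplier alpha, assuming
    R_exp - R_calc = alpha * dR/dalpha + measurement noise."""
    r = np.asarray(residuals, float)   # experiment minus code prediction
    d = np.asarray(derivs, float)      # response derivatives dR/dalpha
    w = 1.0 / np.asarray(sigma_exp, float) ** 2
    var = 1.0 / np.sum(w * d * d)      # weighted least-squares variance
    mean = var * np.sum(w * d * r)
    return mean, np.sqrt(var)

# Toy numbers: five responses, their derivatives, and experimental errors.
print(linear_gaussian_update([12.0, 8.0, 15.0, 9.0, 11.0],
                             [30.0, 25.0, 40.0, 28.0, 33.0],
                             [5.0, 4.0, 6.0, 4.0, 5.0]))
```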

  7. Great Lakes Environmental Research Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — NOAA-GLERL and its partners conduct innovative research on the dynamic environments and ecosystems of the Great Lakes and coastal regions to provide information for...

  8. What Caused the Great Recession?

    OpenAIRE

    Homburg, Stefan

    2014-01-01

    This paper examines five possible explanations for the Great Recession of 2008 and 2009, using data for the United States and the eurozone. Of these five hypotheses, four are not supported by the data, while the fifth appears reasonable.

  9. Arthroscopy of the great toe

    NARCIS (Netherlands)

    Frey, C.; van Dijk, C. N.

    1999-01-01

    The few available reports of arthroscopic treatment of the first MTP joint in the literature indicate a favorable outcome. However, arthroscopy of the great toe is an advanced technique and should only be undertaken by experienced surgeons.

  10. Reducing the extinction risk of stochastic populations via nondemographic noise

    Science.gov (United States)

    Be'er, Shay; Assaf, Michael

    2018-02-01

    We consider nondemographic noise in the form of uncertainty in the reaction step size and reveal a dramatic effect this noise may have on the stability of self-regulating populations. Employing the reaction scheme mA → kA, but allowing, e.g., the product number k to be a priori unknown and sampled from a given distribution, we show that such nondemographic noise can greatly reduce the population's extinction risk compared to the fixed-k case. Our analysis is tested against numerical simulations, and by using empirical data of different species, we argue that certain distributions may be more evolutionarily beneficial than others.
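
    A toy Monte Carlo in the spirit of the scheme above: a self-regulating birth-death population where each birth adds k offspring, with k either fixed at 1 or drawn from a distribution with the same mean. Rates, distributions, and run lengths are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def extinct_by(t_max, n0=30, birth=1.0, death=0.05, k_draw=None):
    """Gillespie-style walk; the death rate ~ n^2 gives self-regulation."""
    n, t = n0, 0.0
    while t < t_max:
        b, d = birth * n, death * n * n
        t += rng.exponential(1.0 / (b + d))
        if rng.random() < b / (b + d):
            n += k_draw(rng) if k_draw else 1  # reaction step size
        else:
            n -= 1
        if n <= 0:
            return True
    return False

trials = 500
p_fixed = sum(extinct_by(50.0) for _ in range(trials)) / trials
# Random step size with the same mean: k = 0 or 2, each with probability 1/2.
p_noisy = sum(extinct_by(50.0, k_draw=lambda r: 2 * int(r.random() < 0.5))
              for _ in range(trials)) / trials
print(p_fixed, p_noisy)
```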

  11. The Sixth Great Mass Extinction

    Science.gov (United States)

    Wagler, Ron

    2012-01-01

    Five past great mass extinctions have occurred during Earth's history. Humanity is currently in the midst of a sixth, human-induced great mass extinction of plant and animal life (e.g., Alroy 2008; Jackson 2008; Lewis 2006; McDaniel and Borton 2002; Rockstrom et al. 2009; Rohr et al. 2008; Steffen, Crutzen, and McNeill 2007; Thomas et al. 2004;…

  12. Alpine grassland soil organic carbon stock and its uncertainty in the three rivers source region of the Tibetan Plateau.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Chang

    Alpine grassland of the Tibetan Plateau is an important component of global soil organic carbon (SOC) stocks, but insufficient field observations and large spatial heterogeneity lead to great uncertainty in their estimation. In the Three Rivers Source Region (TRSR), alpine grasslands account for more than 75% of the total area. However, the regional carbon (C) stock estimate and its uncertainty have seldom been tested. Here we quantified the regional SOC stock and its uncertainty using 298 soil profiles surveyed from 35 sites across the TRSR during 2006-2008. We showed that the upper soil (0-30 cm depth) in alpine grasslands of the TRSR stores 2.03 Pg C, with a 95% confidence interval ranging from 1.25 to 2.81 Pg C. Alpine meadow soils comprised 73% (i.e. 1.48 Pg C) of the regional SOC estimate, but had the greatest uncertainty, at 51%. The statistical power to detect a deviation of 10% uncertainty in the grassland C stock was less than 0.50. The required sample size to detect this deviation at a power of 90% was about 6-7 times the number of sample sites surveyed. Comparison of our observed SOC density with the corresponding values from the dataset of Yang et al. indicates that the two datasets are comparable. The combined dataset did not reduce the uncertainty in the estimate of the regional grassland soil C stock. This result can be explained mainly by the underrepresentation of sampling sites in large areas with poor accessibility. Further research to improve the regional SOC stock estimate should optimize the sampling strategy by considering the number of samples and their spatial distribution.
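
    The power and sample-size statement can be reproduced with the standard one-sample relation; the sketch below uses only standard normal quantiles, and the coefficient of variation is illustrative rather than a value from the study.

```python
from statistics import NormalDist

def required_n(cv, rel_dev=0.10, alpha=0.05, power=0.90):
    """Samples needed to detect a relative deviation `rel_dev` of the mean
    when observations have coefficient of variation `cv` (two-sided test)."""
    z = NormalDist().inv_cdf(1 - alpha / 2) + NormalDist().inv_cdf(power)
    return (z * cv / rel_dev) ** 2

# Illustrative: a CV of 1.0 (high spatial heterogeneity) needs ~1051 profiles.
print(round(required_n(cv=1.0)))
```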

  13. Uncertainty analysis of accident notification time and emergency medical service response time in work zone traffic accidents.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian

    2013-01-01

    Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and emergency medical service (EMS) response time were modeled as 2 random variables following the lognormal distribution. Their mean values and standard deviations were respectively formulated as functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather, and work zone type. Work zone traffic accident data from the Fatality Analysis Reporting System between 2002 and 2009 were utilized to determine the distributions of the ANT and the EMS arrival time in the United States. A mixed logistic regression model, taking into account the uncertainty associated with the ANT and the EMS response time, was developed to estimate the risk of death. The results showed that the uncertainty of the ANT was primarily influenced by crash time and road type, whereas the uncertainty of the EMS response time was greatly affected by road type, weather, and light conditions. In addition, work zone accidents occurring during a holiday and in poor light conditions were found to be statistically associated with a longer mean ANT and a longer mean EMS response time. The results also show that shortening the ANT was a more effective approach to reducing the risk of death than shortening the EMS response time in work zones. To shorten the ANT and the EMS response time, work zone activities are suggested to be undertaken during non-holidays, during the daytime, and in good weather and light conditions.
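
    The paper's treatment of the ANT as a lognormal variable whose parameters depend on environmental covariates can be sketched as follows; covariate names and coefficient values are hypothetical.

```python
import numpy as np

def sample_ant(x, beta_mu, beta_sigma, rng):
    """Draw an accident notification time from a lognormal whose log-mean
    and log-sd are functions of covariates x (road type, light, ...)."""
    mu = x @ beta_mu                  # mean of log(ANT)
    sigma = np.exp(x @ beta_sigma)    # sd of log(ANT), kept positive
    return rng.lognormal(mu, sigma)

rng = np.random.default_rng(0)
x = np.array([1.0, 0.0, 1.0])  # intercept + two indicator covariates
print(sample_ant(x, np.array([2.0, 0.3, 0.5]),
                 np.array([-1.0, 0.2, 0.1]), rng))
```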

  14. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and develop strategies for water quality improvement. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT
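
    The effect of parameter uncertainty on BMP performance can be sketched with a simple Monte Carlo; the response function below is a hypothetical stand-in, not SWAT.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 5000

# Hypothetical watershed response: annual sediment load (t/yr) driven by an
# uncertain model parameter and an uncertain BMP trapping efficiency.
curve_number = rng.normal(75.0, 4.0, n)   # uncertain model parameter
trap_eff = rng.beta(8, 3, n)              # uncertain BMP efficiency
baseline = 50.0 * np.exp(0.03 * (curve_number - 75.0))
with_bmp = baseline * (1.0 - trap_eff)

# Parameter uncertainty turns one deterministic answer into a band:
print(np.percentile(with_bmp, [5, 50, 95]))
```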

  15. Uncertainty assessing of measure result of tungsten in U3O8 by ICP-AES

    International Nuclear Information System (INIS)

    Du Guirong; Nie Jie; Tang Lilei

    2011-01-01

    Following the determination method and the assessment criterion, the uncertainty of the measurement result for tungsten in U3O8 by ICP-AES is assessed. With each component assessed in detail, the results show that u_rel(sc) > u_rel(c) > u_rel(F) > u_rel(m) by uncertainty contribution. The other uncertainty components are random and are calculated from repeated measurements. u_rel(sc) is the main contributor to the uncertainty, so the combined uncertainty is reduced by strict operation that reduces u_rel(sc). (authors)
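
    The statement that the combined uncertainty is dominated by u_rel(sc) follows from the usual quadrature (root-sum-of-squares) combination of independent relative components; the values below are illustrative only.

```python
import math

def combined_rel_uncertainty(components):
    """Combine independent relative uncertainty components in quadrature."""
    return math.sqrt(sum(u * u for u in components))

# Illustrative values: the largest component dominates the combined result,
# so reducing u_rel(sc) is the most effective way to reduce the total.
u = {"sc": 0.020, "c": 0.012, "F": 0.008, "m": 0.003}
print(combined_rel_uncertainty(u.values()))
```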

  16. Scientific uncertainties and climate risks

    International Nuclear Information System (INIS)

    Petit, M.

    2005-01-01

    Human activities have induced a significant change in the Earth's atmospheric composition and, most likely, this trend will increase throughout the coming decades. During the last decades, the mean temperature has actually increased by the expected amount. Moreover, the geographical distribution of the warming, and day-to-night temperature variation have evolved as predicted. The magnitude of those changes is relatively small for the time being, but is expected to increase alarmingly during the coming decades. Greenhouse warming is a representative example of the problems of sustainable development: long-term risks can be estimated on a rational basis from scientific laws alone, but the non-specialist is generally not prepared to understand the steps required. However, even the non-specialist has obviously the right to decide about his way of life and the inheritance that he would like to leave for his children, but it is preferable that he is fully informed before making his decisions. Dialog, mutual understanding and confidence must prevail between Science and Society to avoid irrational actions. Controversy among experts is quite frequent. In the case of greenhouse warming, a commendable collective expertise has drastically reduced possible confusion. The Intergovernmental Panel on Climate Change was created jointly by the World Meteorology Organization (WMO) and the UN Program for the Environment (UNEP). Its reports evaluate the state of knowledge on past and future global climate changes, their impact, and the possibility of controlling anthropogenic emissions. The main targeted readers are, nevertheless, non-specialists, who should be made aware of results deduced from approaches that they may not be able to follow step by step. Moreover, these results, in particular, future projections, are, and will remain, subject to some uncertainty, which a fair description of the state of knowledge must include. Many misunderstandings between writers and readers can

  17. Reducing uncertainties about the effects of chemoradiotherapy for cervical cancer:

    DEFF Research Database (Denmark)

    Vale, Claire; Jakobsen, Anders

    2008-01-01

    BACKGROUND: After a 1999 National Cancer Institute (NCI) clinical alert was issued, chemoradiotherapy has become widely used in treating women with cervical cancer. Two subsequent systematic reviews found that interpretation of the benefits was complicated, and some important clinical questions...

  18. Reducing Uncertainties in Hydrocarbon Prediction through Application of Elastic Domain

    Science.gov (United States)

    Shamsuddin, S. Z.; Hermana, M.; Ghosh, D. P.; Salim, A. M. A.

    2017-10-01

    The application of lithology and fluid indicators has helped geophysicists discriminate reservoirs from non-reservoirs in a field. This analysis was conducted to select the lithology and fluid indicators most suitable for the Malaysian basins, which could better eliminate amplitude pitfalls. This paper uses different rock physics analyses such as elastic impedance, Lambda-Mu-Rho (LMR), and the SQp-SQs attribute. A litho-elastic impedance log is generated by correlating the gamma ray log with the extended elastic impedance (EEI) log. The same approach is used for fluid-elastic impedance by correlating the EEI log with water saturation or resistivity. The work was done on well logging data collected from several fields in the Malay basin and a neighbouring basin. There is excellent separation between hydrocarbon sand and background shale for Well-1 in the different cross-plot analyses, while Well-2 shows good separation in the LMR plot. Applied to Well-3, the same method shows fair separation of silty sand and gas sand using the SQp-SQs attribute, which can be correlated with the well log. Based on the point-distribution histogram plots, different lithologies and fluids can be separated clearly. Simultaneous seismic inversion yields acoustic impedance, Vp/Vs, SQp, and SQs volumes. Many attributes are used in the industry to separate lithology and fluid; however, some of the methods are not suitable for application to the basins in Malaysia.
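
    The Lambda-Mu-Rho attributes mentioned above follow from the standard impedance relations (LambdaRho = Zp^2 - 2 Zs^2, MuRho = Zs^2, with Z = rho * v); the velocities and densities below are toy values, not data from the wells discussed.

```python
import numpy as np

def lambda_mu_rho(vp, vs, rho):
    """LMR attributes from P/S velocities and density."""
    zp, zs = rho * vp, rho * vs          # P and S impedances
    return zp**2 - 2 * zs**2, zs**2      # LambdaRho, MuRho

# Toy brine-sand vs gas-sand values (m/s, kg/m^3); gas sand tends to plot
# at anomalously low LambdaRho in the cross-plot.
lr, mr = lambda_mu_rho(np.array([3300.0, 2900.0]),
                       np.array([1700.0, 1800.0]),
                       np.array([2350.0, 2100.0]))
print(lr, mr)
```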

  19. Reducing Uncertainty in Fatigue Life Limits of Turbine Engine Alloys

    Science.gov (United States)

    2014-03-01

    10^-8 component failures per engine flight hour. This metric underscores the essential role of safety in a design process that simultaneously strives to achieve performance, efficiency, and reliability... resonance at 20 kHz. At the highest stresses, surface-connected α particles typically served as the primary sites for crack initiation (e.g., Fig. 4)

  20. Reducing uncertainty in high-resolution sea ice models.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Kara J.; Bochev, Pavel Blagoveston

    2013-07-01

    Arctic sea ice is an important component of the global climate system, reflecting a significant amount of solar radiation, insulating the ocean from the atmosphere and influencing ocean circulation by modifying the salinity of the upper ocean. The thickness and extent of Arctic sea ice have shown a significant decline in recent decades with implications for global climate as well as regional geopolitics. Increasing interest in exploration as well as climate feedback effects make predictive mathematical modeling of sea ice a task of tremendous practical import. Satellite data obtained over the last few decades have provided a wealth of information on sea ice motion and deformation. The data clearly show that ice deformation is focused along narrow linear features and this type of deformation is not well-represented in existing models. To improve sea ice dynamics we have incorporated an anisotropic rheology into the Los Alamos National Laboratory global sea ice model, CICE. Sensitivity analyses were performed using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) to determine the impact of material parameters on sea ice response functions. Two material strength parameters that exhibited the most significant impact on responses were further analyzed to evaluate their influence on quantitative comparisons between model output and data. The sensitivity analysis along with ten year model runs indicate that while the anisotropic rheology provides some benefit in velocity predictions, additional improvements are required to make this material model a viable alternative for global sea ice simulations.
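
    The kind of parameter study driven by DAKOTA can be sketched with a small Latin hypercube sampler over the two material-strength parameters; the bounds are hypothetical, and this is not CICE or DAKOTA code.

```python
import numpy as np

def latin_hypercube(n, bounds, rng):
    """n stratified samples over (low, high) bounds for each parameter."""
    d = len(bounds)
    strata = np.tile(np.arange(n), (d, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n, d))) / n
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(0)
# Hypothetical bounds for two ice material-strength parameters:
samples = latin_hypercube(20, [(10e3, 40e3), (0.5, 2.0)], rng)
print(samples.shape)  # (20, 2); feed each row to a model run
```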

  1. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide "substantially complete containment" (SCC) during the containment period. The term "SCC" is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the "containment" requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  2. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Experienced practical events and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has high relative importance for nuclear power plant safety. Fire PSA is a very challenging discipline, and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct uncertainty analysis. At the top level, the uncertainty of a fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second uncertainty segment is connected to the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to core damage or another analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, usage of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation for the final result are presented. (author)
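
    A minimal Monte Carlo sketch of the three-segment structure just described; the distributions and parameters are purely illustrative, not values from any fire PSA.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Segment 1: fire initiating event frequency (per year)
freq = rng.lognormal(np.log(1e-2), 0.7, n)
# Segment 2: probability that the fire damages the target equipment
p_damage = rng.beta(2, 18, n)
# Segment 3: conditional core damage probability given that damage
p_ccdp = rng.lognormal(np.log(1e-3), 1.0, n)

cdf_samples = freq * p_damage * p_ccdp  # core damage frequency per year
print(f"mean={cdf_samples.mean():.2e}, "
      f"95th percentile={np.quantile(cdf_samples, 0.95):.2e}")
```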

  3. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing fancier models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. More serious problems stem from the limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies

  4. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  5. Model uncertainty in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.).

  6. Transposition of the great arteries

    Directory of Open Access Journals (Sweden)

    Castela Eduardo

    2008-10-01

    Transposition of the great arteries (TGA), also referred to as complete transposition, is a congenital cardiac malformation characterised by atrioventricular concordance and ventriculoarterial (VA) discordance. The incidence is estimated at 1 in 3,500–5,000 live births, with a male-to-female ratio of 1.5 to 3.2:1. In 50% of cases, the VA discordance is an isolated finding. In 10% of cases, TGA is associated with noncardiac malformations. The association with other cardiac malformations such as ventricular septal defect (VSD) and left ventricular outflow tract obstruction is frequent and dictates timing and clinical presentation, which consists of cyanosis with or without congestive heart failure. The onset and severity depend on anatomical and functional variants that influence the degree of mixing between the two circulations. If no obstructive lesions are present and there is a large VSD, cyanosis may go undetected and only be perceived during episodes of crying or agitation. In these cases, signs of congestive heart failure prevail. The exact aetiology remains unknown. Some associated risk factors (gestational diabetes mellitus, maternal exposure to rodenticides and herbicides, maternal use of antiepileptic drugs) have been postulated. Mutations in the growth differentiation factor-1 gene, the thyroid hormone receptor-associated protein-2 gene and the gene encoding the cryptic protein have been shown to be implicated in discordant VA connections, but they explain only a small minority of TGA cases. The diagnosis is confirmed by echocardiography, which also provides the morphological details required for future surgical management. Prenatal diagnosis by foetal echocardiography is possible and desirable, as it may improve early neonatal management and reduce morbidity and mortality. Differential diagnosis includes other causes of central neonatal cyanosis. Palliative treatment with prostaglandin E1 and balloon atrial septostomy are usually

  7. Risk Management and Uncertainty in Infrastructure Projects

    DEFF Research Database (Denmark)

    Harty, Chris; Neerup Themsen, Tim; Tryggestad, Kjell

    2014-01-01

    The assumption that large complex projects should be managed in order to reduce uncertainty and increase predictability is not new. What is relatively new, however, is that uncertainty reduction can and should be obtained through formal risk management approaches. We question both assumptions... by addressing a more fundamental question about the role of knowledge in current risk management practices. Inquiries into the predominant approaches to risk management in large infrastructure and construction projects reveal their assumptions about knowledge, and we discuss the ramifications these have... for project and construction management. Our argument and claim is that predominant risk management approaches tend to reinforce conventional ideas of project control whilst undermining other notions of the value and relevance of built assets and the project management process. These approaches fail to consider...

  8. Planck 2013 results. III. LFI systematic uncertainties

    CERN Document Server

    Aghanim, N; Arnaud, M; Ashdown, M; Atrio-Barandela, F; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bobin, J; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Bridges, M; Bucher, M; Burigana, C; Butler, R C; Cardoso, J -F; Catalano, A; Chamballu, A; Chiang, L -Y; Christensen, P R; Church, S; Colombi, S; Colombo, L P L; Crill, B P; Cruz, M; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Dick, J; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Efstathiou, G; Enßlin, T A; Eriksen, H K; Finelli, F; Forni, O; Frailis, M; Franceschi, E; Gaier, T C; Galeotta, S; Ganga, K; Giard, M; Giraud-Héraud, Y; Gjerløw, E; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Hansen, F K; Hanson, D; Harrison, D; Henrot-Versillé, S; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hivon, E; Hobson, M; Holmes, W A; Hornstrup, A; Hovest, W; Huffenberger, K M; Jaffe, T R; Jaffe, A H; Jewell, J; Jones, W C; Juvela, M; Kangaslahti, P; Keihänen, E; Keskitalo, R; Kiiveri, K; Kisner, T S; Knoche, J; Knox, L; Kunz, M; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J -M; Lasenby, A; Laureijs, R J; Lawrence, C R; Leahy, J P; Leonardi, R; Lesgourgues, J; Liguori, M; Lilje, P B; Lindholm, V; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maino, D; Mandolesi, N; Maris, M; Marshall, D J; Martin, P G; Martínez-González, E; Masi, S; Matarrese, S; Matthai, F; Mazzotta, P; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Naselsky, P; Natoli, P; Netterfield, C B; Nørgaard-Nielsen, H U; Novikov, D; Novikov, I; O'Dwyer, I J; Osborne, S; Paci, F; Pagano, L; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Peel, M; Perdereau, O; Perotto, L; Perrotta, F; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Platania, P; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Poutanen, T; Pratt, G W; Prézeau, G; Prunet, S; Puget, J -L; Rachen, J P; Rebolo, R; Reinecke, M; Remazeilles, M; Ricciardi, S; Riller, T; Rocha, G; Rosset, C; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Scott, D; Seiffert, M D; Shellard, E P S; Spencer, L D; Starck, J -L; Stolyarov, V; Stompor, R; Sureau, F; Sutton, D; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Tavagnacco, D; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Tuovinen, J; Türler, M; Umana, G; Valenziano, L; Valiviita, J; Van Tent, B; Varis, J; Vielva, P; Villa, F; Vittorio, N; Wade, L A; Wandelt, B D; Watson, R; Wilkinson, A; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    We present the current estimate of instrumental and systematic effect uncertainties for the Planck Low Frequency Instrument relevant to the first release of the Planck cosmological results. We give an overview of the main effects and of the tools and methods applied to assess residuals in maps and power spectra. We also present an overall budget of known systematic effect uncertainties, which are dominated by sidelobe straylight pick-up and imperfect calibration. However, even these two effects are at least two orders of magnitude weaker than the cosmic microwave background (CMB) fluctuations as measured in terms of the angular temperature power spectrum. A residual signal above the noise level is present in the multipole range $\ell<20$, most notably at 30 GHz, and is likely caused by residual Galactic straylight contamination. Current analysis aims to further reduce the level of spurious signals in the data and to improve the systematic effects modelling, in particular with respect to straylight and calibra...

  9. Probability and uncertainty in nuclear safety decisions

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1986-01-01

    In this paper, we examine some problems posed by the use of probabilities in nuclear safety decisions. We discuss some of the theoretical difficulties due to the collective nature of regulatory decisions, and, in particular, the calibration and the aggregation of risk information (e.g., experts' opinions). We argue that, if one chooses numerical safety goals as a regulatory basis, one can reduce the constraints to an individual safety goal and a cost-benefit criterion. We show the relevance of risk uncertainties in this kind of regulatory framework. We conclude that, whereas expected values of future failure frequencies are adequate to show compliance with economic constraints, the use of a fractile (e.g., 95%) to be specified by the regulatory agency is justified to treat hazard uncertainties for the individual safety goal. (orig.)
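
    The two-part criterion proposed here, an expected value against the cost-benefit constraint and a regulatory fractile against the individual safety goal, can be sketched directly; the limits and distribution below are hypothetical.

```python
import numpy as np

def complies(freq_samples, mean_limit, fractile_limit, q=0.95):
    """Expected value vs. cost-benefit limit; q-fractile vs. safety goal."""
    s = np.asarray(freq_samples)
    return bool(s.mean() <= mean_limit and np.quantile(s, q) <= fractile_limit)

rng = np.random.default_rng(3)
samples = rng.lognormal(np.log(1e-5), 1.2, 50_000)  # failure frequency (1/yr)
print(complies(samples, mean_limit=5e-5, fractile_limit=1e-4))
```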

  10. Siting uncertainties and challenges in Appalachia

    International Nuclear Information System (INIS)

    Vincenti, J.R.

    1992-01-01

    The purpose of this paper is to discuss the uncertainties and challenges facing users of radioactive isotopes and the generators of low-level radioactive waste (LLRW) in the United States. This paper focuses specifically on those users/generators in Delaware, Maryland, Pennsylvania, and West Virginia, which make up the Appalachian States Compact. These uncertainties are rooted in legal and political actions that have thwarted siting and licensing of LLRW facilities throughout the United States. The challenges facing users of radioactive isotopes are numerous. They stem from the need to reduce or minimize waste volume and to treat or eliminate the generation of waste, especially mixed waste. The basic problem, after all attention to waste management, is that some users are still left with waste that must be disposed of in a regional or national site for long-term storage and monitoring. This problem will not go away

  11. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: model establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  12. Evaluate prevailing climate change on Great Lakes water levels

    International Nuclear Information System (INIS)

    Islam, M.

    2009-01-01

    Results of a comprehensive water mass balance modeling exercise for the Great Lakes, against prevailing and different anticipated climate change scenarios, are presented. Modeling is done to evaluate changes in lake storage and, in turn, lake water levels, considering present conditions and the uncertainty and variability of climate and hydrologic conditions in the future. Inflows, outflows and consequent changes in the storages of the five Great Lakes are simulated for the last 30 years and then projected to evaluate the changes in lake storage for the next 50 years. From the predicted changes in lake storage, the water level is calculated using a mass-to-linear conversion equation. Modeling and analysis results are expected to be helpful in understanding the possible impacts of climate change on the Great Lakes water environment and in preparing a strategic plan for the sustainable management of the lakes' water resources. The recent past shows a depleting trend in lake water levels, and hence a potential threat to the lakes' water environment and uncertainty in the availability of water of adequate quality and quantity for future generations, especially under prevailing and anticipated climate change. For this reason, understanding and quantifying the potential impacts of climate change on Great Lakes water levels and storages is an urgent issue. (author)
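
    The underlying water balance can be sketched in a few lines: the storage change from inflows and outflows, divided by lake surface area, gives the level change. The flux values below are illustrative magnitudes only, not measured Great Lakes data.

```python
def level_change_m(precip, runoff, inflow, evap, outflow, area_km2):
    """Annual balance for one lake; all fluxes in km^3/yr."""
    d_storage = precip + runoff + inflow - evap - outflow  # km^3
    return d_storage / area_km2 * 1000.0  # km of level change -> m

# Illustrative magnitudes for a Lake Superior-sized basin:
print(level_change_m(precip=68, runoff=160, inflow=0,
                     evap=42, outflow=190, area_km2=82_100))  # about -0.05 m
```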

  13. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  14. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
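
    A toy analogue of propagating uncertainty "at the lowest levels": every ensemble member is carried through each operation along a contiguous sample axis, instead of re-running the whole simulation once per sample. This NumPy sketch illustrates the data-layout idea only; it is not the stochastic Galerkin implementation described in the report.

```python
import numpy as np

n_samples = 512
k = np.random.default_rng(0).lognormal(0.0, 0.1, n_samples)  # uncertain input

def step(u, k, dt=0.01):
    """One explicit Euler step of du/dt = -k*u for all samples at once."""
    return u - dt * k * u

u = np.ones(n_samples)       # the sample axis travels through every kernel
for _ in range(100):
    u = step(u, k)
print(u.mean(), u.std())     # mean and spread of the propagated ensemble
```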

  15. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

    of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models... than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi...
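
    The crop-model-versus-GCM comparison can be sketched as a simple variance decomposition over a model-by-GCM matrix of simulated impacts; the numbers below are invented for illustration.

```python
import numpy as np

# Rows: crop models; columns: downscaled GCMs; entries: yield change (%).
impacts = np.array([[ -5.0,  -7.0,  -4.0],
                    [-12.0, -14.0, -11.0],
                    [ -2.0,  -4.0,  -1.0]])

among_crop_models = impacts.mean(axis=1).var()  # spread of crop-model means
among_gcms = impacts.mean(axis=0).var()         # spread of GCM means
print(among_crop_models, among_gcms)  # crop-model variation dominates here
```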

  16. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    the TE model predictions. This analysis highlights the primary measurements that merit further development to reduce the uncertainty associated with their use in TE models. While we develop and apply this mathematical framework to a specific biorefinery scenario here, this analysis can be readily adapted to other types of biorefining processes and provides a general framework for propagating uncertainty due to analytical measurements through a TE model.

  17. Network optimization including gas lift and network parameters under subsurface uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Schulze-Riegert, R.; Baffoe, J.; Pajonk, O. [SPT Group GmbH, Hamburg (Germany); Badalov, H.; Huseynov, S. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Trick, M. [SPT Group, Calgary, AB (Canada)

    2013-08-01

    Optimization of oil and gas field production systems poses a great challenge to field development due to complex and multiple interactions between various operational design parameters and subsurface uncertainties. Conventional analytical methods are capable of finding local optima based on single deterministic models. They are less applicable for efficiently generating alternative design scenarios in a multi-objective context. Practical implementations of robust optimization workflows integrate the evaluation of alternative design scenarios and multiple realizations of subsurface uncertainty descriptions. Production or economic performance indicators such as NPV (Net Present Value) are linked to a risk-weighted objective function definition to guide the optimization processes. This work focuses on an integrated workflow using a reservoir-network simulator coupled to an optimization framework. The work will investigate the impact of design parameters while considering the physics of the reservoir, wells, and surface facilities. Subsurface uncertainties are described by well parameters such as inflow performance. Experimental design methods are used to investigate parameter sensitivities and interactions. Optimization methods are used to find optimal design parameter combinations which improve key performance indicators of the production network system. The proposed workflow will be applied to a representative oil reservoir coupled to a network which is modelled by an integrated reservoir-network simulator. Gas-lift will be included as an explicit measure to improve production. An objective function will be formulated for the net present value of the integrated system including production revenue and facility costs. Facility and gas lift design parameters are tuned to maximize NPV. Well inflow performance uncertainties are introduced with an impact on gas lift performance. Resulting variances on NPV are identified as a risk measure for the optimized system design. A
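
    A risk-weighted objective of the kind described, linking NPV to a risk measure over subsurface realizations, might take the mean-minus-penalized-spread form sketched below; the form and all values are assumptions for illustration.

```python
import numpy as np

def risk_weighted_objective(npv_per_realization, risk_aversion=0.5):
    """J = E[NPV] - lambda * std[NPV]: penalizes gas-lift designs whose
    value varies strongly across inflow-performance uncertainty."""
    npv = np.asarray(npv_per_realization, float)
    return npv.mean() - risk_aversion * npv.std()

# NPVs (hypothetical MM$) of one design across five subsurface realizations:
print(risk_weighted_objective([120.0, 95.0, 140.0, 88.0, 110.0]))
```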

  18. Hospital Capital Investment During the Great Recession.

    Science.gov (United States)

    Choi, Sung

    2017-01-01

    Hospital capital investment is important for acquiring and maintaining technology and equipment needed to provide health care. Reduction in capital investment by a hospital has negative implications for patient outcomes. Most hospitals rely on debt and internal cash flow to fund capital investment. The great recession may have made it difficult for hospitals to borrow, thus reducing their capital investment. I investigated the impact of the great recession on capital investment made by California hospitals. Modeling how hospital capital investment may have been liquidity constrained during the recession is a novel contribution to the literature. I estimated the model with California Office of Statewide Health Planning and Development data and system generalized method of moments. Findings suggest that not-for-profit and public hospitals were liquidity constrained during the recession. Comparing the changes in hospital capital investment between 2006 and 2009 showed that hospitals used cash flow to increase capital investment by $2.45 million, other things equal.

  19. Hospital Capital Investment During the Great Recession

    Science.gov (United States)

    Choi, Sung

    2017-01-01

    Hospital capital investment is important for acquiring and maintaining technology and equipment needed to provide health care. Reduction in capital investment by a hospital has negative implications for patient outcomes. Most hospitals rely on debt and internal cash flow to fund capital investment. The great recession may have made it difficult for hospitals to borrow, thus reducing their capital investment. I investigated the impact of the great recession on capital investment made by California hospitals. Modeling how hospital capital investment may have been liquidity constrained during the recession is a novel contribution to the literature. I estimated the model with California Office of Statewide Health Planning and Development data and system generalized method of moments. Findings suggest that not-for-profit and public hospitals were liquidity constrained during the recession. Comparing the changes in hospital capital investment between 2006 and 2009 showed that hospitals used cash flow to increase capital investment by $2.45 million, other things equal. PMID:28617202

  20. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  1. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected... the relational uncertainty increased the functional quality while resolving the partner's organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict... and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.

  2. Advanced LOCA code uncertainty assessment

    International Nuclear Information System (INIS)

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A "dials" version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  3. Famous puzzles of great mathematicians

    CERN Document Server

    Petković, Miodrag S

    2009-01-01

    This entertaining book presents a collection of 180 famous mathematical puzzles and intriguing elementary problems that great mathematicians have posed, discussed, and/or solved. The selected problems do not require advanced mathematics, making this book accessible to a variety of readers. Mathematical recreations offer a rich playground for both amateur and professional mathematicians. Believing that creative stimuli and aesthetic considerations are closely related, great mathematicians from ancient times to the present have always taken an interest in puzzles and diversions. The goal of this

  4. The role of uncertainty in climate change adaptation strategies — A Danish water management example

    DEFF Research Database (Denmark)

    Refsgaard, J.C.; Arnbjerg-Nielsen, Karsten; Drews, Martin

    2013-01-01

    We propose a generic framework to characterize climate change adaptation uncertainty according to three dimensions: level, source and nature. Our framework is different, and in this respect more comprehensive, than the present UN Intergovernmental Panel on Climate Change (IPCC) approach and could... be used to address concerns that the IPCC approach is oversimplified. We have studied the role of uncertainty in climate change adaptation planning using examples from four Danish water-related sectors. The dominating sources of uncertainty differ greatly among issues; most uncertainties on impacts...

  5. Guidance for treatment of variability and uncertainty in ecological risk assessments of contaminated sites

    International Nuclear Information System (INIS)

    1998-06-01

    Uncertainty is a seemingly simple concept that has caused great confusion and conflict in the field of risk assessment. This report offers guidance for the analysis and presentation of variability and uncertainty in ecological risk assessments, an important issue in the remedial investigation and feasibility study processes. This report discusses concepts of probability in terms of variance and uncertainty, describes how these concepts differ in ecological risk assessment from human health risk assessment, and describes probabilistic aspects of specific ecological risk assessment techniques. The report ends with 17 points to consider in performing an uncertainty analysis for an ecological risk assessment of a contaminated site.

  6. How to live with uncertainties?

    International Nuclear Information System (INIS)

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information, as well as the approach to quantifying uncertainty in metrology, is addressed. A brief history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its achievements to date. Then, the potential future of the AK SIGMA is discussed based on its current tasks and on open scientific questions and future topics. (orig.)

  7. Some remarks on modeling uncertainties

    International Nuclear Information System (INIS)

    Ronen, Y.

    1983-01-01

    Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.)

  8. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, geochemistry, etc. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  9. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc.) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  10. Optimal Taxation under Income Uncertainty

    OpenAIRE

    Xianhua Dai

    2011-01-01

    Optimal taxation under income uncertainty has been extensively developed in expected utility theory, but it remains an open problem for utility functions in which income and effort are not separable. As an alternative model of decision-making under uncertainty, prospect theory (Kahneman and Tversky, 1979; Tversky and Kahneman, 1992) has obtained empirical support, for example in Kahneman and Tversky (1979) and Camerer and Lowenstein (2003). This paper begins to explore optimal taxation in the context of prospect...

  11. New Perspectives on Policy Uncertainty

    OpenAIRE

    Hlatshwayo, Sandile

    2017-01-01

    In recent years, the ubiquitous and intensifying nature of economic policy uncertainty has made it a popular explanation for weak economic performance in developed and developing markets alike. The primary channel for this effect is decreased and delayed investment as firms adopt a "wait and see" approach to irreversible investments (Bernanke, 1983; Dixit and Pindyck, 1994). Deep empirical examination of policy uncertainty's impact is rare because of the difficulty associated with measuring i...

  12. Do Wild Great Tits Avoid Exposure to Light at Night?

    Directory of Open Access Journals (Sweden)

    Maaike de Jong

    Full Text Available Studies of wild populations have provided important insights into the effects of artificial light at night on organisms, populations and ecosystems. However, in most studies the exact amount of light at night individuals are exposed to remains unknown. Individuals can potentially control their nighttime light exposure by seeking dark spots within illuminated areas. This uncertainty makes it difficult to attribute effects to a direct effect of light at night, or to indirect effects, e.g., via an effect of light at night on food availability. In this study, we aim to quantify the nocturnal light exposure of wild birds in a previously dark forest-edge habitat, experimentally illuminated with three different colors of street lighting, in comparison to a dark control. During two consecutive breeding seasons, we deployed male great tits (Parus major) with a light logger measuring light intensity every five minutes over a 24 h period. We found that three males from pairs breeding in brightly illuminated nest boxes close to green and red lamp posts were not exposed to more artificial light at night than males from pairs breeding further away. This suggests, based on our limited sample size, that these males could have been avoiding light at night by choosing a roosting place with a reduced light intensity. Therefore, effects of light at night previously reported for this species in our experimental set-up might be indirect. In contrast to urban areas where light is omnipresent, bird species in non-urban areas may evade exposure to nocturnal artificial light, thereby avoiding direct consequences of light at night.

  13. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    Science.gov (United States)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality) which impacts on the achievable constraint. However, using all eight observable quantities together we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction occurs as we reduce a large sample of model variants (over 1 million) that cover the full parametric uncertainty to around 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties including measurement uncertainties, structural model uncertainties and the model discrepancy from reality. Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive
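
    The constraint mechanism described above can be caricatured in a few lines: draw a large sample of model variants, discard those inconsistent with a set of observations, and compare the spread of the unobservable forcing before and after. All functions and numbers below are invented for illustration; the real study used the HadGEM-UKCA perturbed parameter ensemble.

```python
# Toy observational constraint on an ensemble: shared parameters between
# observables and forcing create compensating errors (equifinality).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000                               # large sample of model variants
theta = rng.uniform(-1, 1, size=(n, 3))     # three uncertain parameters

obs_model = np.stack([theta[:, 0] + theta[:, 1],
                      theta[:, 1] - theta[:, 2],
                      theta[:, 0] * theta[:, 2]], axis=1)
forcing = -1.0 + theta[:, 0] - 0.5 * theta[:, 1] + 0.3 * theta[:, 2]

obs_true = np.array([0.2, -0.1, 0.05])      # hypothetical "perfect" observations
keep = np.all(np.abs(obs_model - obs_true) < 0.05, axis=1)  # plausibility test

print(f"kept {keep.mean():.3%} of variants")
print(f"forcing spread: {forcing.std():.3f} -> {forcing[keep].std():.3f}")
```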

  14. Pharmacological Fingerprints of Contextual Uncertainty.

    Directory of Open Access Journals (Sweden)

    Louise Marshall

    2016-11-01

    Full Text Available Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses.

  15. The EURACOS activation experiments: preliminary uncertainty analysis

    International Nuclear Information System (INIS)

    Yeivin, Y.

    1982-01-01

    A sequence of counting rates of an irradiated sulphur pellet, r(t_i), measured at different times after the end of the irradiation, is fitted to r(t) = A·exp(−λt) + B. A standard adjustment procedure is applied to determine the parameters A and B, their standard deviations and correlation, and chi-square. It is demonstrated that if the counting-rate uncertainties are entirely due to counting statistics, the experimental data are totally inconsistent with the 'theoretical' model. However, assuming an additional systematic error of approximately 1%, and eliminating a few 'bad' data points, produces a data set quite consistent with the model. The dependence of chi-square on the assumed systematic error and on the data elimination procedure is discussed in great detail. A review of the adjustment procedure is appended to the report.
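
    A hedged reconstruction of the adjustment described above (illustrative decay constant and synthetic counts, not the report's data): fit r(t) = A·exp(−λt) + B by weighted least squares, first with purely statistical counting errors and then with an additional ~1% systematic error added in quadrature, and compare chi-square per degree of freedom.

```python
# Weighted least-squares fit of an exponential decay plus background.
import numpy as np
from scipy.optimize import curve_fit

lam = 5.6e-7                      # 1/s, illustrative decay constant (P-32-like)
t = np.linspace(0.0, 3.0e6, 25)   # counting times after irradiation, s
rng = np.random.default_rng(3)

def model(t, A, B):
    return A * np.exp(-lam * t) + B

counts = rng.poisson(model(t, 1000.0, 50.0)).astype(float)  # synthetic rates

for sys_err in (0.0, 0.01):       # counting statistics only, then +1% systematic
    sigma = np.sqrt(counts + (sys_err * counts) ** 2)
    popt, pcov = curve_fit(model, t, counts, p0=(900.0, 40.0),
                           sigma=sigma, absolute_sigma=True)
    chi2_dof = np.sum(((counts - model(t, *popt)) / sigma) ** 2) / (len(t) - 2)
    print(f"syst={sys_err:.0%}: A={popt[0]:6.1f}+/-{np.sqrt(pcov[0, 0]):.1f}  "
          f"B={popt[1]:5.1f}+/-{np.sqrt(pcov[1, 1]):.1f}  chi2/dof={chi2_dof:.2f}")
```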

  16. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  17. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    Science.gov (United States)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and
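
    The flexible-design argument lends itself to a compact Monte Carlo sketch. The numbers below (base capacity, module cost, shortage penalty, demand growth) are stand-ins loosely inspired by the Melbourne example, not the study's model.

```python
# Expected present-value cost: build everything up front vs. add modules
# only when uncertain demand growth requires them (a real-option hedge).
import numpy as np

rng = np.random.default_rng(7)
years, n_scen = 30, 5_000
base_capacity = 100.0                         # MCM/y already in the system (assumed)
growth_mu, growth_sd = 0.02, 0.02             # uncertain annual demand growth

big_capacity, big_cost = 150.0, 3.5e9         # one large plant built now
module_capacity, module_cost = 50.0, 1.4e9    # staged modules (per-unit premium assumed)
discount = 0.05
shortage_penalty = 5e7                        # $ per MCM of unmet demand per year

def scenario_cost(growth_path, flexible):
    demand = base_capacity * np.cumprod(1 + growth_path)
    capacity = base_capacity if flexible else base_capacity + big_capacity
    cost = 0.0 if flexible else big_cost
    for y, d in enumerate(demand):
        while flexible and d > capacity:      # exercise the option to expand
            capacity += module_capacity
            cost += module_cost / (1 + discount) ** y
        cost += shortage_penalty * max(0.0, d - capacity) / (1 + discount) ** y
    return cost

paths = rng.normal(growth_mu, growth_sd, size=(n_scen, years))
fixed = np.mean([scenario_cost(p, False) for p in paths])
flex = np.mean([scenario_cost(p, True) for p in paths])
print(f"expected PV cost  up-front: ${fixed/1e9:.2f}B   modular: ${flex/1e9:.2f}B")
```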

  18. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
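
    A minimal sketch of this equivalence, assuming two hypothetical Gaussian data models: the model index is treated as one more discrete parameter whose posterior follows from Bayes' rule.

```python
# Posterior model probabilities: prior times marginal likelihood, normalised.
import numpy as np
from scipy.stats import norm

data = np.array([1.2, 0.8, 1.1, 1.4, 0.9])

models = {"M1": norm(loc=1.0, scale=0.3),   # two alternative data models
          "M2": norm(loc=0.5, scale=0.5)}
prior = {"M1": 0.5, "M2": 0.5}

post = {m: prior[m] * np.prod(dist.pdf(data)) for m, dist in models.items()}
z = sum(post.values())
post = {m: p / z for m, p in post.items()}
print(post)   # M1 should dominate for these data
```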

  19. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  20. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, it is found further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, in distinction to usual thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
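
    The coin-tossing claim can be restated compactly. Assuming a prior density f(p) on the definitive number p (the probability of heads), the moments of f determine everything:

```latex
\[
  P(\text{head}) \;=\; \int_0^1 p\,f(p)\,dp \;=\; E[p],
\]
\[
  P(\text{head}_2 \mid \text{head}_1)
    \;=\; \frac{\int_0^1 p^2 f(p)\,dp}{\int_0^1 p\,f(p)\,dp}
    \;=\; \frac{E[p^2]}{E[p]}
    \;=\; E[p] + \frac{\operatorname{Var}(p)}{E[p]} \;\ge\; E[p].
\]
% Equality holds only when Var(p) = 0, i.e., when one is absolutely sure of
% the coin's bias; otherwise seeing a head must increase P(heads).
```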

  1. Making a Great First Impression

    Science.gov (United States)

    Evenson, Renee

    2007-01-01

    Managers and business owners often base hiring decisions on first impressions. That is why it is so important to teach students to make a great first impression--before they go on that first job interview. Managers do not have unrealistic expectations, they just want to hire people who they believe can develop into valuable employees. A nice…

  2. Great Basin paleoenvironmental studies project

    International Nuclear Information System (INIS)

    1993-01-01

    Project goals, project tasks, progress on tasks, and problems encountered are described and discussed for each of the studies that make up the Great Basin Paleoenvironmental Studies Project for Yucca Mountain. These studies are: Paleobotany, Paleofauna, Geomorphology, and Transportation. Budget summaries are also given for each of the studies and for the overall project

  3. The Great Books and Economics.

    Science.gov (United States)

    Hartley, James E.

    2001-01-01

    Describes an introductory economics course in which all of the reading material is drawn from the Great Books of Western Civilization. Explains the rationale and mechanics of the course. Includes an annotated course syllabus that details how the reading material relates to the lecture material. (RLH)

  4. Great tit hatchling sex ratios

    NARCIS (Netherlands)

    Lessells, C.M.; Mateman, A.C.; Visser, J.

    1996-01-01

    The sex of Great Tit Parus major nestlings was determined using PCR RAPDs. Because this technique requires minute amounts of DNA, chicks could be sampled soon (0-2d) after hatching, before any nestling mortality occurred. The proportion of males among 752 chicks hatching in 102 broods (98.9% of

  5. The Great Gatsby. [Lesson Plan].

    Science.gov (United States)

    Zelasko, Ken

    Based on F. Scott Fitzgerald's novel "The Great Gatsby," this lesson plan presents activities designed to help students understand that adapting part of a novel into a dramatic reading makes students more intimate with the author's intentions and craft; and that a part of a novel may lend itself to various oral interpretations. The main activity…

  6. Great Basin wildlife disease concerns

    Science.gov (United States)

    Russ Mason

    2008-01-01

    In the Great Basin, wildlife diseases have always represented a significant challenge to wildlife managers, agricultural production, and human health and safety. One of the first priorities of the U.S. Department of Agriculture, Division of Fish and Wildlife Services was Congressionally directed action to eradicate vectors for zoonotic disease, particularly rabies, in...

  7. Uncertainties in risk assessment at USDOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  8. Uncertainties in risk assessment at USDOE facilities

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  9. Effect of uncertainty parameters on graphene sheets Young's modulus prediction

    International Nuclear Information System (INIS)

    Sahlaoui, Habib; Sidhom Habib; Guedri, Mohamed

    2013-01-01

    Software based on the molecular structural mechanics approach (MSMA) and using the finite element method (FEM) has been developed to predict the Young's modulus of graphene sheets. The results obtained have been compared to results available in the literature, and good agreement is shown when the same values of the uncertainty parameters are used. The sensitivity of the models to their uncertainty parameters has been investigated using a stochastic finite element method (SFEM). The values of the uncertainty parameters, such as the molecular mechanics force field constants k_r and k_θ, the thickness (t) of a graphene sheet and the length (L_B) of a carbon-carbon bond, have been collected from the literature. Strong sensitivities of 91% to the thickness and of 21% to the stretching force constant (k_r) are shown. The results explain the great differences among predicted Young's modulus values of graphene sheets and their large disagreement with experimental results.

  10. Radiotherapy for breast cancer: respiratory and set-up uncertainties

    International Nuclear Information System (INIS)

    Saliou, M.G.; Giraud, P.; Simon, L.; Fournier-Bidoz, N.; Fourquet, A.; Dendale, R.; Rosenwald, J.C.; Cosset, J.M.

    2005-01-01

    Adjuvant radiotherapy has been shown to significantly reduce locoregional recurrence, but this advantage is associated with increased cardiovascular and pulmonary morbidities. All uncertainties inherent to conformal radiation therapy must be identified in order to increase the precision of treatment; misestimation of these uncertainties increases the potential risk of geometrical misses with, as a consequence, under-dosage of the tumor and/or overdosage of healthy tissues. Geometric uncertainties due to respiratory movements or set-up errors are well known. Two strategies have been proposed to limit their effect: quantification of these uncertainties, which are then taken into account in the final calculation of safety margins, and/or reduction of respiratory and set-up uncertainties by efficient immobilization or gating systems. Measured on portal films of the two tangential fields, the CLD (central lung distance), defined as the distance between the deep field edge and the interior chest wall at the central axis, seems to be the best predictor of set-up uncertainties. Using the CLD, estimated mean set-up errors from the literature are 3.8 and 3.2 mm for the systematic and random errors, respectively. These depend partly on the type of immobilization device and could be reduced by the use of portal imaging systems. Furthermore, the breast is mobile during respiration, with motion amplitudes as high as 0.8 to 10 mm in the anteroposterior direction. Respiratory gating techniques, currently under evaluation, have the potential to reduce the effect of these movements. Each radiotherapy department should perform its own assessments and determine the geometric uncertainties with respect to the equipment used and its particular treatment practices. This paper is a review of the main geometric uncertainties in breast treatment, due to respiration and set-up, and of solutions proposed to limit their impact. (author)

  11. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  12. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    Minville, M.; Brissette, F.; Leconte, R.

    2008-01-01

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is however great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty encompasses the uncertainty linked to the different approaches and models needed to transform climate data so that they can be used by hydrological models (such as downscaling methods), and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and outline the importance of taking these sources into account for any impact and adaptation study. Recommendations are outlined for such studies. (author)

  13. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Do uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  14. Comments on Uncertainty in Groundwater Governance in the Volcanic Canary Islands, Spain

    OpenAIRE

    Custodio, Emilio; Cabrera, María; Poncela, Roberto; Cruz-Fuentes, Tatiana; Naranjo, Gema; Miguel, Luis de

    2015-01-01

    The uncertainty associated with natural magnitudes and processes is conspicuous in water resources and groundwater evaluation. This uncertainty has an essential, irreducible component and a part that can be reduced to some extent by increasing knowledge, improving monitoring coverage, continuously elaborating data and improving accuracy, and addressing the related economic and social aspects involved. Reducing uncertainty has a cost that may not be justified by the improvement that is obtainable, but that has to be...

  15. An integrated perspective on the Permian-Triassic "Great Dying"

    Science.gov (United States)

    Algeo, T. J.

    2017-12-01

    The 252-Ma end-Permian mass extinction (EPME), marked by the loss of 90% of marine invertebrate species, was the largest biocrisis in Earth history. Intensive study of this "Great Dying" has led to major insights and a broad consensus regarding many aspects of this event. The ultimate trigger is regarded as eruption of the Siberian Traps Large Igneous Province (STLIP), which released large quantities of greenhouse gases (CO2 and CH4) and sulfate aerosols, triggering a catastrophic global warming of 10°C and acidification of both land surfaces and the surface ocean. On land, a massive die-off of vegetation led to a transient episode of rapid soil erosion and a longer-term increase in weathering rates linked to elevated temperatures. In the ocean, widespread anoxia developed concurrently with the EPME, triggered by ocean-surface warming that reduced dissolved oxygen solubility in seawater and that intensified vertical stratification. Expanded anoxia led to massive burial of organic matter and reduced sulfur, although the evidence for this is indirect (C, U and S isotopes); few organic-rich deposits of Early Triassic age have been found, suggesting that organic sedimentation occurred mainly on continental slopes or in the deep ocean. Other aspects of the end-Permian crisis remain under debate. For example, there is no consensus regarding changes in marine productivity levels in the aftermath of the EPME, which would have been stimulated by enhanced subaerial weathering but depressed by reduced overturning circulation-the evidence to date may favor localized positive and negative changes in productivity. Also under scrutiny is evidence for volcanic eruptions and environmental perturbations during the 100 kyr prior to the EPME, which are likely to have occurred but remain poorly dated and quantified. The greatest uncertainty, however, may surround the nature of the proximate kill mechanism(s) during the EPME. Many hypotheses have been advanced including mechanisms

  16. A sequential factorial analysis approach to characterize the effects of uncertainties for supporting air quality management

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Veawab, A.

    2013-03-01

    This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
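
    A generic two-level factorial analysis of the kind SFA builds upon can be sketched in a few lines. The response function below is a stand-in, not the air quality model; the effect estimates show how small effects (x3 here) would be screened out.

```python
# Full 2^3 factorial design: main effects and a two-factor interaction.
import itertools
import numpy as np

def response(x1, x2, x3):
    """Stand-in system model; factor levels are coded -1/+1."""
    return 5.0 + 2.0 * x1 - 1.0 * x2 + 0.5 * x1 * x2 + 0.1 * x3

runs = list(itertools.product([-1, 1], repeat=3))
y = np.array([response(*r) for r in runs])
X = np.array(runs, dtype=float)

# Main effect of a factor = mean response at +1 minus mean response at -1.
for i, name in enumerate(["x1", "x2", "x3"]):
    effect = y[X[:, i] == 1].mean() - y[X[:, i] == -1].mean()
    print(f"main effect {name}: {effect:+.2f}")

# Two-factor interaction via the contrast of the product column.
inter = (y * X[:, 0] * X[:, 1]).mean() * 2
print(f"interaction x1*x2: {inter:+.2f}")
```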

  17. Epistemic uncertainties when estimating component failure rate

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Mavko, B.; Kljenak, I.

    2000-01-01

    A method for specific estimation of a component failure rate, based on specific quantitative and qualitative data other than component failures, was developed and is described in the proposed paper. The basis of the method is the Bayesian updating procedure. A prior distribution is selected from a generic database, whereas likelihood is built using fuzzy logic theory. With the proposed method, the component failure rate estimation is based on a much larger quantity of information compared to the presently used classical methods. Consequently, epistemic uncertainties, which are caused by lack of knowledge about a component or phenomenon are reduced. (author)
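
    The Bayesian updating backbone of the method can be illustrated with a standard conjugate update. Note the paper builds its likelihood with fuzzy logic; that is replaced here by a plain Poisson likelihood, and all numbers are assumed.

```python
# Gamma prior (generic database) + Poisson likelihood (plant evidence)
# -> gamma posterior on the failure rate, by conjugacy.
from scipy.stats import gamma

alpha0, beta0 = 0.8, 8.0e4            # assumed prior: mean rate 1e-5 per hour

failures, hours = 1, 2.0e5            # component-specific operating experience

alpha1, beta1 = alpha0 + failures, beta0 + hours
posterior = gamma(a=alpha1, scale=1.0 / beta1)
print(f"posterior mean rate: {posterior.mean():.2e} per hour")
print(f"90% interval: {posterior.ppf(0.05):.2e} .. {posterior.ppf(0.95):.2e}")
```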

  18. ESFR core optimization and uncertainty studies

    International Nuclear Information System (INIS)

    Rineiski, A.; Vezzoni, B.; Zhang, D.; Marchetti, M.; Gabrielli, F.; Maschek, W.; Chen, X.-N.; Buiron, L.; Krepel, J.; Sun, K.; Mikityuk, K.; Polidoro, F.; Rochman, D.; Koning, A.J.; DaCruz, D.F.; Tsige-Tamirat, H.; Sunderland, R.

    2015-01-01

    In the European Sodium Fast Reactor (ESFR) project supported by EURATOM in 2008-2012, a concept for a large 3600 MWth sodium-cooled fast reactor design was investigated. In particular, reference core designs with oxide and carbide fuel were optimized to improve their safety parameters. Uncertainties in these parameters were evaluated for the oxide option. Core modifications were performed first to reduce the sodium void reactivity effect. Introduction of a large sodium plenum with an absorber layer above the core and a lower axial fertile blanket improve the total sodium void effect appreciably, bringing it close to zero for a core with fresh fuel, in line with results obtained worldwide, while not influencing substantially other core physics parameters. Therefore an optimized configuration, CONF2, with a sodium plenum and a lower blanket was established first and used as a basis for further studies in view of deterioration of safety parameters during reactor operation. Further options to study were an inner fertile blanket, introduction of moderator pins, a smaller core height, special designs for pins, such as 'empty' pins, and subassemblies. These special designs were proposed to facilitate melted fuel relocation in order to avoid core re-criticality under severe accident conditions. In the paper further CONF2 modifications are compared in terms of safety and fuel balance. They may bring further improvements in safety, but their accurate assessment requires additional studies, including transient analyses. Uncertainty studies were performed by employing a so-called Total Monte-Carlo method, for which a large number of nuclear data files is produced for single isotopes and then used in Monte-Carlo calculations. The uncertainties for the criticality, sodium void and Doppler effects, effective delayed neutron fraction due to uncertainties in basic nuclear data were assessed for an ESFR core. They prove applicability of the available nuclear data for ESFR
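
    The Total Monte-Carlo idea reduces to a simple recipe: sample many random nuclear-data sets, recompute the response with each, and read the nuclear-data uncertainty off the spread. A toy one-group version follows (illustrative constants and spreads, not ESFR data):

```python
# Schematic Total Monte-Carlo over randomised one-group constants.
import numpy as np

rng = np.random.default_rng(11)
n_files = 500                        # number of randomised nuclear-data "files"

nu_sigma_f, sigma_a = 5.2, 4.4       # nominal nu*Sigma_f and Sigma_a (arbitrary units)
k_samples = []
for _ in range(n_files):
    nsf = nu_sigma_f * (1.0 + 0.010 * rng.standard_normal())  # 1.0% assumed spread
    sa = sigma_a * (1.0 + 0.008 * rng.standard_normal())      # 0.8% assumed spread
    k_samples.append(nsf / sa)       # one-group k_inf = nu*Sigma_f / Sigma_a
k_samples = np.array(k_samples)

print(f"k_inf = {k_samples.mean():.4f} +/- {k_samples.std():.4f} "
      f"(nuclear-data uncertainty from the spread)")
```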

  19. Perseveration induces dissociative uncertainty in obsessive-compulsive disorder.

    Science.gov (United States)

    Giele, Catharina L; van den Hout, Marcel A; Engelhard, Iris M; Dek, Eliane C P; Toffolo, Marieke B J; Cath, Danielle C

    2016-09-01

    Obsessive compulsive (OC)-like perseveration paradoxically increases feelings of uncertainty. We studied whether the underlying mechanism between perseveration and uncertainty is a reduced accessibility of meaning ('semantic satiation'). OCD patients (n = 24) and matched non-clinical controls (n = 24) repeated words 2 (non-perseveration) or 20 times (perseveration). They decided whether this word was related to another target word. Speed of relatedness judgments and feelings of dissociative uncertainty were measured. The effects of real-life perseveration on dissociative uncertainty were tested in a smaller subsample of the OCD group (n = 9). Speed of relatedness judgments was not affected by perseveration. However, both groups reported more dissociative uncertainty after perseveration compared to non-perseveration, an effect that was stronger in OCD patients. Patients reported more dissociative uncertainty after 'clinical' perseveration compared to non-perseveration. Both parts of this study are limited by some methodological issues and a small sample size. Although the mechanism behind 'perseveration → uncertainty' is still unclear, results suggest that the effects of perseveration are counterproductive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    Energy Technology Data Exchange (ETDEWEB)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  1. Ruminations On NDA Measurement Uncertainty Compared TO DA Uncertainty

    International Nuclear Information System (INIS)

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-01-01

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  2. Management of internal communication in times of uncertainty

    International Nuclear Information System (INIS)

    Fernandez de la Gala, F.

    2014-01-01

    Garona has received strong media coverage since 2009. The plant's continuity process has been highly controversial, generating increased uncertainty for workers and their families and affecting motivation. Although internal communication has sought to manage the effects of this uncertainty within the company, the speed at which outside information spreads has made this a complex mission. The regulatory body has taken an interest in its potential impact on safety culture, a significant difference compared with other industrial sectors. (Author)

  3. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    International Nuclear Information System (INIS)

    Fredriksson, Albin; Hårdemark, Björn; Forsgren, Anders

    2015-01-01

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality
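
    A one-dimensional caricature of the idea (the stand-in feasibility check and setup-error spread below are assumptions, not the paper's optimizer): the probability covered by a Gaussian setup error grows with the uncertainty set, so one enlarges the set as far as the clinical goals allow.

```python
# Choose the largest uncertainty-set half-width that keeps the goals feasible,
# then report the setup-error probability it covers.
import numpy as np
from scipy.stats import norm

setup_sd = 3.0                      # mm, assumed setup-error standard deviation

def goals_satisfied(margin_mm):
    """Stand-in feasibility check: goals hold for shifts up to 6 mm."""
    return margin_mm <= 6.0

margins = np.linspace(0.0, 10.0, 101)
best = max(m for m in margins if goals_satisfied(m))
coverage = norm.cdf(best / setup_sd) - norm.cdf(-best / setup_sd)
print(f"margin {best:.1f} mm covers the setup error with probability {coverage:.2f}")
```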

  4. Uncertainty Assessment: What Good Does it Do? (Invited)

    Science.gov (United States)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality--not the uncertainties over how long the protective effects last. Advocates for colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps, would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, and thus seemingly implying that our knowledge is insecure. The problem goes further, as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, thus, seemingly further agreeing that the knowledge base is insufficient to warrant public and governmental action. We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter

  5. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, and gradient-enhanced versions of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.
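
    A toy comparison in the spirit of the study, using a closed-form stand-in for the lift coefficient rather than the TAU code: plain Monte Carlo versus Gauss-Hermite quadrature (the collocation building block of polynomial chaos) for an uncertain angle of attack.

```python
# Mean and standard deviation of a toy lift coefficient under a Gaussian
# angle-of-attack uncertainty, by sampling and by quadrature.
import numpy as np

def lift_coefficient(alpha_deg):
    """Stand-in aerodynamic response; mildly nonlinear in angle of attack."""
    return 0.11 * alpha_deg - 0.002 * alpha_deg**2

mu, sd = 2.0, 0.5                    # assumed uncertain angle of attack, degrees
rng = np.random.default_rng(5)

cl_mc = lift_coefficient(rng.normal(mu, sd, 100_000))
print(f"MC:         mean={cl_mc.mean():.5f}  std={cl_mc.std():.5f}")

# Gauss-Hermite quadrature with the change of variables alpha = mu + sqrt(2)*sd*x.
nodes, weights = np.polynomial.hermite.hermgauss(8)
alpha = mu + np.sqrt(2) * sd * nodes
w = weights / np.sqrt(np.pi)
mean_q = np.sum(w * lift_coefficient(alpha))
var_q = np.sum(w * lift_coefficient(alpha) ** 2) - mean_q**2
print(f"Quadrature: mean={mean_q:.5f}  std={np.sqrt(var_q):.5f}")
```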

  6. Uncertainty in spatial planning proceedings

    Directory of Open Access Journals (Sweden)

    Aleš Mlakar

    2009-01-01

    Full Text Available Uncertainty is distinctive of spatial planning as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. They relate to knowledge enhancement and spatial planning comprehension, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the use of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to enforcing participatory processes.

  7. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function
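
    A small illustration of the subjective choices emphasized above (standard formulas and assumed payoffs, not an excerpt from the paper): expected-utility bounds from a Dempster-Shafer belief structure, blended by the decision maker's attitude via a Hurwicz-type optimism degree.

```python
# Attitude-weighted valuation of a Dempster-Shafer belief structure.
payoff = {"s1": 10.0, "s2": 4.0, "s3": -2.0}      # utility of each state

# Belief structure: mass assigned to sets of states, because the evidence is
# too coarse to pin down a single probability distribution.
belief = [({"s1"}, 0.5), ({"s2", "s3"}, 0.3), ({"s1", "s2", "s3"}, 0.2)]

optimism = 0.4                                    # subjective attitude in [0, 1]

expected = 0.0
for focal, mass in belief:
    best = max(payoff[s] for s in focal)          # if the mass resolves favourably
    worst = min(payoff[s] for s in focal)         # if it resolves adversely
    expected += mass * (optimism * best + (1 - optimism) * worst)
print(f"attitude-weighted expected utility: {expected:.2f}")
```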

  8. Southern Great Plains Safety Orientation

    Energy Technology Data Exchange (ETDEWEB)

    Schatz, John

    2014-05-01

    Welcome to the Atmospheric Radiation Measurement (ARM) Climate Research Facility (ARM) Southern Great Plains (SGP) site. This U.S. Department of Energy (DOE) site is managed by Argonne National Laboratory (ANL). It is very important that all visitors comply with all DOE and ANL safety requirements, as well as those of the Occupational Safety and Health Administration (OSHA), the National Fire Protection Association, and the U.S. Environmental Protection Agency, and with other requirements as applicable.

  9. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represent for the decision process, it should not be avoided or the value and science behind the models will be undermined. These two issues; i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  10. The role of uncertainty in climate change adaptation strategies — A Danish water management example

    DEFF Research Database (Denmark)

    Refsgaard, J.C.; Arnbjerg-Nielsen, Karsten; Drews, Martin

    2013-01-01

    We propose a generic framework to characterize climate change adaptation uncertainty according to three dimensions: level, source and nature. Our framework is different, and in this respect more comprehensive, than the present UN Intergovernmental Panel on Climate Change (IPCC) approach and could...... are epistemic (reducible) by nature but uncertainties on adaptation measures are complex, with ambiguity often being added to impact uncertainties. Strategies to deal with uncertainty in climate change adaptation should reflect the nature of the uncertainty sources and how they interact with risk level...

  11. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

    Full Text Available This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to account for uncertain parameters affecting the economic feasibility of a project. The elements with associated uncertainties that matter for the configuration of process integration under a biorefinery scheme are: the raw materials, the raw material conversion technologies, and the variety of products that can be obtained. The analysis shows that raw materials and products with potential under a biorefinery scheme are subject to external uncertainties such as availability, demand, and market prices. These external uncertainties determine their impact on the biorefinery; for product prices in particular, minimum and maximum limits can be identified as intervals that should be considered in the economic evaluation of the project and in the sensitivity analysis under varied conditions.
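
    Since the abstract frames external uncertainties as min-max intervals to be carried into the economic evaluation, a minimal sketch of that bookkeeping follows; every number, name, and the margin model itself are hypothetical placeholders, not values from the paper.

```python
from itertools import product

# Hypothetical uncertainty intervals (illustrative values only).
intervals = {
    "feed_price":  (40.0, 70.0),           # $/t feedstock
    "feed_supply": (80_000.0, 120_000.0),  # t/yr available
    "prod_price":  (550.0, 800.0),         # $/t product
}
YIELD = 0.25        # t product per t feedstock (assumed)
FIXED_COST = 4.0e6  # $/yr (assumed)

def annual_margin(feed_price, feed_supply, prod_price):
    revenue = feed_supply * YIELD * prod_price
    cost = feed_supply * feed_price + FIXED_COST
    return revenue - cost

# For these numbers the margin is monotone in each input, so evaluating
# every corner of the uncertainty box bounds the achievable range.
corners = [annual_margin(*c) for c in product(*intervals.values())]
print(f"margin range: {min(corners):,.0f} .. {max(corners):,.0f} $/yr")
```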

  12. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.
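
    To make the polynomial chaos reformulation step concrete, here is a minimal non-intrusive PC expansion of a scalar output with one standard-normal uncertain input, using probabilists' Hermite polynomials and Gauss quadrature; the stand-in model g is hypothetical, and the project's actual network models are of course far larger.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Stand-in model output as a function of one uncertain input xi ~ N(0, 1).
g = lambda xi: np.exp(0.3 * xi)

P = 5                                  # PC truncation order
nodes, weights = He.hermegauss(20)     # Gauss quadrature, weight exp(-x^2/2)
weights = weights / math.sqrt(2 * math.pi)  # renormalize to the N(0,1) pdf

# Non-intrusive projection: c_k = E[g(xi) He_k(xi)] / k!,
# since ||He_k||^2 = k! under the standard normal weight.
coeffs = np.array([
    np.sum(weights * g(nodes) * He.hermeval(nodes, np.eye(P + 1)[k]))
    / math.factorial(k)
    for k in range(P + 1)
])

print("PC mean   :", coeffs[0])             # c_0 is the output mean
print("exact mean:", math.exp(0.3**2 / 2))  # lognormal mean, sanity check
```

    The output uncertainty is then read directly off the coefficients (e.g., the variance is the sum of k! c_k^2 over k >= 1), which is what makes a subsequent CSP-style analysis tractable in probabilistic terms.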

  13. Experiences of Uncertainty in Men With an Elevated PSA.

    Science.gov (United States)

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2015-05-15

    A significant proportion of men aged 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test and PCa risk, and from confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting the informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.

  14. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author successively analyzes measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, limits that depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)
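
    For orientation, the conventional lower bounds from which the paper's ideal-experiment results are said to depart are the textbook relations, stated here for reference rather than quoted from the paper:

```latex
% Textbook uncertainty relations; the paper argues the achievable
% product in its ideal experiments is arrangement-dependent and
% considerably larger than h.
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\gtrsim\; \hbar .
```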

  15. Davis-Besse uncertainty study

    International Nuclear Information System (INIS)

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  16. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding (ethics, implementation, uncertainties) is based on a review of recent literature and on materials presented at NEA meetings in 2003 and 2004, particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). It offers, in concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed, together with the main sources of uncertainty in funding systems.

  17. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in only a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that of the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons of these data reveal quite large inconsistencies, for both detector cross-sections and the cross-sections of interest for reactor calculations.
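
    The mechanism behind the reported reduction is the covariance term in standard first-order uncertainty propagation; for a calculated quantity $Q = f(x_1,\dots,x_n)$ the textbook expression is

```latex
\sigma_Q^2
  = \sum_{i=1}^{n}\sum_{j=1}^{n}
    \frac{\partial f}{\partial x_i}\,
    \frac{\partial f}{\partial x_j}\,
    \operatorname{Cov}(x_i, x_j)
  = \sum_{i}\left(\frac{\partial f}{\partial x_i}\right)^{\!2}\sigma_i^2
  + 2\sum_{i<j}\frac{\partial f}{\partial x_i}\,
    \frac{\partial f}{\partial x_j}\,
    \operatorname{Cov}(x_i, x_j),
```

    so when the sensitivity coefficients and covariances combine with negative sign, the cross terms subtract from the uncorrelated sum; this is precisely the reduction the record describes, and it cannot be evaluated where correlation data are missing or inconsistent.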

  18. Uncertainty and Sensitivity Analyses Plan

    International Nuclear Information System (INIS)

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions of these mathematical models. This is especially true for the HEDR models, where the values of many parameters are unknown. This plan provides thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches in the recommended plan can be adapted for all dose predictions in the HEDR Project.
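
    As a minimal sketch of the generic workflow such a plan prescribes (sample the uncertain parameters, propagate them through the model, rank their influence), the following uses a hypothetical toy dose model with Spearman rank correlation as a simple stand-in for the plan's hierarchical sensitivity method; the parameter names, distributions, and model bear no relation to the actual HEDRIC codes.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical uncertain inputs (names and distributions are invented).
params = {
    "release_rate": rng.lognormal(mean=0.0, sigma=0.5, size=n),
    "dispersion":   rng.uniform(0.5, 1.5, size=n),
    "dose_factor":  rng.triangular(0.8, 1.0, 1.3, size=n),
}

# Toy stand-in for a dose model.
dose = params["release_rate"] * params["dose_factor"] / params["dispersion"]

print(f"dose: median = {np.median(dose):.3f}, "
      f"90% interval = ({np.percentile(dose, 5):.3f}, "
      f"{np.percentile(dose, 95):.3f})")

# Sensitivity ranking via Spearman rank correlation with the output.
for name, values in params.items():
    rho, _ = spearmanr(values, dose)
    print(f"{name:>12}: rho = {rho:+.2f}")
```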

  19. The Great Recession and risk for child abuse and neglect.

    Science.gov (United States)

    Schneider, William; Waldfogel, Jane; Brooks-Gunn, Jeanne

    2017-01-01

    This paper examines the association between the Great Recession and four measures of the risk for maternal child abuse and neglect: (1) maternal physical aggression; (2) maternal psychological aggression; (3) physical neglect by mothers; and (4) supervisory/exposure neglect by mothers. It draws on rich longitudinal data from the Fragile Families and Child Wellbeing Study, a longitudinal birth cohort study of families in 20 U.S. cities (N = 3,177; 50% African American, 25% Hispanic; 22% non-Hispanic white; 3% other). The study collected information for the 9-year follow-up survey before, during, and after the Great Recession (2007-2010). Interview dates were linked to two macroeconomic measures of the Great Recession: the national Consumer Sentiment Index and the local unemployment rate. Also included are a wide range of socio-demographic controls, as well as city fixed effects and controls for prior parenting. Results indicate that the Great Recession was associated with increased risk of child abuse but decreased risk of child neglect. Households with social fathers present may have been particularly adversely affected. Results also indicate that economic uncertainty during the Great Recession, as measured by the Consumer Sentiment Index and the unemployment rate, had direct effects on the risk of abuse or neglect, which were not mediated by individual-level measures of economic hardship or poor mental health.

  20. Can you put too much on your plate? Uncertainty exposure in servitized triads

    DEFF Research Database (Denmark)

    Kreye, Melanie E.

    2017-01-01

    -national servitized triad in a European-North African set-up which was collected through 29 semi-structured interviews and secondary data. Findings: The empirical study identified the existence of the three uncertainty types and directional knock-on effects between them. Specifically, environmental uncertainty...... relational governance reduced relational uncertainty. The knock-on effects were reduced through organisational and relational responses. Originality: This paper makes two contributions. First, a structured analysis of the uncertainty exposure in servitized triads is presented which shows the existence...... of three individual uncertainty types and the knock-on effects between them. Second, organisational responses to reduce the three uncertainty types individually and the knock-on effects between them are presented....