WorldWideScience

Sample records for sources uncertainty limitations

  1. Estimating uncertainty of data limited stock assessments

    DEFF Research Database (Denmark)

    Kokkalis, Alexandros; Eikeset, Anne Maria; Thygesen, Uffe Høgsbro

    2017-01-01

    ...-limited. Particular emphasis is put on providing uncertainty estimates of the data-limited assessment. We assess four cod stocks in the North-East Atlantic and compare our estimates of stock status (F/Fmsy) with the official assessments. The estimated stock status of all four cod stocks followed the established stock...

  2. Comparing uncertainty aversion towards different sources

    NARCIS (Netherlands)

    Baillon, Aurélien; Liu, Ning; van Dolder, Dennie

    2017-01-01

    We propose simple behavioral definitions of comparative uncertainty aversion for a single agent towards different sources of uncertainty. Our definitions allow for the comparison of utility curvature for different sources if the agent’s choices satisfy subjective expected utility towards each ...

  3. Pushing the limits of NAA. Accuracy, uncertainty and detection limits

    International Nuclear Information System (INIS)

    Greenberg, R.R.

    2008-01-01

    This paper describes some highlights from the author's efforts to improve neutron activation analysis (NAA) detection limits through development and optimization of radiochemical separations, as well as to improve the overall accuracy of NAA measurements by identifying, quantifying and reducing measurement biases and uncertainties. Efforts to demonstrate the metrological basis of NAA, and to establish it as a 'Primary Method of Measurement' will be discussed. (author)

  4. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and in part because data with which to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows, because more time typically is required to inundate areas at high flows than at low flows. Difficulties in establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  5. Radon measurements: the sources of uncertainties

    International Nuclear Information System (INIS)

    Zhukovsky, Michael; Onischenko, Alexandra; Bastrikov, Vladislav

    2008-01-01

    Full text: Radon measurements are a quite complicated process, and the correct estimation of uncertainties is very important. The sources of uncertainties for grab sampling, short-term measurements (charcoal canisters), long-term measurements (track detectors) and retrospective measurements (surface traps) are analyzed. The main sources of uncertainties for grab sampling measurements are: systematic bias of the reference equipment; random Poisson and non-Poisson errors during calibration; and random Poisson and non-Poisson errors during measurements. These sources are also common to short-term measurements (charcoal canisters) and long-term measurements (track detectors). Usually high radon concentrations (1-5 kBq/m³) are used during calibration, and the Poisson random error rarely exceeds a few percent. Nevertheless, the dispersion of measured values, even during calibration, usually exceeds the Poisson dispersion expected on the basis of counting statistics. The origins of such non-Poisson random errors during calibration differ between the different kinds of instrumental measurements. At present not all sources of non-Poisson random errors are reliably identified. The initial calibration accuracy of working devices rarely exceeds 20%. Real radon concentrations are usually in the range from some tens to some hundreds of becquerels per cubic metre, and for low radon levels the Poisson random error can reach up to 20%. The random non-Poisson errors and residual systematic biases depend on the kind of measurement technique and the environmental conditions during the radon measurements. For charcoal canisters there are additional sources of measurement error due to the influence of air humidity and variations of the radon concentration during the canister exposure. The accuracy of long-term measurements by track detectors will depend on the quality of chemical etching after exposure and the influence of seasonal radon variations. The main sources of ...
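
    As a rough, self-contained illustration of the Poisson contribution discussed above, the sketch below computes the relative counting uncertainty for an idealized grab-sample measurement; the sampled volume, detection efficiency and counting time are invented for illustration and are not taken from the record.

        import math

        def poisson_relative_uncertainty(concentration_bq_m3, volume_m3,
                                         efficiency, count_time_s):
            """Relative 1-sigma counting (Poisson) uncertainty for an idealized
            grab-sample radon measurement; all parameters are illustrative."""
            # Expected number of decays registered by the detector.
            counts = concentration_bq_m3 * volume_m3 * efficiency * count_time_s
            return 1.0 / math.sqrt(counts)

        # Calibration at 2 kBq/m3: the Poisson error is a fraction of a percent.
        print(poisson_relative_uncertainty(2000, 0.5, 0.3, 600))  # ~0.002
        # Indoor survey at 50 Bq/m3 with the same setup: the error grows.
        print(poisson_relative_uncertainty(50, 0.5, 0.3, 600))    # ~0.015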

  6. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  7. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
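
    In symbols, the scaling stated in the abstract can be written as follows (a paraphrase, with v the flow velocity and P the mean scattered light power):

        \[
        \sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P}}
        \]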

  8. Sources of uncertainty in characterizing health risks from flare emissions

    International Nuclear Information System (INIS)

    Hrudey, S.E.

    2000-01-01

    The assessment of health risks associated with gas flaring was the focus of this paper. Health risk assessment for environmental decision-making includes the evaluation of scientific data to identify hazards and to perform dose-response assessment, exposure assessment and risk characterization. Gas flaring has been the cause of public health concerns in recent years, most notably since 1996, after a published report by the Alberta Research Council. Some of the major sources of uncertainty associated with identifying hazardous contaminants in flare emissions were discussed. Methods to predict human exposures to emitted contaminants were examined, along with risk characterization of predicted exposures to several identified contaminants. One of the problems is that elemental uncertainties exist regarding flare emissions, which limits the degree of reassurance that risk assessment can provide; risk assessment can nevertheless offer some guidance to those responsible for flare emissions.

  9. Evaluation of uncertainty and detection limits in radioactivity measurements

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, M. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain); Idoeta, R. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)], E-mail: raquel.idoeta@ehu.es; Legarda, F. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)

    2008-10-01

    The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories.

  10. Evaluation of uncertainty and detection limits in radioactivity measurements

    International Nuclear Information System (INIS)

    Herranz, M.; Idoeta, R.; Legarda, F.

    2008-01-01

    The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories.
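
    For orientation, the classical simplified characteristic limits for a counting measurement with a paired blank are Currie's textbook approximations. The sketch below states them generically; it is not the evaluation performed in the paper, and the background count is an assumed input.

        import math

        def currie_limits(background_counts):
            """Currie's simplified decision threshold (L_C) and detection limit
            (L_D) in net counts, for a paired blank and 5% Type I/II risks.
            A generic textbook approximation, not the paper's exact procedure."""
            l_c = 2.33 * math.sqrt(background_counts)         # decision threshold
            l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit
            return l_c, l_d

        l_c, l_d = currie_limits(400.0)  # e.g., 400 background counts
        print(f"L_C = {l_c:.1f} counts, L_D = {l_d:.1f} counts")  # 46.6, 95.7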

  11. Uncertainty evaluation of a modified elimination weighing for source preparation

    Energy Technology Data Exchange (ETDEWEB)

    Cacais, F.L.; Loayza, V.M., E-mail: facacais@gmail.com [Instituto Nacional de Metrologia, Qualidade e Tecnologia, (INMETRO), Rio de Janeiro, RJ (Brazil); Delgado, J.U. [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Metrologia das Radiações Ionizantes

    2017-07-01

    Some modifications to the elimination weighing method for radioactive source preparation made it possible to correct weighing results without non-linearity problems, to assign an uncertainty contribution for the correction of the same order as the drop-mass uncertainty, and to check the weighing variability in serial source preparation. The analysis focused on the achievable weighing accuracy; the uncertainty estimated by the Monte Carlo method for the mass of a 20 mg drop was at most 0.06%. (author)

  12. Limited entropic uncertainty as new principle of quantum physics

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2001-01-01

    The Uncertainty Principle (UP) of quantum mechanics discovered by Heisenberg, which constitutes the corner-stone of quantum physics, asserts that there is an irreducible lower bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables. In order to avoid the state-dependence of this formulation, many authors have proposed to use the information entropy as a measure of the uncertainty instead of the standard quantitative formulation of the Heisenberg uncertainty principle. In this paper the Principle of Limited Entropic Uncertainty (LEU-Principle), as a new principle in quantum physics, is proved. Then, consistent experimental tests of the LEU-Principle, obtained by using the available 49 sets of pion-nucleus phase shifts, are presented for both the extensive (q = 1) and nonextensive (q = 0.5 and q = 2.0) cases. Some results obtained by applying the LEU-Principle to diffraction phenomena are also discussed. The main results and conclusions of our paper can be summarized as follows: (i) We introduced a new principle in quantum physics, namely the Principle of Limited Entropic Uncertainty (LEU-Principle). This new principle includes in a more general and exact form not only the old Heisenberg uncertainty principle but also introduces an upper limit on the magnitude of the uncertainty in quantum physics. The LEU-Principle asserts that 'there is an irreducible lower bound as well as an upper bound on the uncertainty in the result of a simultaneous measurement of non-commuting observables for any extensive and nonextensive (q ≥ 0) quantum systems'; (ii) Two important concrete realizations of the LEU-Principle are explicitly obtained in this paper, namely: (a) the LEU-inequalities for the quantum scattering of spinless particles and (b) the LEU-inequalities for diffraction on a single slit of width 2a. In particular, from our general results, in the limit y → +1 we recover in an exact form all the results previously reported. In our paper an ...

  13. Some sources of the underestimation of evaluated cross section uncertainties

    International Nuclear Information System (INIS)

    Badikov, S.A.; Gai, E.V.

    2003-01-01

    The problem of the underestimation of evaluated cross-section uncertainties is addressed. Two basic sources of the underestimation of evaluated cross-section uncertainties are considered: (a) inconsistency between declared and observable experimental uncertainties, and (b) inadequacy between the applied statistical models and the processed experimental data. Both sources of underestimation are mainly a consequence of the existence of uncertainties unrecognized by the experimenters. A model of a 'constant shift' is proposed for taking unrecognized experimental uncertainties into account. The model is applied to a statistical analysis of the 238U(n,f)/235U(n,f) reaction cross-section ratio measurements. It is demonstrated that multiplication by sqrt(χ²) as an instrument for correcting underestimated evaluated cross-section uncertainties fails in the case of correlated measurements. It is shown that an arbitrary assignment of uncertainties and correlations in a simple least-squares fit of two correlated measurements of an unknown mean leads to physically incorrect evaluated results. (author)
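
    The failure mode described in the last sentence is the classic "Peelle's Pertinent Puzzle" of nuclear data evaluation: a generalized least-squares fit of two correlated measurements can fall below both of them. A minimal numerical sketch, with illustrative numbers rather than the paper's data:

        import numpy as np

        # Two measurements of the same quantity: 10% independent statistical
        # uncertainty plus a 20% fully correlated normalization uncertainty.
        x = np.array([1.0, 1.5])
        stat = 0.10 * x
        V = np.diag(stat**2) + 0.20**2 * np.outer(x, x)  # full covariance

        # Generalized least-squares estimate of the common mean:
        # mu = (1' V^-1 x) / (1' V^-1 1)
        ones = np.ones(2)
        Vinv = np.linalg.inv(V)
        mu = ones @ Vinv @ x / (ones @ Vinv @ ones)
        print(mu)  # ~0.88 -- below BOTH measurements, physically suspect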

  14. Uncertainty sources in radiopharmaceuticals clinical studies

    International Nuclear Information System (INIS)

    Degenhardt, Aemilie Louize; Oliveira, Silvia Maria Velasques de

    2014-01-01

    Radiopharmaceuticals should be approved for consumption based on an evaluation of their quality, safety and efficacy. Clinical studies are designed to verify the pharmacodynamic, pharmacological and clinical effects in humans and are required for assuring safety and efficacy. Bayesian analysis has been used for evaluating the effectiveness of clinical studies. This work aims to identify uncertainties associated with the production of the radionuclide and the radiopharmaceutical labelling, as well as with the radiopharmaceutical administration and scintigraphy image acquisition and processing. For the development of clinical studies in the country, the metrological chain shall assure the traceability of the surveys performed in all phases. (author)

  15. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performances for complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an Open Source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with possibilities to model stochastic dependence thanks to copula theory and stochastic processes), to the uncertainty propagation step (with some innovative simulation algorithms such as the ziggurat method for normal variables) and the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also enables the construction of response surfaces that can include the stochastic modelling (with the polynomial chaos method, for example). Generic wrappers to link OpenTURNS to the modelling software are proposed. Finally, OpenTURNS is extensively documented, with guidance for both use and contribution.
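
    A minimal sketch of the kind of propagation workflow the record describes, assuming a recent OpenTURNS Python API; the input distributions and the model are invented for illustration.

        import openturns as ot

        # Two uncertain inputs (illustrative distributions).
        inputs = ot.ComposedDistribution([ot.Normal(0.0, 1.0),
                                          ot.Uniform(1.0, 3.0)])

        # Wrap a Python model so OpenTURNS can propagate uncertainty through it.
        model = ot.PythonFunction(2, 1, lambda x: [x[0] * x[1] ** 2])

        # Monte Carlo propagation: sample the inputs, evaluate, summarize.
        sample_in = inputs.getSample(10000)
        sample_out = model(sample_in)
        print(sample_out.computeMean(), sample_out.computeVariance())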

  16. Perceived Uncertainty Sources in Wind Power Plant Design

    Energy Technology Data Exchange (ETDEWEB)

    Damiani, Rick R [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-01-03

    This presentation for the Fourth Wind Energy Systems Engineering Workshop covers some of the uncertainties that still impact turbulent wind operation and how these affect design and structural reliability; identifies key sources and prioritization for R&D; and summarizes an analysis of current procedures, industry best practice, standards, and expert opinions.

  17. Error Analysis of CM Data Products Sources of Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, Brian D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eckert-Gallup, Aubrey Celia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cochran, Lainy Dromgoole [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kraus, Terrence D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Allen, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Beal, Bill [National Security Technologies, Joint Base Andrews, MD (United States); Okada, Colin [National Security Technologies, LLC. (NSTec), Las Vegas, NV (United States); Simpson, Mathew [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-01

    The goal of this project is to address the current inability to assess the overall error and uncertainty of data products developed and distributed by DOE’s Consequence Management (CM) Program. This is a widely recognized shortfall, the resolution of which would provide a great deal of value and defensibility to the analysis results, data products, and the decision-making process that follows this work. A global approach to this problem is necessary because multiple sources of error and uncertainty contribute to the ultimate production of CM data products. Therefore, this project will require collaboration with subject matter experts across a wide range of FRMAC skill sets in order to quantify the types of uncertainty that each area of the CM process might contain and to understand how variations in these uncertainty sources contribute to the aggregated uncertainty present in CM data products. The ultimate goal of this project is to quantify the confidence level of CM products to ensure that appropriate public and worker protection decisions are supported by defensible analysis.

  18. Sources of uncertainty in future changes in local precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-10-15

    This study considers the large uncertainty in projected changes in local precipitation. It aims to map, and begin to understand, the relative roles of uncertain modelling and natural variability, using 20-year mean data from four perturbed-physics or multi-model ensembles. The largest (280-member) ensemble illustrates a rich pattern in the varying contribution of modelling uncertainty, with similar features found using a CMIP3 ensemble (despite its limited sample size, which restricts its value in this context). The contribution of modelling uncertainty to the total uncertainty in local precipitation change is found to be highest in the deep tropics, particularly over South America, Africa, the east and central Pacific, and the Atlantic. In the moist maritime tropics, the highly uncertain modelling of sea-surface temperature changes is transmitted to a large uncertain modelling of local rainfall changes. Over tropical land and summer mid-latitude continents (and to a lesser extent, the tropical oceans), uncertain modelling of atmospheric processes, land surface processes and the terrestrial carbon cycle all appear to play an additional substantial role in driving the uncertainty of local rainfall changes. In polar regions, inter-model variability of anomalous sea ice drives an uncertain precipitation response, particularly in winter. In all these regions, there is therefore the potential to reduce the uncertainty of local precipitation changes through targeted model improvements and observational constraints. In contrast, over much of the arid subtropical and mid-latitude oceans, over Australia, and over the Sahara in winter, internal atmospheric variability dominates the uncertainty in projected precipitation changes. Here, model improvements and observational constraints will have little impact on the uncertainty of time means shorter than at least 20 years. Last, a supplementary application of the metric developed here is that it can be interpreted as a measure ...

  19. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    Energy Technology Data Exchange (ETDEWEB)

    Pourgol-Mohammad, Mohammad, E-mail: pourgolmohammad@sut.ac.ir [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of); Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Hoseyni, Seyed Mojtaba [Building & Housing Research Center, Tehran (Iran, Islamic Republic of); Sepanloo, Kamran [Nuclear Science and Technology Research Institute, Tehran (Iran, Islamic Republic of)

    2016-08-15

    Highlights:
    • Existing uncertainty ranking methods prove inconsistent for TH applications.
    • Introduction of a new method for ranking sources of uncertainty in TH codes.
    • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely.
    • The importance of parameters is calculated by a limited number of TH code executions.
    • The methodology is applied successfully to the LOFT-LB1 test facility.

    Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach; their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.

  20. A practical sensitivity analysis method for ranking sources of uncertainty in thermal–hydraulics applications

    International Nuclear Information System (INIS)

    Pourgol-Mohammad, Mohammad; Hoseyni, Seyed Mohsen; Hoseyni, Seyed Mojtaba; Sepanloo, Kamran

    2016-01-01

    Highlights:
    • Existing uncertainty ranking methods prove inconsistent for TH applications.
    • Introduction of a new method for ranking sources of uncertainty in TH codes.
    • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely.
    • The importance of parameters is calculated by a limited number of TH code executions.
    • The methodology is applied successfully to the LOFT-LB1 test facility.

    Abstract: In application to thermal–hydraulic calculations by system codes, sensitivity analysis plays an important role in managing the uncertainties of code output and risk analysis. Sensitivity analysis is also used to confirm the results of the qualitative Phenomena Identification and Ranking Table (PIRT). Several methodologies have been developed to address uncertainty importance assessment. Generally, uncertainty importance measures, mainly devised for Probabilistic Risk Assessment (PRA) applications, are not affordable for the computationally demanding calculations of complex thermal–hydraulics (TH) system codes. In other words, for effective quantification of the degree of the contribution of each phenomenon to the total uncertainty of the output, a practical approach is needed that accounts for the high computational burden of TH calculations. This study aims primarily to show the inefficiency of the existing approaches and then introduces a solution to cope with the challenges in this area by modifying the variance-based uncertainty importance method. Important parameters are identified qualitatively by the modified PIRT approach; their uncertainty importance is then quantified by a local derivative index. The proposed index is attractive from a practicality point of view for TH applications. It is capable of calculating the importance of parameters with a limited number of TH code executions. Application of the proposed methodology is demonstrated on the LOFT-LB1 test facility.
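
    A minimal sketch of a local, derivative-based importance measure of the kind described, using one extra model evaluation per parameter; the toy model and numbers are invented and are not the paper's index.

        import numpy as np

        def derivative_importance(model, x0, sigmas, rel_step=0.01):
            """Local uncertainty importance |dy/dx_i| * sigma_i, estimated by
            finite differences with one extra run per parameter -- cheap
            enough for computationally demanding codes."""
            y0 = model(x0)
            scores = []
            for i, sigma in enumerate(sigmas):
                x = x0.copy()
                x[i] += rel_step * x0[i]            # small relative perturbation
                dy_dx = (model(x) - y0) / (rel_step * x0[i])
                scores.append(abs(dy_dx) * sigma)   # scale by input uncertainty
            return np.array(scores)

        # Toy stand-in for a TH code response: y = a^2 * b + c
        model = lambda x: x[0] ** 2 * x[1] + x[2]
        print(derivative_importance(model, np.array([2.0, 3.0, 1.0]),
                                    sigmas=[0.1, 0.2, 0.5]))  # ~[1.2, 0.8, 0.5]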

  1. Effluent release limits, sources and control

    International Nuclear Information System (INIS)

    Swindell, G.E.

    1977-01-01

    Objectives of radiation protection in relation to releases. Environmental transfer models for radionuclides. Relationship between releases, environmental levels and doses to persons. Establishment of release limits: limits based on the critical population group concept, critical pathway analysis and identification of the critical group; limits based on optimization of radiation protection, individual dose limits, collective doses and dose commitments, 1) differential cost-benefit analysis, 2) authorized and operational limits taking account of future exposures. Monitoring of releases to the environment: objectives of effluent monitoring; typical sources and composition of effluents; design and operation of monitoring programmes; recording and reporting of monitoring results; complementary environmental monitoring. (orig.) [de]

  2. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Science.gov (United States)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify sources of uncertainty for the major terms in the global C budget. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2 sigma ~ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC.

  3. ON COMPUTING UPPER LIMITS TO SOURCE INTENSITIES

    International Nuclear Information System (INIS)

    Kashyap, Vinay L.; Siemiginowska, Aneta; Van Dyk, David A.; Xu Jin; Connors, Alanna; Freeman, Peter E.; Zezas, Andreas

    2010-01-01

    A common problem in astrophysics is determining how bright a source could be and still not be detected in an observation. Despite the simplicity with which the problem can be stated, the solution involves complicated statistical issues that require careful analysis. In contrast to the more familiar confidence bound, this concept has never been formally analyzed, leading to a great variety of often ad hoc solutions. Here we formulate and describe the problem in a self-consistent manner. Detection significance is usually defined by the acceptable proportion of false positives (background fluctuations that are claimed as detections, or Type I error), and we invoke the complementary concept of false negatives (real sources that go undetected, or Type II error), based on the statistical power of a test, to compute an upper limit to the detectable source intensity. To determine the minimum intensity that a source must have for it to be detected, we first define a detection threshold and then compute the probabilities of detecting sources of various intensities at the given threshold. The intensity that corresponds to the specified Type II error probability defines that minimum intensity and is identified as the upper limit. Thus, an upper limit is a characteristic of the detection procedure rather than the strength of any particular source. It should not be confused with confidence intervals or other estimates of source intensity. This is particularly important given the large number of catalogs that are being generated from increasingly sensitive surveys. We discuss, with examples, the differences between these upper limits and confidence bounds. Both measures are useful quantities that should be reported in order to extract the most science from catalogs, though they answer different statistical questions: an upper bound describes an inference range on the source intensity, while an upper limit calibrates the detection process. We provide a recipe for computing upper ...
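
    A small sketch of the procedure as described, assuming Poisson counts with a known mean background; α, β and the numeric values are illustrative choices.

        from scipy.stats import poisson

        def detection_threshold(background, alpha=0.05):
            """Smallest count n* with P(N >= n* | background) <= alpha (Type I)."""
            n = int(poisson.ppf(1 - alpha, background))
            while poisson.sf(n - 1, background) > alpha:  # sf(n-1) = P(N >= n)
                n += 1
            return n

        def upper_limit(background, alpha=0.05, beta=0.5):
            """Minimum source intensity s detected with probability >= 1 - beta
            at the alpha-level threshold: the 'upper limit' of the abstract."""
            n_star = detection_threshold(background, alpha)
            s = 0.0
            while poisson.sf(n_star - 1, background + s) < 1 - beta:
                s += 0.01
            return n_star, s

        # Background of 3 counts: threshold n* = 7 counts, and a source must
        # supply roughly 3.7 counts on average to be detected half the time.
        print(upper_limit(background=3.0))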

  4. Quantification of uncertainties in source term estimates for a BWR with Mark I containment

    International Nuclear Information System (INIS)

    Khatib-Rahbar, M.; Cazzoli, E.; Davis, R.; Ishigami, T.; Lee, M.; Nourbakhsh, H.; Schmidt, E.; Unwin, S.

    1988-01-01

    A methodology for the quantification and uncertainty analysis of source terms for severe accidents in light water reactors (QUASAR) has been developed. The objectives of the QUASAR program are (1) to develop a framework for performing an uncertainty evaluation of the input parameters of the phenomenological models used in the Source Term Code Package (STCP), and (2) to quantify the uncertainties in certain phenomenological aspects of source terms (that are not modeled by STCP) using state-of-the-art methods. The QUASAR methodology consists of (1) screening sensitivity analysis, where the most sensitive input variables are selected for detailed uncertainty analysis, (2) uncertainty analysis, where probability density functions (PDFs) are established for the parameters identified in the screening stage and propagated through the codes to obtain PDFs for the outputs (i.e., release fractions to the environment), and (3) distribution sensitivity analysis, which is performed to determine the sensitivity of the output PDFs to the input PDFs. In this paper attention is limited to a single accident progression sequence, namely a station blackout accident in a BWR with a Mark I containment building. Identified as an important accident in the draft NUREG-1150, a station blackout involves loss of both off-site power and DC power, resulting in failure of the diesels to start and in the unavailability of the high-pressure injection and core isolation cooling systems.

  5. Exploring the uncertainty in attributing sediment contributions in fingerprinting studies due to uncertainty in determining element concentrations in source areas.

    Science.gov (United States)

    Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David

    2016-04-01

    One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and the sampling density. Researchers have limited control over these factors, and usually sampling density tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight about the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. For this study a virtual ...
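
    A toy version of the Monte Carlo appraisal described above, for the simplest case of two sources and one tracer with normally distributed source concentrations; every number is invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)

        # Tracer concentration in each source area: mean and spatial-variability
        # sigma, standing in for a limited set of source samples.
        src_a = rng.normal(12.0, 2.0, 100_000)   # e.g., topsoil
        src_b = rng.normal(4.0, 1.0, 100_000)    # e.g., channel bank
        mixture = 9.0                            # measured in the sediment

        # Two-source, one-tracer mixing model: mixture = p*A + (1-p)*B.
        p = (mixture - src_b) / (src_a - src_b)
        p = p[(p >= 0.0) & (p <= 1.0)]           # keep feasible solutions only

        print(f"source A proportion: median {np.median(p):.2f}, "
              f"90% interval [{np.percentile(p, 5):.2f}, "
              f"{np.percentile(p, 95):.2f}]")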

  6. Tritium source-related systematic uncertainties of the KATRIN experiment

    Energy Technology Data Exchange (ETDEWEB)

    Seitz-Moskaliuk, Hendrik [Karlsruher Institut fuer Technologie, Institut fuer experimentelle Kernphysik, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Collaboration: KATRIN-Collaboration

    2016-07-01

    KATRIN will perform a direct, kinematics-based measurement of the neutrino mass with a sensitivity of 200 meV (90% C.L.) reached after 3 years of measurement time. The neutrino mass is obtained by determining the shape of the spectrum of tritium β decay electrons close to the endpoint of 18.6 keV with a spectrometer of MAC-E filter type. To achieve the planned sensitivity, the systematic measurement uncertainties have to be carefully controlled and evaluated. The main sources of systematics are the MAC-E filter on the one hand and the source and transport section (STS) on the other. Most of the operational parameters of KATRIN have to be stable at or even below the per mille level and have to meet further strict requirements. This talk reviews the KATRIN systematics with a special focus on the STS. Early commissioning measurements to determine the main systematics are introduced.

  7. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    Science.gov (United States)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i - iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state-of-the-art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and assess the posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.

  8. Radiofrequency Electromagnetic Radiation and Memory Performance: Sources of Uncertainty in Epidemiological Cohort Studies.

    Science.gov (United States)

    Brzozek, Christopher; Benke, Kurt K; Zeleke, Berihun M; Abramson, Michael J; Benke, Geza

    2018-03-26

    Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship.

  9. Radiofrequency Electromagnetic Radiation and Memory Performance: Sources of Uncertainty in Epidemiological Cohort Studies

    Directory of Open Access Journals (Sweden)

    Christopher Brzozek

    2018-03-01

    Full Text Available Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship.

  10. Calibration of C-14 dates: some remaining uncertainties and limitations

    International Nuclear Information System (INIS)

    Burleigh, R.

    1975-01-01

    A brief review is presented of the interpretation of radiocarbon dates in terms of calendar years. An outline is given of the factors that make such correlations necessary and of the work that has so far been done to make them possible. The calibration of the C-14 timescale very largely depends at present on the bristlecone pine chronology, but it is clear that many detailed uncertainties still remain. These are discussed. (U.K.)

  11. Projecting species' vulnerability to climate change: Which uncertainty sources matter most and extrapolate best?

    Science.gov (United States)

    Steen, Valerie; Sofaer, Helen R; Skagen, Susan K; Ray, Andrea J; Noon, Barry R

    2017-11-01

    Species distribution models (SDMs) are commonly used to assess potential climate change impacts on biodiversity, but several critical methodological decisions are often made arbitrarily. We compare the variability arising from these decisions to the uncertainty in future climate change itself. We also test whether certain choices offer improved skill for extrapolating to a changed climate and whether internal cross-validation skill indicates extrapolative skill. We compared projected vulnerability for 29 wetland-dependent bird species breeding in the climatically dynamic Prairie Pothole Region, USA. For each species we built 1,080 SDMs, each representing a unique combination of future climate, class of climate covariates, collinearity level, and thresholding procedure. We examined the variation in projected vulnerability attributed to each uncertainty source. To assess extrapolation skill under a changed climate, we compared model predictions with observations from historic drought years. Uncertainty in projected vulnerability was substantial, and the largest source was that of future climate change. Large uncertainty was also attributed to climate covariate class, with hydrological covariates projecting half the range loss of bioclimatic covariates or other summaries of temperature and precipitation. We found that choices based on performance in cross-validation improved skill in extrapolation. Qualitative rankings were also highly uncertain. Given the uncertainty in projected vulnerability and the resulting uncertainty in rankings used for conservation prioritization, a number of considerations appear critical for using bioclimatic SDMs to inform climate change mitigation strategies. Our results emphasize explicitly selecting climate summaries that most closely represent processes likely to underlie ecological response to climate change. For example, hydrological covariates projected substantially reduced vulnerability, highlighting the importance of considering whether water ...
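
    In the spirit of the factorial comparison described above, the spread of an ensemble of projections can be apportioned among decision factors with a simple main-effects (ANOVA-style) decomposition; the toy ensemble below is invented and is not the study's data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy factorial ensemble: projected range loss for every combination of
        # 4 future climates x 3 covariate classes, plus residual noise.
        proj = rng.normal(0.30, 0.05, (4, 3))
        proj += np.linspace(0.0, 0.2, 4)[:, None]   # invented climate effect

        total_var = proj.var()
        var_climate = proj.mean(axis=1).var()   # variance of climate means
        var_covar = proj.mean(axis=0).var()     # variance of covariate means

        print(f"share from future climate: {var_climate / total_var:.0%}, "
              f"from covariate class: {var_covar / total_var:.0%}")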

  12. Internal noise sources limiting contrast sensitivity.

    Science.gov (United States)

    Silvestre, Daphné; Arleo, Angelo; Allard, Rémy

    2018-02-07

    Contrast sensitivity varies substantially as a function of spatial frequency and luminance intensity. The variation as a function of luminance intensity is well known and characterized by three laws that can be attributed to the impact of three internal noise sources: early spontaneous neural activity limiting contrast sensitivity at low luminance intensities (i.e. early noise responsible for the linear law), probabilistic photon absorption at intermediate luminance intensities (i.e. photon noise responsible for the de Vries-Rose law) and late spontaneous neural activity at high luminance intensities (i.e. late noise responsible for Weber's law). The aim of this study was to characterize how the impact of these three internal noise sources varies with spatial frequency and to determine which one limits contrast sensitivity as a function of luminance intensity and spatial frequency. To estimate the impact of the different internal noise sources, the current study used an external noise paradigm to factorize contrast sensitivity into equivalent input noise and calculation efficiency over a wide range of luminance intensities and spatial frequencies. The impact of early and late noise was found to drop linearly with spatial frequency, whereas the impact of photon noise rose with spatial frequency due to ocular factors.
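
    The three laws named above correspond to three luminance regimes for contrast sensitivity S as a function of luminance L; in their standard textbook form:

        \[
        S \propto L \ \ (\text{linear law, low } L), \qquad
        S \propto \sqrt{L} \ \ (\text{de Vries-Rose law, intermediate } L), \qquad
        S \approx \text{const} \ \ (\text{Weber's law, high } L).
        \]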

  13. Uncertainty in Earthquake Source Imaging Due to Variations in Source Time Function and Earth Structure

    KAUST Repository

    Razafindrakoto, H. N. T.

    2014-03-25

    One way to improve the accuracy and reliability of kinematic earthquake source imaging is to investigate the origins of uncertainty and to minimize their effects. The difficulties in kinematic source inversion arise from the nonlinearity of the problem, nonunique choices in the parameterization, and observational errors. We analyze particularly the uncertainty related to the choice of the source time function (STF) and the variability in Earth structure. We consider a synthetic data set generated from a spontaneous dynamic rupture calculation. Using Bayesian inference, we map the solution space of peak slip rate, rupture time, and rise time to characterize the kinematic rupture in terms of posterior density functions. Our test to investigate the effect of the choice of STF reveals that all three tested STFs (isosceles triangle, regularized Yoffe with acceleration time of 0.1 and 0.3 s) retrieve the patch of high slip and slip rate around the hypocenter. However, the use of an isosceles triangle as STF artificially accelerates the rupture to propagate faster than the target solution. It additionally generates an artificial linear correlation between rupture onset time and rise time. These appear to compensate for the dynamic source effects that are not included in the symmetric triangular STF. The exact rise time for the tested STFs is difficult to resolve due to the small amount of radiated seismic moment in the tail of the STF. To highlight the effect of Earth structure variability, we perform inversions including the uncertainty in the wavespeed only, and variability in both wavespeed and layer depth. We find that little difference is noticeable between the resulting rupture model uncertainties from these two parameterizations. Both significantly broaden the posterior densities and cause faster rupture propagation, particularly near the hypocenter, due to the major velocity change at the depth where the fault is located.

  14. Uncertainty in Earthquake Source Imaging Due to Variations in Source Time Function and Earth Structure

    KAUST Repository

    Razafindrakoto, H. N. T.; Mai, Paul Martin

    2014-01-01

    One way to improve the accuracy and reliability of kinematic earthquake source imaging is to investigate the origins of uncertainty and to minimize their effects. The difficulties in kinematic source inversion arise from the nonlinearity of the problem, nonunique choices in the parameterization, and observational errors. We analyze particularly the uncertainty related to the choice of the source time function (STF) and the variability in Earth structure. We consider a synthetic data set generated from a spontaneous dynamic rupture calculation. Using Bayesian inference, we map the solution space of peak slip rate, rupture time, and rise time to characterize the kinematic rupture in terms of posterior density functions. Our test to investigate the effect of the choice of STF reveals that all three tested STFs (isosceles triangle, regularized Yoffe with acceleration time of 0.1 and 0.3 s) retrieve the patch of high slip and slip rate around the hypocenter. However, the use of an isosceles triangle as STF artificially accelerates the rupture to propagate faster than the target solution. It additionally generates an artificial linear correlation between rupture onset time and rise time. These appear to compensate for the dynamic source effects that are not included in the symmetric triangular STF. The exact rise time for the tested STFs is difficult to resolve due to the small amount of radiated seismic moment in the tail of the STF. To highlight the effect of Earth structure variability, we perform inversions including the uncertainty in the wavespeed only, and variability in both wavespeed and layer depth. We find that little difference is noticeable between the resulting rupture model uncertainties from these two parameterizations. Both significantly broaden the posterior densities and cause faster rupture propagation, particularly near the hypocenter, due to the major velocity change at the depth where the fault is located.

  15. Determination of radionuclide solubility limits to be used in SR 97. Uncertainties associated to calculated solubilities

    Energy Technology Data Exchange (ETDEWEB)

    Bruno, J.; Cera, E.; Duro, L.; Jordana, S. [QuantiSci S.L., Barcelona (Spain); Pablo, J. de [DEQ-UPC, Barcelona (Spain); Savage, D. [QuantiSci Ltd., Henley-on-Thames (United Kingdom)

    1997-12-01

    The thermochemical behaviour of 24 critical radionuclides for the forthcoming SR97 PA exercise is discussed. The available databases are reviewed and updated with new data, and an extended database for aqueous and solid species of the radionuclides of interest is proposed. We have calculated solubility limits for the radionuclides of interest under different groundwater compositions. A sensitivity analysis of the calculated solubilities with respect to the composition of the groundwater is presented. Besides selecting the most likely solubility-limiting phases, in this work we have used coprecipitation approaches in order to calculate more realistic solubility limits for minor radionuclides, such as Ra, Am and Cm. The comparison between the calculated solubilities and the concentrations measured in relevant natural systems (NA) and in spent fuel leaching experiments helps to assess the validity of the methodology used and to derive source term concentrations for the radionuclides studied. The uncertainties associated with the solubilities of the main radionuclides involved in the spent nuclear fuel have also been discussed in this work. The variability of the groundwater chemistry, the redox conditions and the temperature of the system have been considered the main factors affecting the solubilities. In this case, a sensitivity analysis has been performed in order to study solubility changes as a function of these parameters. The uncertainties have been calculated by including the values found to a major extent in typical granitic groundwaters. The results obtained from this analysis indicate that there are some radionuclides which are not affected by these parameters, i.e. Ag, Cm, Ho, Nb, Ni, Np, Pu, Se, Sm, Sn, Sr, Tc and U.

  16. Improved statistical models for limited datasets in uncertainty quantification using stochastic collocation

    Energy Technology Data Exchange (ETDEWEB)

    Alwan, Aravind; Aluru, N.R.

    2013-12-15

    This paper presents a data-driven framework for performing uncertainty quantification (UQ) by choosing a stochastic model that accurately describes the sources of uncertainty in a system. This model is propagated through an appropriate response surface function that approximates the behavior of this system using stochastic collocation. Given a sample of data describing the uncertainty in the inputs, our goal is to estimate a probability density function (PDF) using the kernel moment matching (KMM) method so that this PDF can be used to accurately reproduce statistics like mean and variance of the response surface function. Instead of constraining the PDF to be optimal for a particular response function, we show that we can use the properties of stochastic collocation to make the estimated PDF optimal for a wide variety of response functions. We contrast this method with other traditional procedures that rely on the Maximum Likelihood approach, like kernel density estimation (KDE) and its adaptive modification (AKDE). We argue that this modified KMM method tries to preserve what is known from the given data and is the better approach when the available data is limited in quantity. We test the performance of these methods for both univariate and multivariate density estimation by sampling random datasets from known PDFs and then measuring the accuracy of the estimated PDFs, using the known PDF as a reference. Comparing the output mean and variance estimated with the empirical moments using the raw data sample as well as the actual moments using the known PDF, we show that the KMM method performs better than KDE and AKDE in predicting these moments with greater accuracy. This improvement in accuracy is also demonstrated for the case of UQ in electrostatic and electrothermomechanical microactuators. We show how our framework results in the accurate computation of statistics in micromechanical systems.

  17. Improved statistical models for limited datasets in uncertainty quantification using stochastic collocation

    International Nuclear Information System (INIS)

    Alwan, Aravind; Aluru, N.R.

    2013-01-01

    This paper presents a data-driven framework for performing uncertainty quantification (UQ) by choosing a stochastic model that accurately describes the sources of uncertainty in a system. This model is propagated through an appropriate response surface function that approximates the behavior of this system using stochastic collocation. Given a sample of data describing the uncertainty in the inputs, our goal is to estimate a probability density function (PDF) using the kernel moment matching (KMM) method so that this PDF can be used to accurately reproduce statistics like mean and variance of the response surface function. Instead of constraining the PDF to be optimal for a particular response function, we show that we can use the properties of stochastic collocation to make the estimated PDF optimal for a wide variety of response functions. We contrast this method with other traditional procedures that rely on the Maximum Likelihood approach, like kernel density estimation (KDE) and its adaptive modification (AKDE). We argue that this modified KMM method tries to preserve what is known from the given data and is the better approach when the available data is limited in quantity. We test the performance of these methods for both univariate and multivariate density estimation by sampling random datasets from known PDFs and then measuring the accuracy of the estimated PDFs, using the known PDF as a reference. Comparing the output mean and variance estimated with the empirical moments using the raw data sample as well as the actual moments using the known PDF, we show that the KMM method performs better than KDE and AKDE in predicting these moments with greater accuracy. This improvement in accuracy is also demonstrated for the case of UQ in electrostatic and electrothermomechanical microactuators. We show how our framework results in the accurate computation of statistics in micromechanical systems.
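
    As a point of comparison for the density-estimation step discussed above, the sketch below contrasts kernel density estimation with a naive two-moment (normal) fit on a small sample. This is a generic stand-in illustrating the limited-data issue, not the paper's KMM method.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        data = rng.lognormal(mean=0.0, sigma=0.5, size=30)  # small sample

        # Kernel density estimate, as in the KDE/AKDE baselines.
        kde = stats.gaussian_kde(data)

        # Naive moment matching: a normal fit to the first two sample moments.
        mm = stats.norm(loc=data.mean(), scale=data.std(ddof=1))

        x = np.linspace(0.01, 4.0, 5)
        print(np.round(kde(x), 3))     # KDE density at test points
        print(np.round(mm.pdf(x), 3))  # moment-matched density, same points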

  18. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
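
    The Bayesian source-inference step can be sketched with a toy example in Python. The linear five-receptor footprint, the noise level, and the flat prior are illustrative assumptions, far simpler than the CMAQ/FLEXPART-WRF chain described above; the point is only the mechanics of random-walk Metropolis sampling of an emission rate.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy linear source-receptor model: y = H * q + noise, q = emission rate
      H = np.array([0.8, 0.5, 0.3, 0.2, 0.1])           # footprints at 5 receptors
      q_true, sigma = 10.0, 0.4
      y = H * q_true + rng.normal(0.0, sigma, H.size)   # synthetic observations

      def log_post(q):                                  # flat prior on q >= 0
          return -np.inf if q < 0 else -0.5 * np.sum((y - H * q)**2) / sigma**2

      chain, q, lp = [], 5.0, log_post(5.0)
      for _ in range(20000):                            # random-walk Metropolis
          prop = q + rng.normal(0.0, 0.5)
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              q, lp = prop, lp_prop
          chain.append(q)

      post = np.array(chain[5000:])                     # discard burn-in
      print(f"posterior emission rate: {post.mean():.2f} +/- {post.std():.2f}")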

  19. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P., E-mail: max.das.ferreira@gmail.com, E-mail: everton@ird.gov.br, E-mail: claudia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even though uncertainties are not stated for the dose measurements reported monthly to the monitored radiation facilities, they need to be known. This study aims to analyze the influence of different sources of uncertainty associated with the photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review was conducted of documents that deal with the operational aspects of each technique and the uncertainties associated with them. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions were expressed qualitatively. The calibration process and traceability are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are energy and angular dependence, linearity of response, and variations in film processing. For TL and OSL, the key to good performance is the reproducibility of the thermal and optical cycles, respectively. For all three techniques, every procedure in the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contributions of the sources of uncertainty. (author)

  20. Calculation of the detection limit in radiation measurements with systematic uncertainties

    International Nuclear Information System (INIS)

    Kirkpatrick, J.M.; Russ, W.; Venkataraman, R.; Young, B.M.

    2015-01-01

    The detection limit (L_D) or Minimum Detectable Activity (MDA) is an a priori evaluation of assay sensitivity intended to quantify the suitability of an instrument or measurement arrangement for the needs of a given application. Traditional approaches as pioneered by Currie rely on Gaussian approximations to yield simple, closed-form solutions, and neglect the effects of systematic uncertainties in the instrument calibration. These approximations are applicable over a wide range of applications, but are of limited use in low-count applications, when high confidence values are required, or when systematic uncertainties are significant. One proposed modification to the Currie formulation attempts to account for systematic uncertainties within a Gaussian framework. We have previously shown that this approach results in an approximation formula that works best only for small values of the relative systematic uncertainty, for which the modification of Currie's method is the least necessary, and that it significantly overestimates the detection limit or gives infinite or otherwise non-physical results for larger systematic uncertainties where such a correction would be the most useful. We have developed an alternative approach for calculating detection limits based on realistic statistical modeling of the counting distributions which accurately represents statistical and systematic uncertainties. Instead of a closed form solution, numerical and iterative methods are used to evaluate the result. Accurate detection limits can be obtained by this method for the general case
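
    For orientation, the Currie closed form for paired counting at 95/95 confidence is L_D ≈ 2.71 + 4.65 sqrt(B), with B the expected background counts. A minimal numerical sketch of the iterative alternative follows in Python; the 10% normal systematic on the counting efficiency, the Monte Carlo sizes, and the bisection bounds are illustrative assumptions, not the authors' algorithm. It searches for the smallest true signal whose detection probability reaches 95%.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)
      B = 100.0                                   # expected background counts
      Lc = norm.ppf(0.95) * np.sqrt(2.0 * B)      # Currie critical level, paired blank

      def detection_prob(S, rel_sys=0.10, n_mc=20000):
          # P(net counts > Lc) for true signal S, with a normally distributed
          # systematic error on the counting efficiency
          eff = rng.normal(1.0, rel_sys, n_mc).clip(min=0.01)
          gross = rng.poisson(B + eff * S)
          blank = rng.poisson(B, n_mc)
          return np.mean(gross - blank > Lc)

      lo, hi = 0.0, 500.0                         # bisection for the 95% point
      for _ in range(40):
          mid = 0.5 * (lo + hi)
          lo, hi = (mid, hi) if detection_prob(mid) < 0.95 else (lo, mid)

      print(f"Currie closed form: {2.71 + 4.65 * np.sqrt(B):.1f} counts")
      print(f"numerical L_D     : {hi:.1f} counts (with 10% systematic)")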

  1. Sources/treatment of uncertainties in the performance assessment of geologic radioactive waste repositories

    International Nuclear Information System (INIS)

    Cranwell, R.M.

    1987-01-01

    Uncertainties in the performance assessment of geologic radioactive waste repositories have several sources. The more important ones include: 1) uncertainty in the conditions of a disposal system over the temporal scales set forth in regulations, 2) uncertainty in the conceptualization of the geohydrologic system, 3) uncertainty in the theoretical description of a given conceptual model of the system, 4) uncertainty in the development of computer codes to implement the solution of a mathematical model, and 5) uncertainty in the parameters and data required in the models and codes used to assess the long-term performance of the disposal system. This paper discusses each of these uncertainties and outlines methods for addressing these uncertainties

  2. Historic Emissions from Deforestation and Forest Degradation in Mato Grosso, Brazil: 1. Source Data Uncertainties

    Science.gov (United States)

    Morton, Douglas C.; Sales, Marcio H.; Souza, Carlos M., Jr.; Griscom, Bronson

    2011-01-01

    Historic carbon emissions are an important foundation for proposed efforts to Reduce Emissions from Deforestation and forest Degradation and enhance forest carbon stocks through conservation and sustainable forest management (REDD+). The level of uncertainty in historic carbon emissions estimates is also critical for REDD+, since high uncertainties could limit climate benefits from mitigation actions. Here, we analyzed source data uncertainties based on the range of available deforestation, forest degradation, and forest carbon stock estimates for the Brazilian state of Mato Grosso during 1990-2008. Results: Deforestation estimates showed good agreement for multi-year trends of increasing and decreasing deforestation during the study period. However, annual deforestation rates differed by >20% in more than half of the years between 1997-2008, even for products based on similar input data. Tier 2 estimates of average forest carbon stocks varied between 99-192 Mg C/ha, with greatest differences in northwest Mato Grosso. Carbon stocks in deforested areas increased over the study period, yet this increasing trend in deforested biomass was smaller than the difference among carbon stock datasets for these areas. Conclusions: Patterns of spatial and temporal disagreement among available data products provide a roadmap for future efforts to reduce source data uncertainties for estimates of historic forest carbon emissions. Specifically, regions with large discrepancies in available estimates of both deforestation and forest carbon stocks are priority areas for evaluating and improving existing estimates. Full carbon accounting for REDD+ will also require filling data gaps, including forest degradation and secondary forest, with annual data on all forest transitions.

  3. Upper limit for Poisson variable incorporating systematic uncertainties by Bayesian approach

    International Nuclear Information System (INIS)

    Zhu, Yongsheng

    2007-01-01

    To calculate the upper limit for the Poisson observable at given confidence level with inclusion of systematic uncertainties in background expectation and signal efficiency, formulations have been established along the line of Bayesian approach. A FORTRAN program, BPULE, has been developed to implement the upper limit calculation
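
    BPULE itself is a FORTRAN code and is not reproduced here; a minimal numerical sketch of the same Bayesian construction in Python follows. The flat signal prior, the truncated-normal nuisance distributions, and the grid bounds are illustrative assumptions. Background and efficiency are marginalized by Monte Carlo and the upper limit is read off the posterior CDF.

      import numpy as np
      from scipy.stats import poisson
      from scipy.integrate import trapezoid

      def bayes_upper_limit(n_obs, b_mean, b_sig, eff_mean, eff_sig,
                            cl=0.90, s_max=30.0, n_nuis=5000):
          rng = np.random.default_rng(1)
          b = rng.normal(b_mean, b_sig, n_nuis).clip(min=0.0)         # background
          eff = rng.normal(eff_mean, eff_sig, n_nuis).clip(min=1e-3)  # efficiency
          s = np.linspace(0.0, s_max, 1201)
          lam = eff[None, :] * s[:, None] + b[None, :]
          like = poisson.pmf(n_obs, lam).mean(axis=1)   # marginalized likelihood
          post = like / trapezoid(like, s)              # flat prior on s >= 0
          cdf = np.concatenate(([0.0],
                np.cumsum(0.5 * (post[1:] + post[:-1]) * np.diff(s))))
          return np.interp(cl, cdf, s)

      print(f"90% upper limit: "
            f"{bayes_upper_limit(3, 1.2, 0.3, 0.8, 0.08):.2f} signal events")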

  4. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    Science.gov (United States)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. The direct variance method was adopted to analyze the manner by which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and has a considerable percentage of uncertainty contribution in monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-ignorable uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang
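
    The direct variance method apportions the ensemble spread among the links of the modeling chain. A compact Python sketch on a synthetic full-factorial ensemble (the ensemble size and effect magnitudes are purely illustrative) takes each component's contribution as the variance of its marginal means:

      import itertools
      import numpy as np

      rng = np.random.default_rng(7)

      # Synthetic full-factorial ensemble: 3 ES x 4 CM x 4 SD x 4 HM projections
      levels = {"ES": 3, "CM": 4, "SD": 4, "HM": 4}
      effects = {k: rng.normal(0.0, sd, n)              # component main effects
                 for (k, n), sd in zip(levels.items(), [0.2, 1.0, 0.5, 0.4])}

      proj = np.empty(tuple(levels.values()))
      for idx in itertools.product(*(range(n) for n in levels.values())):
          proj[idx] = 10.0 + sum(effects[k][i] for k, i in zip(levels, idx))

      # Contribution of each component = variance of its marginal means
      contrib = {}
      for axis, name in enumerate(levels):
          other = tuple(a for a in range(proj.ndim) if a != axis)
          contrib[name] = np.var(proj.mean(axis=other))

      total = sum(contrib.values())
      for name, c in contrib.items():
          print(f"{name}: {100.0 * c / total:5.1f}% of explained variance")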

  5. Characterizing Sources of Uncertainty in Item Response Theory Scale Scores

    Science.gov (United States)

    Yang, Ji Seung; Hansen, Mark; Cai, Li

    2012-01-01

    Traditional estimators of item response theory scale scores ignore uncertainty carried over from the item calibration process, which can lead to incorrect estimates of the standard errors of measurement (SEMs). Here, the authors review a variety of approaches that have been applied to this problem and compare them on the basis of their statistical…

  6. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  7. A formal treatment of uncertainty sources in a level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon

    2003-01-01

    The methodological framework of the level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the level 2 PSA is often subjected to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the level 2 PSA makes it possible to treat an uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that would often be addressed in the level 2 PSA and their impacts on the PSA level 2 risk results. An additional purpose of this paper is to give a formal approach on how to combine random uncertainties addressed in the level 1 PSA with subjectivistic uncertainties addressed in the level 2 PSA

  8. A formal guidance for handling different uncertainty sources employed in the level 2 PSA

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eon; Ha, Jae Joo

    2004-01-01

    The methodological framework of the level 2 PSA appears to be currently standardized in a formalized fashion, but there have been different opinions on the way the sources of uncertainty are characterized and treated. This is primarily because the level 2 PSA deals with complex phenomenological processes that are deterministic in nature rather than random processes, and there are no probabilistic models characterizing them clearly. As a result, the probabilistic quantification of the level 2 PSA CET/APET is often subjected to two sources of uncertainty: (a) incomplete modeling of accident pathways or different predictions for the behavior of phenomenological events and (b) expert-to-expert variation in estimating the occurrence probability of phenomenological events. While a clear definition of the two sources of uncertainty involved in the level 2 PSA makes it possible to treat an uncertainty in a consistent manner, careless application of these different sources of uncertainty may produce different conclusions in the decision-making process. The primary purpose of this paper is to characterize typical sources of uncertainty that would often be addressed in the level 2 PSA and to provide a formal guidance for quantifying their impacts on the PSA level 2 risk results. An additional purpose of this paper is to give a formal approach on how to combine random uncertainties addressed in the level 1 PSA with subjectivistic uncertainties addressed in the level 2 PSA

  9. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  10. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    The uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and the International Committee for Weights and Measures (CIPM) are compared for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]

  11. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction. ... e.g. at national meteorological services, the proposed methodology is feasible for real-time use, thereby adding value to decision support. In the recent NKS-B projects MUD, FAUNA and MESO, the implications of meteorological uncertainties for nuclear emergency preparedness and management have been studied. ... uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management, are described. The proposed methodology allows for efficient real...

  12. Optimizing Irrigation Water Allocation under Multiple Sources of Uncertainty in an Arid River Basin

    Science.gov (United States)

    Wei, Y.; Tang, D.; Gao, H.; Ding, Y.

    2015-12-01

    Population growth and climate change place additional pressures on water resources management strategies for meeting demands from different economic sectors. This is especially challenging in arid regions where fresh water is limited. For instance, in the Tailanhe River Basin (Xinjiang, China), a compromise must be made between water suppliers and users during drought years. This study presents a multi-objective irrigation water allocation model to cope with water scarcity in arid river basins. To deal with the uncertainties from multiple sources in the water allocation system (e.g., variations in available water amount, crop yield, crop prices, and water price), the model employs an interval linear programming approach. The multi-objective optimization model developed in this study is characterized by integrating ecosystem service theory into water-saving measures. For evaluation purposes, the model is used to construct an optimal allocation system for irrigation areas fed by the Tailan River (Xinjiang Province, China). The objective functions to be optimized are formulated based on these irrigation areas' economic, social, and ecological benefits. The optimal irrigation water allocation plans are made under different hydroclimate conditions (wet year, normal year, and dry year), with multiple sources of uncertainty represented. The modeling tool and results are valuable for advising decision making by the local water authority and the agricultural community, especially on measures for coping with water scarcity (by incorporating uncertain factors associated with crop production planning).
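
    The interval linear programming idea can be sketched by solving the allocation LP twice, under the pessimistic and the optimistic bounds of the interval parameters, to bracket the optimal benefit. The crops, benefit intervals, and water bounds in the Python sketch below are hypothetical, not the Tailan River data.

      import numpy as np
      from scipy.optimize import linprog

      # Net benefit per unit of water, as intervals (low, high); hypothetical crops
      benefit = {"cotton": (1.8, 2.4), "wheat": (1.2, 1.6), "orchard": (2.0, 3.1)}
      demand_cap = np.array([40.0, 30.0, 20.0])   # per-crop delivery limits
      water_total = (60.0, 75.0)                  # available water, as an interval

      for tag, pick, supply in [("pessimistic", 0, water_total[0]),
                                ("optimistic", 1, water_total[1])]:
          c = -np.array([b[pick] for b in benefit.values()])  # maximize benefit
          res = linprog(c, A_ub=np.ones((1, 3)), b_ub=[supply],
                        bounds=list(zip([0.0] * 3, demand_cap)))
          print(f"{tag:>11s}: benefit {-res.fun:6.1f}, allocation {np.round(res.x, 1)}")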

  13. Sources of uncertainties in modelling black carbon at the global scale

    NARCIS (Netherlands)

    Vignati, E.; Karl, M.; Krol, M.C.; Wilson, J.; Stier, P.; Cavalli, F.

    2010-01-01

    Our understanding of the global black carbon (BC) cycle is essentially qualitative due to uncertainties in our knowledge of its properties. This work investigates two sources of uncertainty in modelling black carbon: those due to the use of different schemes for BC ageing and its removal rate in

  14. Sources and performance criteria of uncertainty of reference measurement procedures.

    Science.gov (United States)

    Mosca, Andrea; Paleari, Renata

    2018-05-29

    This article focuses on the currently available Reference Measurement Procedures (RMPs) for the determination of various analytes in laboratory medicine and on possible tools to evaluate their performance in the laboratories that use them. A brief review of the RMPs was performed by investigating the Joint Committee for Traceability in Laboratory Medicine (JCTLM) database. In order to evaluate their performance, we examined the organization of three international ring trials, i.e. those regularly performed by the IFCC External Quality Assessment scheme for Reference Laboratories in Laboratory Medicine (RELA), by the Centers for Disease Control and Prevention (CDC) cholesterol network and by the IFCC Network for HbA1c. Several RMPs are available through the JCTLM database, but the best way to collect information about the RMPs and their uncertainties is to look at the reference measurement service (RMS) providers. This part of the database, and the background on how to be listed in the database, is very helpful for assessing the expanded measurement uncertainty (MU) and the performance of RMPs in general. Worldwide, 17 RMS providers are listed in the database, and for most measurands more than one provider is able to run the relevant RMPs, with similar expanded uncertainties. As an example, for α-amylase, four providers offer their services with MU between 1.6 and 3.3%. In other cases (such as total cholesterol), the MU may span a broader range, i.e. from 0.02 to 3.6%. With regard to performance evaluation, the approach is often heterogeneous, and it is difficult to compare the performance of laboratories running the same RMP for the same measurand if they are involved in more than one EQAS. The reference measurement services have been created to help laboratory professionals and manufacturers implement correct metrological traceability, and the JCTLM database is the only correct way to retrieve all the necessary information to this end. Copyright © 2018

  15. The US EPA reference dose for methylmercury: sources of uncertainty

    International Nuclear Information System (INIS)

    Rice, D.C.

    2004-01-01

    The US Environmental Protection Agency (EPA) derived a reference dose for methylmercury in 2001, based on an extensive analysis by the National Research Council (NRC) of the National Academy of Sciences. The NRC performed benchmark dose analysis on a number of endpoints from three longitudinal prospective studies: the Seychelles Islands, the Faroe Islands, and the New Zealand studies. Adverse effects were reported in the latter two studies, but not in the Seychelles study. The NRC also performed an integrative analysis of all three studies. Dose conversion from cord blood or maternal hair mercury concentration was performed by EPA using a one-compartment pharmacokinetic model. A total uncertainty factor of 10 was applied for intrahuman pharmacokinetic and pharmacodynamic variability. There are numerous decisions made by the NRC/EPA that could greatly affect the value of the reference dose (RfD). Some of these include the choice of a linear model for the relationship between mercury body burden and neuropsychological performance, the choice of values of P0 and the benchmark response, the use of the 'critical study/critical endpoint' approach in the interpretation of the maternal body burden that corresponds to the RfD, the use of central tendencies in a one-compartment pharmacokinetic model rather than the inclusion of the distributions of variables for the population of reproductive-age women, the assumption of unity for the ratio of fetal cord blood to maternal blood methylmercury concentrations, the choice of a total of 10 as an uncertainty factor, and the lack of dose-response analysis for other health effects such as cardiovascular disease. In addition, it may be argued that derivation of an RfD for methylmercury is inappropriate, given that there does not appear to be a threshold for adverse neuropsychological effects based on available data

  16. Practical low dose limits for passive personal dosemeters and the implications for uncertainties close to the limit of detection

    International Nuclear Information System (INIS)

    Gilvin, P. J.; Perks, C. A.

    2011-01-01

    Recent years have seen the increasing use of passive dosemeters that have high sensitivities and, in laboratory conditions, detection limits of <10 μSv. However, in real operational use the detection limits will be markedly higher, because a large fraction of the accrued dose will be due to natural background, and this must be subtracted in order to obtain the desired occupational dose. No matter how well known the natural background is, the measurement uncertainty on doses of a few tens of microsieverts will be large. Individual monitoring services need to recognise this and manage the expectations of their clients by providing sufficient information. (authors)

  17. Evaluating Sources of Risks in Large Engineering Projects: The Roles of Equivocality and Uncertainty

    Directory of Open Access Journals (Sweden)

    Leena Pekkinen

    2015-11-01

    Contemporary project risk management literature introduces uncertainty, i.e., the lack of information, as a fundamental basis of project risks. In this study the authors assert that equivocality, i.e., the existence of multiple and conflicting interpretations, can also serve as a basis of risks. With an in-depth empirical investigation of a large complex engineering project, the authors identified risk sources having their bases in situations where uncertainty or equivocality was the predominant attribute. The information processing theory proposes different managerial practices for risk management depending on whether the risks are based in uncertainty or in equivocality.

  18. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    Science.gov (United States)

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is
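
    The Monte Carlo bias-analysis loop can be sketched for a single unmeasured confounder in Python; the bias-parameter distributions below are illustrative, not those assigned in the study. Each iteration samples the bias parameters, divides the conventional estimate by the implied confounding relative risk, and re-introduces sampled random error; the simulation interval is read off the resulting distribution.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000

      # Conventional result (the glyphosate / multiple myeloma example above)
      rr_obs, ci = 2.6, (0.7, 9.4)
      se = (np.log(ci[1]) - np.log(ci[0])) / (2 * 1.96)   # implied standard error

      # Bias parameters for one unmeasured confounder (illustrative distributions)
      p1 = rng.uniform(0.2, 0.5, n)             # confounder prevalence, exposed
      p0 = rng.uniform(0.1, 0.3, n)             # confounder prevalence, unexposed
      rr_cd = rng.triangular(1.0, 1.5, 3.0, n)  # confounder-disease relative risk

      rr_conf = (p1 * (rr_cd - 1) + 1) / (p0 * (rr_cd - 1) + 1)
      # Bias-adjusted estimate, with sampled random error re-introduced
      rr_adj = np.exp(np.log(rr_obs / rr_conf) + rng.normal(0.0, se, n))

      lo, med, hi = np.percentile(rr_adj, [2.5, 50, 97.5])
      print(f"median {med:.1f}, 95% simulation interval ({lo:.1f}, {hi:.1f})")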

  19. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty

    Directory of Open Access Journals (Sweden)

    Lash Timothy L

    2007-11-01

    qualitative description of study limitations. The latter approach is likely to lead to overconfidence regarding the potential for causal associations, whereas the former safeguards against such overinterpretations. Furthermore, such analyses, once programmed, allow rapid implementation of alternative assignments of probability distributions to the bias parameters, and so elevate the plane of discussion regarding study bias from characterizing studies as "valid" or "invalid" to a critical and quantitative discussion of sources of uncertainty.

  20. The Nordic Seas carbon budget: Sources, sinks, and uncertainties

    OpenAIRE

    Jeansson, Emil; Olsen, Are; Eldevik, Tor; Skjelvan, Ingunn; Omar, Abdirahman M.; Lauvset, Siv K.; Nilsen, Jan Even Ø.; Bellerby, Richard G. J; Johannessen, Truls; Falck, Eva

    2011-01-01

    A carbon budget for the Nordic Seas is derived by combining recent inorganic carbon data from the CARINA database with relevant volume transports. Values of organic carbon in the Nordic Seas' water masses, the amount of carbon input from river runoff, and the removal through sediment burial are taken from the literature. The largest source of carbon to the Nordic Seas is the Atlantic Water that enters the area across the Greenland-Scotland Ridge; this is in particular true for the anthropogen...

  1. Risk newsboy: approach for addressing uncertainty in developing action levels and cleanup limits

    International Nuclear Information System (INIS)

    Cooke, Roger; MacDonell, Margaret

    2007-01-01

    Site cleanup decisions involve developing action levels and residual limits for key contaminants, to assure health protection during the cleanup period and into the long term. Uncertainty is inherent in the toxicity information used to define these levels, based on incomplete scientific knowledge regarding dose-response relationships across various hazards and exposures at environmentally relevant levels. This problem can be addressed by applying principles used to manage uncertainty in operations research, as illustrated by the newsboy dilemma. Each day a newsboy must balance the risk of buying more papers than he can sell against the risk of not buying enough. Setting action levels and cleanup limits involves a similar concept of balancing and distributing risks and benefits in the face of uncertainty. The newsboy approach can be applied to develop health-based target concentrations for both radiological and chemical contaminants, with stakeholder input being crucial to assessing 'regret' levels. Associated tools include structured expert judgment elicitation to quantify uncertainty in the dose-response relationship, and mathematical techniques such as probabilistic inversion and iterative proportional fitting. (authors)
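
    The newsboy analogy maps onto a one-line critical-fractile computation: if cost_too_high is the regret per unit the limit overshoots the uncertain true threshold and cost_too_low the regret per unit of unnecessary extra cleanup, expected regret is minimized at the cost_too_low/(cost_too_low + cost_too_high) quantile of the threshold distribution. In the Python sketch below, the lognormal threshold and the regret values are hypothetical.

      from scipy.stats import lognorm

      # Expert-elicited uncertainty on the true safe concentration (hypothetical)
      threshold = lognorm(s=0.6, scale=50.0)   # median 50, arbitrary units

      cost_too_high = 4.0   # regret per unit the limit exceeds the true threshold
      cost_too_low = 1.0    # regret per unit of unnecessary extra cleanup

      # Newsboy critical fractile: F(limit*) = c_low / (c_low + c_high)
      q = cost_too_low / (cost_too_low + cost_too_high)
      print(f"critical fractile {q:.2f} -> action level {threshold.ppf(q):.1f}")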

  2. Source terms: an investigation of uncertainties, magnitudes, and recommendations for research [PWR; BWR]

    Energy Technology Data Exchange (ETDEWEB)

    Levine, S.; Kaiser, G. D.; Arcieri, W. C.; Firstenberg, H.; Fulford, P. J.; Lam, P. S.; Ritzman, R. L.; Schmidt, E. R.

    1982-03-01

    The purpose of this document is to assess the state of knowledge and expert opinions that exist about fission product source terms from potential nuclear power plant accidents. This is so that recommendations can be made for research and analyses which have the potential to reduce the uncertainties in these estimated source terms and to derive improved methods for predicting their magnitudes. The main reasons for writing this report are to indicate the major uncertainties involved in defining realistic source terms that could arise from severe reactor accidents, to determine which factors would have the most significant impact on public risks and emergency planning, and to suggest research and analyses that could result in the reduction of these uncertainties. Source terms used in the conventional consequence calculations in the licensing process are not explicitly addressed.

  3. External technology sourcing : The effect of uncertainty on governance mode choice

    NARCIS (Netherlands)

    van de Vrande, V.; Vanhaverbeke, W.P.M.; Duijsters, G.M.

    2009-01-01

    External knowledge sourcing is increasingly important for corporate entrepreneurship. In this study, we examine the effect of external and relational uncertainty on the governance choice for inter-organizational technology sourcing. We develop a number of hypotheses about the impact of environmental

  4. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (∼2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes

  5. The Uranie platform: an Open-source software for optimisation, meta-modelling and uncertainty analysis

    OpenAIRE

    Blanchard, J-B.; Damblin, G.; Martinez, J-M.; Arnaud, G.; Gaudier, F.

    2018-01-01

    The high-performance computing resources and the constant improvement of both numerical simulation accuracy and the experimental measurements with which they are confronted bring a new compulsory step to strengthen the credence given to the simulation results: uncertainty quantification. This can have different meanings, according to the requested goals (ranking uncertainty sources, reducing them, estimating precisely a critical threshold or an optimal working point), and it could request mathematic...

  6. Global sensitivity analysis in wastewater treatment plant model applications: Prioritizing sources of uncertainty

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2011-01-01

    This study demonstrates the usefulness of global sensitivity analysis in wastewater treatment plant (WWTP) design to prioritize sources of uncertainty and quantify their impact on performance criteria. The study, which is performed with the Benchmark Simulation Model no. 1 plant design, complements a previous paper on input uncertainty characterisation and propagation (Sin et al., 2009). A sampling-based sensitivity analysis is conducted to compute standardized regression coefficients. It was found that this method is able to decompose satisfactorily the variance of plant performance criteria (with R2 ...). The results provide insight into devising useful ways for reducing uncertainties in the plant performance. This information can help engineers design robust WWTP plants.
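
    A sampling-based SRC computation of the kind described above fits a linear regression to a Monte Carlo sample and scales each coefficient by the input and output standard deviations; the squared SRCs then decompose the output variance whenever the regression R2 is close to one. The three inputs and the toy model in the Python sketch are illustrative, not the Benchmark Simulation Model.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 2000

      # Monte Carlo sample of three uncertain inputs (illustrative ranges)
      X = np.column_stack([rng.uniform(0.5, 1.5, n),     # kinetic parameter
                           rng.uniform(0.1, 0.4, n),     # influent fraction
                           rng.uniform(10.0, 20.0, n)])  # flow rate
      y = 2.0 * X[:, 0] - 8.0 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.2, n)

      # Least-squares fit, then standardize: SRC_i = b_i * std(x_i) / std(y)
      A = np.column_stack([np.ones(n), X])
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)
      src = coef[1:] * X.std(axis=0) / y.std()
      r2 = 1.0 - np.var(y - A @ coef) / np.var(y)
      print("SRCs:", np.round(src, 3))
      print("sum of squared SRCs:", round(float(np.sum(src**2)), 3), " R2:", round(r2, 3))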

  7. Uncertainty principles for inverse source problems for electromagnetic and elastic waves

    Science.gov (United States)

    Griesmaier, Roland; Sylvester, John

    2018-06-01

    In isotropic homogeneous media, far fields of time-harmonic electromagnetic waves radiated by compactly supported volume currents, and elastic waves radiated by compactly supported body force densities can be modelled in very similar fashions. Both are projected restricted Fourier transforms of vector-valued source terms. In this work we generalize two types of uncertainty principles recently developed for far fields of scalar-valued time-harmonic waves in Griesmaier and Sylvester (2017 SIAM J. Appl. Math. 77 154–80) to this vector-valued setting. These uncertainty principles yield stability criteria and algorithms for splitting far fields radiated by collections of well-separated sources into the far fields radiated by individual source components, and for the restoration of missing data segments. We discuss proper regularization strategies for these inverse problems, provide stability estimates based on the new uncertainty principles, and comment on reconstruction schemes. A numerical example illustrates our theoretical findings.

  8. Introducing nonpoint source transferable quotas in nitrogen trading: The effects of transaction costs and uncertainty.

    Science.gov (United States)

    Zhou, Xiuru; Ye, Weili; Zhang, Bing

    2016-03-01

    Transaction costs and uncertainty are considered to be significant obstacles in the emissions trading market, especially for including nonpoint sources in water quality trading. This study develops a nonlinear programming model to simulate how uncertainty and transaction costs affect the performance of point/nonpoint source (PS/NPS) water quality trading in the Lake Tai watershed, China. The results demonstrate that PS/NPS water quality trading is a highly cost-effective instrument for emissions abatement in the Lake Tai watershed, which can save 89.33% on pollution abatement costs compared to trading only between nonpoint sources. However, uncertainty can significantly reduce the cost-effectiveness by reducing trading volume. In addition, transaction costs from bargaining and decision making raise total pollution abatement costs directly and cause the offset system to deviate from the optimal state. Proper investment in the monitoring and measurement of nonpoint emissions can, however, decrease uncertainty and save on the total abatement costs. Finally, we show that the dispersed ownership of China's farmland will bring high uncertainty and transaction costs into the PS/NPS offset system, even if the pollution abatement cost is lower than for point sources. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.

    1991-03-01

    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  10. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Drellack, Sig; Prothro, Lance

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  11. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    1999-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  12. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    International Nuclear Information System (INIS)

    Meyer, Philip D.; Gee, Glendon W.

    2000-01-01

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases

  13. Governing Laws of Complex System Predictability under Co-evolving Uncertainty Sources: Theory and Nonlinear Geophysical Applications

    Science.gov (United States)

    Perdigão, R. A. P.

    2017-12-01

    Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.

  14. Determination of the reference air kerma rate for 192Ir brachytherapy sources and the related uncertainty

    International Nuclear Information System (INIS)

    Dijk, Eduard van; Kolkman-Deurloo, Inger-Karine K.; Damen, Patricia M. G.

    2004-01-01

    Different methods exist to determine the air kerma calibration factor of an ionization chamber for the spectrum of a 192Ir high-dose-rate (HDR) or pulsed-dose-rate (PDR) source. An analysis of two methods to obtain such a calibration factor was performed: (i) the method recommended by [Goetsch et al., Med. Phys. 18, 462-467 (1991)] and (ii) the method employed by the Dutch national standards institute NMi [Petersen et al., Report S-EI-94.01 (NMi, Delft, The Netherlands, 1994)]. This analysis showed a systematic difference on the order of 1% in the determination of the strength of 192Ir HDR and PDR sources depending on the method used for determining the air kerma calibration factor. The definitive significance of the difference between these methods can only be addressed after performing an accurate analysis of the associated uncertainties. For an NE 2561 (or equivalent) ionization chamber and an in-air jig, a typical uncertainty budget of 0.94% was found with the NMi method. The largest contribution to the type-B uncertainty is the uncertainty in the air kerma calibration factor for isotope i, N_k^i, as determined by the primary or secondary standards laboratories. This uncertainty is dominated by the uncertainties in the physical constants for the average mass-energy absorption coefficient ratio and the stopping power ratios. This means that it is not foreseeable that the standards laboratories can decrease the uncertainty in the air kerma calibration factors for ionization chambers in the short term. When the results of the determination of the 192Ir reference air kerma rates in, e.g., different institutes are compared, the uncertainties in the physical constants are the same. To compare the applied techniques, the ratio of the results can be judged by leaving out the uncertainties due to these physical constants. In that case an uncertainty budget of 0.40% (coverage factor = 2) should be taken into account. Due to the differences in approach between the
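
    Budgets like the 0.94% and the 0.40% (coverage factor = 2) figures quoted above come from combining relative standard uncertainties in quadrature and applying a coverage factor. The Python sketch below shows the arithmetic; the component entries are hypothetical, not the NMi budget.

      import numpy as np

      # Hypothetical relative standard uncertainties (%) for a reference air
      # kerma rate measurement with an NE 2561 chamber in an in-air jig
      budget = {
          "calibration factor N_k^i (incl. physical constants)": 0.80,
          "charge/current measurement": 0.20,
          "source-chamber distance": 0.30,
          "air density correction": 0.15,
          "scatter and non-uniformity corrections": 0.25,
      }

      u_c = np.sqrt(sum(u**2 for u in budget.values()))  # quadrature combination
      print(f"combined standard uncertainty: {u_c:.2f}%")
      print(f"expanded uncertainty (k = 2):  {2 * u_c:.2f}%")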

  15. Source Data Impacts on Epistemic Uncertainty for Launch Vehicle Fault Tree Models

    Science.gov (United States)

    Al Hassan, Mohammad; Novack, Steven; Ring, Robert

    2016-01-01

    Launch vehicle systems are designed and developed using both heritage and new hardware. Design modifications to the heritage hardware to fit new functional system requirements can impact the applicability of heritage reliability data. Risk estimates for newly designed systems must be developed from generic data sources such as commercially available reliability databases using reliability prediction methodologies, such as those addressed in MIL-HDBK-217F. Failure estimates must be converted from the generic environment to the specific operating environment of the system in which it is used. In addition, some qualification of the data source's applicability to the current system should be made. Characterizing data applicability under these circumstances is crucial to developing model estimations that support confident decisions on design changes and trade studies. This paper will demonstrate a data-source applicability classification method for suggesting epistemic component uncertainty for a target vehicle based on the source and operating environment of the originating data. The source applicability is determined using heuristic guidelines while translation of operating environments is accomplished by applying statistical methods to MIL-HDBK-217F tables. The paper will provide one example of assigning environmental-factor uncertainty when translating between operating environments for microelectronic part-type components. The heuristic guidelines will be followed by uncertainty-importance routines to assess the need for more applicable data to reduce model uncertainty.

  16. Uncertainty Quantification given Discontinuous Climate Model Response and a Limited Number of Model Runs

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.

    2010-12-01

    Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of
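
    The side-by-side polynomial chaos construction can be sketched in one dimension with Python. The toy bifurcating model, the assumed-known discontinuity location, and the Legendre degree are illustrative assumptions; the Rosenblatt transformation reduces here to a linear rescaling of each side onto [-1, 1], and the coefficients are obtained by least squares rather than Bayesian inference.

      import numpy as np
      from numpy.polynomial import legendre

      rng = np.random.default_rng(2)

      def model(x):                       # toy response with a jump at x = 0.3
          return np.where(x < 0.3, np.sin(np.pi * x), 2.0 + 0.5 * x**2)

      x = rng.uniform(-1.0, 1.0, 60)      # a limited set of "model runs"
      y = model(x)

      mean_total = 0.0
      for mask, (a, b) in [(x < 0.3, (-1.0, 0.3)), (x >= 0.3, (0.3, 1.0))]:
          xi = 2.0 * (x[mask] - a) / (b - a) - 1.0   # rescale side to [-1, 1]
          c = legendre.legfit(xi, y[mask], deg=4)    # per-side PC expansion
          # For a uniform input, the expansion mean is the P0 coefficient;
          # weight it by the probability mass of the side
          mean_total += ((b - a) / 2.0) * c[0]

      print(f"averaged-PC estimate of the output mean: {mean_total:.4f}")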

  17. Experiencing a probabilistic approach to clarify and disclose uncertainties when setting occupational exposure limits.

    Science.gov (United States)

    Vernez, David; Fraize-Frontier, Sandrine; Vincent, Raymond; Binet, Stéphane; Rousselle, Christophe

    2018-03-15

    Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variability, exposure-duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. Each AF is considered as a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each commonly used AF. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. Starting from the broad default distributions obtained, experts narrow them to their most likely ranges, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent to the OEL construction process. This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support in decision-making and improves transparency. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
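
    A minimal sketch of the Monte Carlo propagation step is given below; the point of departure and the lognormal parameters of each assessment factor are illustrative assumptions, not the calibrated defaults of the paper.

```python
# A hedged sketch of the probabilistic AF approach: each assessment factor
# is a random variable, the factors are propagated by Monte Carlo, and the
# OEL is the point of departure divided by their product. The POD and the
# lognormal parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
pod_mg_m3 = 50.0  # hypothetical point of departure (e.g., a NOAEL)

# Illustrative distributions for common AFs: (geometric mean, geometric SD)
afs = {
    "inter_species":  (2.5, 1.6),
    "intra_species":  (3.0, 1.5),
    "loael_to_noael": (3.0, 1.8),
    "duration":       (2.0, 1.5),
}
total_af = np.ones(n)
for gm, gsd in afs.values():
    total_af *= rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=n)

oel = pod_mg_m3 / total_af
lo, med, hi = np.percentile(oel, [5, 50, 95])
print(f"OEL distribution: median {med:.2f} mg/m3, 90% interval [{lo:.2f}, {hi:.2f}]")
```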

  18. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities (WGCEP). The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps, called ShakeMaps, calculated for the scenario earthquake sources defined by the WGCEP. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as: what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions

  19. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    Science.gov (United States)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies difference in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  20. On quantifying uncertainty for project selection: the case of renewable energy sources' investment

    International Nuclear Information System (INIS)

    Kirytopoulos, Konstantinos; Rentizelas, Athanassios; Tziralis, Georgios

    2006-01-01

    The selection of a project among different alternatives, considering the limited resources of a company (organisation), is an added-value process that determines the prosperity of an undertaken project (investment). This also applies to the 'booming' renewable energy sector, especially under the circumstances established by the recent activation of the Kyoto protocol and by the plethora of available choices for renewable energy sources (RES) projects. The need for a reliable project selection method among the various alternatives is, therefore, highlighted and, in this context, the paper proposes the NPV function as one of the possible criteria for the selection of a RES project. Furthermore, it differentiates from the typical NPV calculation process by adding the concept of a probabilistic NPV approach through Monte Carlo simulation. Reality is non-deterministic, so any attempt to model it using a deterministic approach is by definition erroneous. The paper ultimately proposes a process of substituting the point estimate with a range estimate, capable of quantifying the various uncertainty factors and in this way elucidating the likelihood that eligible scenarios will be accomplished. The paper is enhanced by a case study showing how the proposed method can be practically applied to support the investment decision, thus enabling the decision makers to judge its effectiveness and usefulness. (Author)
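
    The probabilistic NPV idea can be sketched as follows; the investment, yield, tariff and discount-rate figures are invented for a generic RES project and are not taken from the paper's case study.

```python
# A minimal sketch of a probabilistic NPV: replace point estimates of annual
# cash flows with distributions and simulate the NPV by Monte Carlo. All
# figures are made-up examples for a generic RES project.
import numpy as np

rng = np.random.default_rng(7)
n, years = 50_000, 20
investment = 1.2e6                                   # EUR, paid at year 0
rate = 0.07                                          # discount rate

annual_energy = rng.normal(2_000, 200, (n, years))   # MWh/yr, uncertain yield
tariff = rng.normal(80, 8, (n, years))               # EUR/MWh, uncertain price
opex = 40_000                                        # EUR/yr, fixed O&M

cash = annual_energy * tariff - opex                 # (n, years) cash flows
discount = (1 + rate) ** -np.arange(1, years + 1)
npv = cash @ discount - investment

print(f"mean NPV: {npv.mean():,.0f} EUR")
print(f"P(NPV > 0) = {(npv > 0).mean():.2%}")
```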

  1. Fundamental limits of radio interferometers: calibration and source parameter estimation

    OpenAIRE

    Trott, Cathryn M.; Wayth, Randall B.; Tingay, Steven J.

    2012-01-01

    We use information theory to derive fundamental limits on the capacity to calibrate next-generation radio interferometers, and measure parameters of point sources for instrument calibration, point source subtraction, and data deconvolution. We demonstrate the implications of these fundamental limits, with particular reference to estimation of the 21cm Epoch of Reionization power spectrum with next-generation low-frequency instruments (e.g., the Murchison Widefield Array -- MWA, Precision Arra...

  2. Estimating uncertainty in subsurface glider position using transmissions from fixed acoustic tomography sources.

    Science.gov (United States)

    Van Uffelen, Lora J; Nosal, Eva-Marie; Howe, Bruce M; Carter, Glenn S; Worcester, Peter F; Dzieciuch, Matthew A; Heaney, Kevin D; Campbell, Richard L; Cross, Patrick S

    2013-10-01

    Four acoustic Seagliders were deployed in the Philippine Sea November 2010 to April 2011 in the vicinity of an acoustic tomography array. The gliders recorded over 2000 broadband transmissions at ranges up to 700 km from moored acoustic sources as they transited between mooring sites. The precision of glider positioning at the time of acoustic reception is important to resolve the fundamental ambiguity between position and sound speed. The Seagliders utilized GPS at the surface and a kinematic model below for positioning. The gliders were typically underwater for about 6.4 h, diving to depths of 1000 m and traveling on average 3.6 km during a dive. Measured acoustic arrival peaks were unambiguously associated with predicted ray arrivals. Statistics of travel-time offsets between received arrivals and acoustic predictions were used to estimate range uncertainty. Range (travel time) uncertainty between the source and the glider position from the kinematic model is estimated to be 639 m (426 ms) rms. Least-squares solutions for glider position estimated from acoustically derived ranges from 5 sources differed by 914 m rms from modeled positions, with estimated uncertainty of 106 m rms in horizontal position. Error analysis included 70 ms rms of uncertainty due to oceanic sound-speed variability.
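
    A least-squares position estimate from acoustically derived ranges can be sketched as follows; the source geometry, noise level and initial guess are invented, with only the ~639 m rms range uncertainty borrowed from the abstract.

```python
# A sketch of estimating a horizontal glider position from ranges to fixed
# sources by Gauss-Newton least squares. The geometry and initial guess are
# invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
sources = np.array([[0.0, 0.0], [600e3, 0.0], [0.0, 500e3],
                    [600e3, 500e3], [300e3, 250e3]])       # source positions (m)
true_pos = np.array([210e3, 180e3])
ranges = np.linalg.norm(sources - true_pos, axis=1) + rng.normal(0, 639.0, len(sources))

pos = np.array([250e3, 200e3])          # initial guess (e.g., from the kinematic model)
for _ in range(10):                     # Gauss-Newton iterations
    diff = pos - sources
    pred = np.linalg.norm(diff, axis=1)
    J = diff / pred[:, None]            # Jacobian of predicted range w.r.t. position
    dx, *_ = np.linalg.lstsq(J, ranges - pred, rcond=None)
    pos = pos + dx

print(f"position error: {np.linalg.norm(pos - true_pos):.0f} m")
```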

  3. Reliability of Coulomb stress changes inferred from correlated uncertainties of finite-fault source models

    KAUST Repository

    Woessner, J.

    2012-07-14

    Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due to uncertainties in input data, model assumptions, and modeling procedures. However, fault model uncertainties have usually been ignored in stress-triggering studies and have not been propagated to assess the reliability of Coulomb failure stress change (ΔCFS) calculations. We show how these uncertainties can be used to provide confidence intervals for co-seismic ΔCFS-values. We demonstrate this for the MW = 5.9 June 2000 Kleifarvatn earthquake in southwest Iceland and systematically map these uncertainties. A set of 2500 candidate source models from the full posterior fault-parameter distribution was used to compute 2500 ΔCFS maps. We assess the reliability of the ΔCFS-values from the coefficient of variation (CV) and deem ΔCFS-values to be reliable where they are at least twice as large as the standard deviation (CV ≤ 0.5). Unreliable ΔCFS-values are found near the causative fault and between lobes of positive and negative stress change, where a small change in fault strike causes ΔCFS-values to change sign. The most reliable ΔCFS-values are found away from the source fault in the middle of positive and negative ΔCFS-lobes, a likely general pattern. Using the reliability criterion, our results support the static stress-triggering hypothesis. Nevertheless, our analysis also suggests that results from previous stress-triggering studies not considering source model uncertainties may have led to a biased interpretation of the importance of static stress-triggering.
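
    The reliability criterion can be sketched as follows on a synthetic ensemble; the ΔCFS maps below are random stand-ins for the 2500 maps computed from posterior fault models.

```python
# A minimal sketch of the reliability criterion: given an ensemble of
# Coulomb stress-change maps from candidate source models, deem a cell
# reliable where |mean| >= 2 * std, i.e. CV <= 0.5. The ensemble here is
# synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(3)
n_models, ny, nx = 2500, 50, 50

base = rng.normal(0.0, 0.1, (ny, nx))                       # "true" dCFS pattern (MPa)
ensemble = base + rng.normal(0.0, 0.05, (n_models, ny, nx)) # model-to-model scatter

mean = ensemble.mean(axis=0)
std = ensemble.std(axis=0)
cv = std / np.abs(mean)                                     # coefficient of variation
reliable = cv <= 0.5

print(f"{reliable.mean():.1%} of cells carry reliable dCFS values")
```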

  4. Shape optimization of an airfoil in a BZT flow with multiple-source uncertainties

    International Nuclear Information System (INIS)

    Congedo, P.M.; Corre, C.; Martinez, J.M.

    2011-01-01

    Bethe-Zel'dovich-Thompson (BZT) fluids are characterized by negative values of the fundamental derivative of gas dynamics for a range of temperatures and pressures in the vapor phase, which leads to non-classical gas dynamic behaviors such as the disintegration of compression shocks. These non-classical phenomena can be exploited, when using these fluids in Organic Rankine Cycles (ORCs), to increase isentropic efficiency. A predictive numerical simulation of these flows must account for two main sources of physical uncertainties: the BZT fluid properties, often difficult to measure accurately, and the usually fluctuating turbine inlet conditions. To take full advantage of the BZT properties, the turbine geometry must also be specifically designed, keeping in mind that the geometry achieved in practice after machining always differs slightly from the theoretical shape. This paper investigates some efficient procedures to perform shape optimization in a 2D BZT flow with multiple-source uncertainties (thermodynamic model, operating conditions and geometry). To demonstrate the feasibility of the proposed efficient strategies for shape optimization in the presence of multiple-source uncertainties, a zero-incidence symmetric airfoil wave-drag minimization problem is retained as a case study. This simplified configuration encompasses most of the features associated with a turbine design problem, as far as uncertainty quantification is concerned. A preliminary analysis of the contributions to the variance of the wave-drag makes it possible to select the most significant sources of uncertainties using a reduced number of flow computations. The resulting mean value and variance of the objective are next turned into metamodels. The optimal Pareto sets corresponding to the minimization of various substitute functions are obtained using a genetic algorithm as optimizer and their differences are discussed. (authors)

  5. Sequential planning of flood protection infrastructure under limited historic flood record and climate change uncertainty

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Straub, Daniel

    2017-04-01

    Flood protection is often designed to safeguard people and property following regulations and standards, which specify a target design flood protection level, such as the 100-year flood level prescribed in Germany (DWA, 2011). In practice, the magnitude of such an event is only known within a range of uncertainty, which is caused by limited historic records and uncertain climate change impacts, among other factors (Hall & Solomatine, 2008). As more observations and improved climate projections become available in the future, the design flood estimate changes and the capacity of the flood protection may be deemed insufficient at a future point in time. This problem can be mitigated by the implementation of flexible flood protection systems (that can easily be adjusted in the future) and/or by adding an additional reserve to the flood protection, i.e. by applying a safety factor to the design. But how high should such a safety factor be? And how much should the decision maker be willing to pay to make the system flexible, i.e. what is the Value of Flexibility (Špačková & Straub, 2017)? We propose a decision model that identifies cost-optimal decisions on flood protection capacity in the face of uncertainty (Dittes et al. 2017). It considers sequential adjustments of the protection system during its lifetime, taking into account its flexibility. The proposed framework is based on pre-posterior Bayesian decision analysis, using Decision Trees and Markov Decision Processes, and is fully quantitative. It can include a wide range of uncertainty components such as uncertainty associated with limited historic record or uncertain climate or socio-economic change. It is shown that since flexible systems are less costly to adjust when flood estimates are changing, they justify initially lower safety factors. Investigation on the Value of Flexibility (VoF) demonstrates that VoF depends on the type and degree of uncertainty, on the learning effect (i.e. kind and quality of

  6. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    Science.gov (United States)

    Deffner, Sebastian; Campbell, Steve

    2017-11-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam-Tamm and the Margolus-Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader.

  7. Quantum speed limits: from Heisenberg’s uncertainty principle to optimal quantum control

    International Nuclear Information System (INIS)

    Deffner, Sebastian; Campbell, Steve

    2017-01-01

    One of the most widely known building blocks of modern physics is Heisenberg’s indeterminacy principle. Among the different statements of this fundamental property of the full quantum mechanical nature of physical reality, the uncertainty relation for energy and time has a special place. Its interpretation and its consequences have inspired continued research efforts for almost a century. In its modern formulation, the uncertainty relation is understood as setting a fundamental bound on how fast any quantum system can evolve. In this topical review we describe important milestones, such as the Mandelstam–Tamm and the Margolus–Levitin bounds on the quantum speed limit, and summarise recent applications in a variety of current research fields—including quantum information theory, quantum computing, and quantum thermodynamics amongst several others. To bring order and to provide an access point into the many different notions and concepts, we have grouped the various approaches into the minimal time approach and the geometric approach, where the former relies on quantum control theory, and the latter arises from measuring the distinguishability of quantum states. Due to the volume of the literature, this topical review can only present a snapshot of the current state-of-the-art and can never be fully comprehensive. Therefore, we highlight but a few works hoping that our selection can serve as a representative starting point for the interested reader. (topical review)

  8. Evaluation of the sources of uncertainty associated with the measurement results of in vivo monitoring of iodine-131 in the thyroid

    International Nuclear Information System (INIS)

    Gontijo, Rodrigo Modesto Gadelha

    2011-01-01

    In vivo monitoring techniques consist of the identification and quantification of radionuclides present in the whole body and in specific organs and tissues. In vivo monitoring requires the use of detectors which are sensitive to the radiation emitted by the radionuclides present in the monitored individual. The results obtained in measurements may present small uncertainties which are within pre-set limits in monitoring programs for occupationally exposed individuals. However, any device used to determine physical quantities presents uncertainties in the measured values. The total uncertainty of a measurement result is estimated from the propagation of the uncertainties associated with each parameter of the calculation. This study aims to evaluate the sources of uncertainty associated with the measurement results of in vivo monitoring of iodine-131 in the thyroid, in comparison to those suggested in the General Guide for Estimating Effective Doses from Monitoring Data (Project IDEAS/European Community). The reference values used were the ones for high-energy photons (>100 keV). The measurement uncertainties were divided into two categories: type A and type B. The type A component represents the statistical fluctuation in the counting of the standard source. Regarding type B, the following variations were considered: detector positioning over the phantom; variation of background radiation; thickness of the overlying tissue over the monitored organ; and distribution of the activity in the organ. Besides the parameters suggested by the IDEAS Guide, the fluctuation of the counting due to phantom repositioning, which represents the reproducibility of the measurement geometry, has also been evaluated. Measurements were performed at the Whole Body Counter Unit of IRD using a 3"x3" NaI(Tl) scintillation detector and a neck-thyroid phantom developed at LABMIVIRD. Scattering factors were calculated and compared in different counting geometries. The results of this study show that the
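
    Combining such type A and type B components into a total uncertainty is typically done by summation in quadrature (GUM-style); the sketch below uses illustrative percentages, not the values obtained in the thesis.

```python
# A hedged sketch of combining uncertainty components into a total
# measurement uncertainty by quadrature, assuming independent components
# expressed as relative standard uncertainties. The percentages are
# illustrative placeholders.
import numpy as np

components = {                      # relative standard uncertainties (%)
    "counting_statistics_typeA": 3.0,
    "detector_positioning":      5.0,
    "background_variation":      2.0,
    "overlying_tissue":          6.0,
    "activity_distribution":     4.0,
    "phantom_repositioning":     2.5,
}
u_combined = np.sqrt(sum(u**2 for u in components.values()))
U_expanded = 2.0 * u_combined       # coverage factor k = 2 (~95%)
print(f"combined standard uncertainty: {u_combined:.1f}%")
print(f"expanded uncertainty (k=2):    {U_expanded:.1f}%")
```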

  9. Application of a new methodology to evaluate DNB limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best-estimate values instead. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)
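
    The contrast between the deterministic limit and a statistically propagated one can be sketched as follows; the input distributions and the stand-in DNBR function are illustrative and do not represent the W-3 correlation or the mini-RTDP implementation.

```python
# A hedged sketch of the statistical-propagation idea: instead of stacking
# worst-case parameter values (STDP), sample the inputs from best-estimate
# distributions, propagate them through a DNBR calculation, and read a
# one-sided limit off the resulting distribution. dnbr() is an illustrative
# monotone stand-in, NOT a critical heat flux correlation.
import numpy as np

rng = np.random.default_rng(17)
n = 10_000

power = rng.normal(1.00, 0.010, n)     # normalized core power
flow = rng.normal(1.00, 0.020, n)      # normalized coolant flow
t_inlet = rng.normal(1.00, 0.005, n)   # normalized inlet temperature

def dnbr(power, flow, t_inlet):
    return 1.9 * flow**0.8 / (power * t_inlet**2)  # toy stand-in model

mdnbr = dnbr(power, flow, t_inlet)
limit = np.percentile(mdnbr, 5)        # rough 95th-percentile-type limit
print(f"statistically derived MDNBR limit: {limit:.3f} (deterministic limit: 1.30)")
```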

  10. An Open Source Computational Framework for Uncertainty Quantification of Plasma Chemistry Models

    OpenAIRE

    Zaheri Sarabi, Shadi

    2017-01-01

    The current thesis deals with the development of a computational framework for performing plasma chemistry simulations and their uncertainty quantification analysis by suitably combining and extending existing open source computational tools. A plasma chemistry solver is implemented in the OpenFOAM C++ solver suite. The OpenFOAM plasma chemistry application solves the species conservation equations and the electron energy equation by accounting suitably for various production and loss terms b...

  11. Hiding the Source Based on Limited Flooding for Sensor Networks.

    Science.gov (United States)

    Chen, Juan; Lin, Zhengkui; Hu, Ying; Wang, Bailing

    2015-11-17

    Wireless sensor networks are widely used to monitor valuable objects such as rare animals or armies. Once an object is detected, the source, i.e., the sensor nearest to the object, generates and periodically sends a packet about the object to the base station. Since attackers can capture the object by localizing the source, many protocols have been proposed to protect source location. Instead of transmitting the packet to the base station directly, typical source location protection protocols first transmit packets randomly for a few hops to a phantom location, and then forward the packets to the base station. The problem with these protocols is that the generated phantom locations are usually not only near the true source but also close to each other. As a result, attackers can easily trace a route back to the source from the phantom locations. To address the above problem, we propose a new protocol for source location protection based on limited flooding, named SLP. Compared with existing protocols, SLP can generate phantom locations that are not only far away from the source, but also widely distributed. It improves source location security significantly with low communication cost. We further propose a protocol, namely SLP-E, to protect source location against more powerful attackers with wider fields of vision. The performance of our SLP and SLP-E are validated by both theoretical analysis and simulation results.

  12. Hiding the Source Based on Limited Flooding for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Juan Chen

    2015-11-01

    Wireless sensor networks are widely used to monitor valuable objects such as rare animals or armies. Once an object is detected, the source, i.e., the sensor nearest to the object, generates and periodically sends a packet about the object to the base station. Since attackers can capture the object by localizing the source, many protocols have been proposed to protect source location. Instead of transmitting the packet to the base station directly, typical source location protection protocols first transmit packets randomly for a few hops to a phantom location, and then forward the packets to the base station. The problem with these protocols is that the generated phantom locations are usually not only near the true source but also close to each other. As a result, attackers can easily trace a route back to the source from the phantom locations. To address the above problem, we propose a new protocol for source location protection based on limited flooding, named SLP. Compared with existing protocols, SLP can generate phantom locations that are not only far away from the source, but also widely distributed. It improves source location security significantly with low communication cost. We further propose a protocol, namely SLP-E, to protect source location against more powerful attackers with wider fields of vision. The performance of our SLP and SLP-E are validated by both theoretical analysis and simulation results.

  13. Energy saving in WWTP: Daily benchmarking under uncertainty and data availability limitations.

    Science.gov (United States)

    Torregrossa, D; Schutz, G; Cornelissen, A; Hernández-Sancho, F; Hansen, J

    2016-07-01

    Efficient management of Waste Water Treatment Plants (WWTPs) can produce significant environmental and economic benefits. Energy benchmarking can be used to compare WWTPs, identify targets and use these to improve their performance. Different authors have performed benchmark analyses on a monthly or yearly basis, but their approaches suffer from a time lag between an event, its detection, interpretation and potential actions. The availability of on-line measurement data on many WWTPs should theoretically enable a decrease of the management response time by daily benchmarking. Unfortunately this approach is often impossible because of limited data availability. This paper proposes a methodology to perform a daily benchmark analysis under database limitations. The methodology has been applied to the Energy Online System (EOS) developed in the framework of the project "INNERS" (INNovative Energy Recovery Strategies in the urban water cycle). EOS calculates a set of Key Performance Indicators (KPIs) for the evaluation of energy and process performances. In EOS, the energy KPIs take the pollutant load into consideration in order to enable the comparison between different plants. For example, EOS does not analyse the energy consumption as such but the energy consumption per pollutant load. This approach enables the comparison of performances for plants with different loads or for a single plant under different load conditions. The energy consumption is measured by on-line sensors, while the pollutant load is measured in the laboratory approximately every 14 days. Consequently, the unavailability of the water quality parameters is the limiting factor in calculating energy KPIs. In this paper, in order to overcome this limitation, the authors have developed a methodology to estimate the required parameters and manage the uncertainty in the estimation. By coupling the parameter estimation with an interval-based benchmark approach, the authors propose an effective, fast and reproducible
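
    The gap-filling idea can be sketched as follows: daily pollutant loads are estimated between sparse laboratory measurements and carried as intervals into the energy KPI; all data and the 15% estimation uncertainty are invented.

```python
# A sketch of estimating a sparsely measured quantity (pollutant load) for
# daily KPI calculation, here by linear interpolation, and propagating an
# assumed estimation uncertainty into the KPI as an interval. Data are
# invented for illustration.
import numpy as np

days = np.arange(28)
lab_days = np.array([0, 14, 27])                 # sparse laboratory sampling days
lab_load = np.array([1200.0, 1350.0, 1280.0])    # measured load (kg COD/day)

load_est = np.interp(days, lab_days, lab_load)   # estimated daily load
rel_err = 0.15                                   # assumed estimation uncertainty
load_lo, load_hi = load_est * (1 - rel_err), load_est * (1 + rel_err)

rng = np.random.default_rng(5)
energy = 900.0 + rng.normal(0, 30, days.size)    # daily energy use from on-line sensors (kWh)

kpi = energy / load_est                          # kWh per kg of pollutant load
kpi_lo, kpi_hi = energy / load_hi, energy / load_lo
print(f"day 7 KPI: {kpi[7]:.3f} kWh/kg, interval [{kpi_lo[7]:.3f}, {kpi_hi[7]:.3f}]")
```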

  14. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by a large computer code, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for overall uncertainty analysis in source term quantification, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by the RSM. Verification of the response surface model for sufficient accuracy is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distribution is known analytically, while in the third it is unknown. The first case is given by symmetric analytical distributions; the second consists of two asymmetric distributions whose skewness is nonzero
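
    The LHS-plus-SRC screening step can be sketched as follows; the three-input toy response stands in for a MAAP/MELCOR source-term calculation.

```python
# A minimal sketch of the screening step: draw a Latin hypercube sample of
# the inputs, run a (toy) model, and rank the inputs by standardized
# regression coefficients (SRC). The toy response is invented.
import numpy as np

rng = np.random.default_rng(11)
n, k = 200, 3

# Latin hypercube sample on [0, 1]^k: one draw per stratum, independently
# permuted per dimension
strata = np.column_stack([rng.permutation(n) for _ in range(k)])
u = (strata + rng.uniform(size=(n, k))) / n

y = 3.0 * u[:, 0] + 0.5 * u[:, 1] + 0.1 * u[:, 2] + rng.normal(0, 0.1, n)  # toy response

# SRCs via least squares on standardized inputs and output
X = (u - u.mean(axis=0)) / u.std(axis=0)
Y = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(X, Y, rcond=None)
print("SRCs:", np.round(src, 3))  # input 0 should dominate
```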

  15. Investing in Uncertainty: Young Adults with Life-Limiting Conditions Achieving Their Developmental Goals.

    Science.gov (United States)

    Cook, Karen A; Jack, Susan M; Siden, Hal; Thabane, Lehana; Browne, Gina

    2016-08-01

    With improvements in pediatric care and technology, more young adults (YAs) with life-limiting conditions (LLCs) are surviving into adulthood. However, they have limited expectations to live beyond the first decade of adulthood. This study describes the monumental efforts required for YAs with LLCs to achieve their goals in an abbreviated life. The experiences and aspirations of YAs with LLCs to achieve their goals are relatively unknown. This report focuses on their experiences of living with uncertainty and its impact on achieving developmental goals. This study is one component of a larger descriptive study using an innovative bulletin board focus group to examine life experiences of YAs with LLCs. YAs with LLCs share the aspirations and goals of all YAs. Some participants demonstrated a striking capacity to navigate system barriers and achieve their goals, whereas others "got stuck" resulting in lost opportunities. Successful personal life investments were possible if resources were made available, coordinated, navigable, and responsive to new and special requests. Transformative changes to health, social care, and community services are necessary to support their YA ambitions. This study gave voice to those who were previously unheard and demonstrates the monumental hurdles YAs with LLCs face to achieve their goals. A palliative approach to care can mitigate unnecessary hardships and support their goals.

  16. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Science.gov (United States)

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
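
    The core quantity, mass discharge integrated over the control plane per realization, can be sketched as follows; the realizations here are generated from simple unconditional distributions rather than by the conditional geostatistical simulation of the paper.

```python
# A hedged sketch of mass discharge across a control plane: per realization,
# sum cell-wise (Darcy flux x concentration x cell area), then summarize
# over realizations. The realizations below are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(21)
n_real, ny, nz = 1000, 20, 10
cell_area = 0.5 * 0.5                       # m^2 per control-plane cell

lnK = rng.normal(np.log(1e-4), 0.5, (n_real, ny, nz))     # hydraulic conductivity (m/s)
gradient = 0.005                                          # hydraulic gradient (-)
q = np.exp(lnK) * gradient                                # Darcy flux (m/s)
conc = rng.lognormal(np.log(0.05), 1.0, (n_real, ny, nz)) # concentration (kg/m^3)

md = (q * conc * cell_area).sum(axis=(1, 2)) * 86400.0    # kg/day per realization
lo, med, hi = np.percentile(md, [5, 50, 95])
print(f"mass discharge: median {med:.3g} kg/d, 90% interval [{lo:.3g}, {hi:.3g}]")
```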

  17. New Insights on the Uncertainties in Finite-Fault Earthquake Source Inversion

    KAUST Repository

    Razafindrakoto, Hoby

    2015-04-01

    Earthquake source inversion is a non-linear problem that leads to non-unique solutions. The aim of this dissertation is to understand the uncertainty and reliability in earthquake source inversion, as well as to quantify variability in earthquake rupture models. The source inversion is performed using Bayesian inference. This technique augments optimization approaches through its ability to image the entire solution space that is consistent with the data and prior information. In this study, the uncertainty related to the choice of source-time function and crustal structure is investigated. Three predefined analytical source-time functions are analyzed: an isosceles triangle, and the Yoffe function with acceleration times of 0.1 and 0.3 s. The use of the isosceles triangle as source-time function is found to bias the finite-fault source inversion results. It accelerates the rupture to propagate faster compared to that of the Yoffe function. Moreover, it generates an artificial linear correlation between parameters that does not exist for the Yoffe source-time functions. The effect of inadequate knowledge of the Earth's crustal structure on earthquake rupture models is subsequently investigated. The results show that one-dimensional structure variability leads to changes in parameter resolution, with a broadening of the posterior PDFs and shifts in the peak location. These changes in the PDFs of kinematic parameters are associated with the blurring effect of using an incorrect Earth structure. As an application to a real earthquake, finite-fault source models for the 2009 L'Aquila earthquake are examined using one- and three-dimensional crustal structures. The one-dimensional structure is found to degrade the data fitting. However, there is no significant effect on the rupture parameters aside from differences in the spatial slip extension. Stable features are maintained for both

  18. Fundamental limits on beam stability at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Decker, G. A.

    1998-01-01

    Orbit correction is now routinely performed at the few-micron level in the Advanced Photon Source (APS) storage ring. Three diagnostics are presently in use to measure and control both AC and DC orbit motions: broad-band turn-by-turn rf beam position monitors (BPMs), narrow-band switched heterodyne receivers, and photoemission-style x-ray beam position monitors. Each type of diagnostic has its own set of systematic error effects that place limits on the ultimate pointing stability of x-ray beams supplied to users at the APS. Limiting sources of beam motion at present are magnet power supply noise, girder vibration, and thermal timescale vacuum chamber and girder motion. This paper will investigate the present limitations on orbit correction, and will delve into the upgrades necessary to achieve true sub-micron beam stability

  19. Kinematic source inversions of teleseismic data based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, O.; McDougall, D.; Mai, P. M.; Babuska, I.

    2014-12-01

    One fundamental aspect of seismic hazard mitigation is gaining a better understanding of the rupture process. Because direct observation of the relevant parameters and properties is not possible, other means such as kinematic source inversions are used instead. By constraining the spatial and temporal evolution of fault slip during an earthquake, those inversion approaches may enable valuable insights into the physics of the rupture process. However, due to the underdetermined nature of this inversion problem (i.e., inverting a kinematic source model for an extended fault based on seismic data), the provided solutions are generally non-unique. Here we present a statistical (Bayesian) inversion approach based on an open-source library for uncertainty quantification (UQ) called QUESO that was developed at ICES (UT Austin). The approach has advantages with respect to deterministic inversion approaches as it provides not only a single (non-unique) solution but also uncertainty bounds with it. Those uncertainty bounds help to qualitatively and quantitatively judge how well constrained an inversion solution is and how much rupture complexity the data reliably resolve. The presented inversion scheme uses only teleseismically recorded body waves, but future developments may lead us towards joint inversion schemes. After giving an insight into the inversion scheme itself (based on delayed rejection adaptive Metropolis, DRAM), we explore the method's resolution potential. For that, we synthetically generate teleseismic data, add, for example, different levels of noise and/or change the fault plane parameterization, and then apply our inversion scheme in an attempt to extract the (known) kinematic rupture model. We conclude by inverting, as an example, real teleseismic data of a recent large earthquake, and compare those results with deterministically derived kinematic source models provided by other research groups.
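
    The Bayesian sampling idea can be sketched with a plain random-walk Metropolis sampler (QUESO itself implements the more elaborate DRAM algorithm); the one-parameter forward model and data are toy stand-ins for a kinematic rupture model and teleseismic waveforms.

```python
# A toy random-walk Metropolis sampler illustrating Bayesian inversion with
# uncertainty bounds. This is plain Metropolis, not QUESO's DRAM; the
# forward model and data are invented.
import numpy as np

rng = np.random.default_rng(2)
forward = lambda slip: slip * np.array([0.5, 1.0, 1.5])   # toy synthetics
data = forward(2.0) + rng.normal(0, 0.1, 3)               # noisy observations
sigma = 0.1

def log_post(slip):
    if not 0.0 < slip < 10.0:                             # uniform prior bounds
        return -np.inf
    r = data - forward(slip)
    return -0.5 * np.sum(r**2) / sigma**2

chain, cur = [], 5.0
lp = log_post(cur)
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.2)                       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:              # accept/reject step
        cur, lp = prop, lp_prop
    chain.append(cur)

burned = np.array(chain[5000:])
print(f"posterior slip: {burned.mean():.3f} +/- {burned.std():.3f} (true 2.0)")
```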

  20. Quantifying the sources of uncertainty in an ensemble of hydrological climate-impact projections

    Science.gov (United States)

    Aryal, Anil; Shrestha, Sangam; Babel, Mukand S.

    2018-01-01

    The objective of this paper is to quantify the various sources of uncertainty in the assessment of climate change impact on hydrology in the Tamakoshi River Basin, located in the north-eastern part of Nepal. Multiple climate and hydrological models were used to simulate future climate conditions and discharge in the basin. The simulated results of future climate and river discharge were analysed to quantify the sources of uncertainty using two-way and three-way ANOVA. The results showed that temperature and precipitation in the study area are projected to change in the near- (2010-2039), mid- (2040-2069) and far-future (2070-2099) periods. Maximum temperature is likely to rise by 1.75 °C under Representative Concentration Pathway (RCP) 4.5 and by 3.52 °C under RCP 8.5. Similarly, the minimum temperature is expected to rise by 2.10 °C under RCP 4.5 and by 3.73 °C under RCP 8.5 by the end of the twenty-first century. Precipitation in the study area is expected to change by -2.15% under RCP 4.5 and by -2.44% under RCP 8.5. The future discharge in the study area was projected using two hydrological models, viz. the Soil and Water Assessment Tool (SWAT) and the Hydrologic Engineering Center's Hydrologic Modelling System (HEC-HMS). The discharge projected by the SWAT model is expected to change by a small amount, whereas the HEC-HMS model projected considerably lower discharge in the future compared to the baseline period. The results also show that future climate variables and river hydrology contain uncertainty due to the choice of climate models, RCP scenarios, bias correction methods and hydrological models. During wet days, more uncertainty is observed due to the use of different climate models, whereas during dry days, the use of different hydrological models has a greater effect on uncertainty. Inter-comparison of the impacts of different climate models reveals that the REMO climate model shows higher uncertainty in the prediction of precipitation and
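
    A two-way ANOVA decomposition of an ensemble into climate-model and hydrological-model contributions can be sketched as follows; the ensemble values are synthetic.

```python
# A minimal sketch of a two-way ANOVA decomposition of an ensemble of
# projections into contributions from the climate model (GCM) and the
# hydrological model, plus an interaction/residual term. Values are synthetic.
import numpy as np

rng = np.random.default_rng(13)
n_gcm, n_hyd = 5, 2
y = rng.normal(100, 5, (n_gcm, n_hyd))        # e.g., projected change in discharge (%)

grand = y.mean()
gcm_eff = y.mean(axis=1) - grand              # main effect of climate model
hyd_eff = y.mean(axis=0) - grand              # main effect of hydrological model
resid = y - grand - gcm_eff[:, None] - hyd_eff[None, :]

ss_gcm = n_hyd * np.sum(gcm_eff**2)
ss_hyd = n_gcm * np.sum(hyd_eff**2)
ss_res = np.sum(resid**2)
ss_tot = ss_gcm + ss_hyd + ss_res
for name, ss in [("GCM", ss_gcm), ("hydrological model", ss_hyd), ("interaction", ss_res)]:
    print(f"{name}: {ss / ss_tot:.1%} of total variance")
```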

  1. Multi-photon absorption limits to heralded single photon sources

    Science.gov (United States)

    Husko, Chad A.; Clark, Alex S.; Collins, Matthew J.; De Rossi, Alfredo; Combrié, Sylvain; Lehoucq, Gaëlle; Rey, Isabella H.; Krauss, Thomas F.; Xiong, Chunle; Eggleton, Benjamin J.

    2013-01-01

    Single photons are of paramount importance to future quantum technologies, including quantum communication and computation. Nonlinear photonic devices using parametric processes offer a straightforward route to generating photons, however additional nonlinear processes may come into play and interfere with these sources. Here we analyse spontaneous four-wave mixing (SFWM) sources in the presence of multi-photon processes. We conduct experiments in silicon and gallium indium phosphide photonic crystal waveguides which display inherently different nonlinear absorption processes, namely two-photon (TPA) and three-photon absorption (ThPA), respectively. We develop a novel model capturing these diverse effects which is in excellent quantitative agreement with measurements of brightness, coincidence-to-accidental ratio (CAR) and second-order correlation function g(2)(0), showing that TPA imposes an intrinsic limit on heralded single photon sources. We build on these observations to devise a new metric, the quantum utility (QMU), enabling further optimisation of single photon sources. PMID:24186400

  2. Combining historical eyewitness accounts on tsunami-induced waves and numerical simulations for getting insights in uncertainty of source parameters

    Science.gov (United States)

    Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; benki, Aalae

    2017-04-01

    Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and damages to structures. Advances in numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning systems for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be performed by taking advantage of observations either of tsunami waves (using networks of water level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like 1755 Lisbon. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea bottom pressure gauges, GPS-mounted buoys), the number of tide gauges can be very small and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., the maximum wave height reached at the coast) instead of the full observed waveforms. 2) Tsunami phenomena involve a large span of spatial scales (from ocean basin scales to local coastal wave interactions), which can make the modelling very demanding: the computational cost of a tsunami simulation can be prohibitive, often reaching several hours. This often limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both of the afore-described difficulties with a view to combining historical observations on past tsunami-induced waves

  3. Investigation of source position uncertainties & balloon deformation in MammoSite brachytherapy on treatment effectiveness

    International Nuclear Information System (INIS)

    Bensaleh, S.

    2010-01-01

    MammoSite® high-dose-rate breast brachytherapy is used in the treatment of early-stage breast cancer. The tumour bed volume is irradiated with a high dose per fraction in a relatively small number of fractions. Uncertainties in the source positioning and deformation of the MammoSite balloon will alter the prescribed dose within the treated volume. They may also expose the normal tissues in the balloon's proximity to excessive dose. The purpose of this work is to explore the impact of these two uncertainties on the MammoSite dose distribution in the breast using dose volume histograms and Monte Carlo simulations. The Lyman–Kutcher and relative seriality models were employed to estimate the normal tissue complications associated with the MammoSite dose distributions. The tumour control probability was calculated using the Poisson model. This study gives low probabilities for developing heart and lung complications. The probability of complications of the skin and normal breast tissues depends on the location of the source inside the balloon and the volume receiving a high dose. Incorrect source position and balloon deformation had a significant effect on the prescribed dose within the treated volume. A 4 mm balloon deformation resulted in a reduction of the tumour control probability by 24%. Monte Carlo calculations using EGSnrc showed that a deviation of the source by 1 mm caused approximately 7% dose reduction in the treated target volume at 1 cm from the balloon surface. In conclusion, accurate positioning of the 192Ir source at the balloon centre and minimal balloon deformation are critical for proper dose delivery with the MammoSite brachytherapy applicator. On the basis of this study, we suggest that MammoSite treatment protocols should allow for a balloon deformation of ≤2 mm and a maximum source deviation of ≤1 mm.

  4. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    Science.gov (United States)

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals and (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
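
    For reference, the common construction that ignores benchmark uncertainty, normal-approximation control limits around a fixed benchmark proportion, can be sketched as follows with illustrative numbers (the paper's point is that both prediction and tolerance intervals fall short once that uncertainty is acknowledged).

```python
# A sketch of 95% funnel-plot control limits for a binomial outcome using a
# normal approximation around a FIXED benchmark proportion, i.e. the common
# construction that ignores benchmark uncertainty. All values illustrative.
import numpy as np

p0 = 0.08                                   # benchmark proportion (e.g., mortality)
n = np.arange(20, 1001)                     # provider volumes
z = 1.959964                                # two-sided ~95%

se = np.sqrt(p0 * (1 - p0) / n)
upper = p0 + z * se
lower = np.maximum(p0 - z * se, 0.0)

# Flag a hypothetical provider with 19 deaths in 150 cases
obs_n, obs_p = 150, 19 / 150
i = obs_n - 20                              # index into the volume grid
inside = lower[i] <= obs_p <= upper[i]
print(f"limits at n={obs_n}: [{lower[i]:.3f}, {upper[i]:.3f}]; "
      f"observed {obs_p:.3f} -> {'inside' if inside else 'outside'}")
```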

  5. A risk-based evaluation of the impact of key uncertainties on the prediction of severe accident source terms - STU

    International Nuclear Information System (INIS)

    Ang, M.L.; Grindon, E.; Dutton, L.M.C.; Garcia-Sedano, P.; Santamaria, C.S.; Centner, B.; Auglaire, M.; Routamo, T.; Outa, S.; Jokiniemi, J.; Gustavsson, V.; Wennerstrom, H.; Spanier, L.; Gren, M.; Boschiero, M-H; Droulas, J-L; Friederichs, H-G; Sonnenkalb, M.

    2001-01-01

    The purpose of this project is to address the key uncertainties associated with a number of fission product release and transport phenomena in a wider context and to assess their relevance to key severe accident sequences. This project is a broad-based analysis involving eight reactor designs that are representative of the reactors currently operating in the European Union (EU). In total, 20 accident sequences covering a wide range of conditions have been chosen to provide the basis for sensitivity studies. The appraisal is achieved through a systematic risk-based framework developed within this project. Specifically, this is a quantitative interpretation of the sensitivity calculations on the basis of 'significance indicators', applied above defined threshold values. These threshold values represent a good surrogate for 'large release', which is defined in a number of EU countries. In addition, the results are placed in the context of in-containment source term limits, for advanced light water reactor designs, as defined by international guidelines. Overall, despite the phenomenological uncertainties, the predicted source terms (both into the containment, and subsequently, into the environment) do not display a high degree of sensitivity to the individual fission product issues addressed in this project. This is due, mainly, to the substantial capacity for the attenuation of airborne fission products by the designed safety provisions and the natural fission product retention mechanisms within the containment

  6. Uncertainties in the 2004 Sumatra–Andaman source through nonlinear stochastic inversion of tsunami waves

    Science.gov (United States)

    Venugopal, M.; Roy, D.; Rajendran, K.; Guillas, S.; Dias, F.

    2017-01-01

    Numerical inversions for earthquake source parameters from tsunami wave data usually incorporate subjective elements to stabilize the search. In addition, noisy and possibly insufficient data result in instability and non-uniqueness in most deterministic inversions, which are barely acknowledged. Here, we employ the satellite altimetry data for the 2004 Sumatra–Andaman tsunami event to invert the source parameters. We also include kinematic parameters that improve the description of tsunami generation and propagation, especially near the source. Using a finite fault model that represents the extent of rupture and the geometry of the trench, we perform a new type of nonlinear joint inversion of the slips, rupture velocities and rise times with minimal a priori constraints. Despite persistently good waveform fits, large uncertainties in the joint parameter distribution constitute a remarkable feature of the inversion. These uncertainties suggest that objective inversion strategies should incorporate more sophisticated physical models of seabed deformation in order to significantly improve the performance of early warning systems. PMID:28989311

  7. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement, as well as to quantify the effect the errors have on the 'age' determined. This paper reports an uncertainty analysis to identify areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results will be compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
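
    The chronometry itself is straightforward: for an initially pure 137Cs source the daughter/parent atom ratio R grows as exp(λt) − 1, so t = ln(1 + R)/λ. A sketch with first-order uncertainty propagation follows, using an invented measured ratio.

```python
# A sketch of 137Cs/137Ba chronometry: the time since purification is
# t = ln(1 + R) / lambda, with R = N(137Ba)/N(137Cs). First-order
# propagation of the ratio uncertainty is included; the measured ratio and
# its uncertainty are illustrative.
import numpy as np

half_life_yr = 30.08                 # 137Cs half-life (yr)
lam = np.log(2.0) / half_life_yr

R, u_R = 0.45, 0.02                  # measured atom ratio and its standard uncertainty
t = np.log1p(R) / lam                # age since purification (yr)
u_t = u_R / ((1.0 + R) * lam)        # |dt/dR| * u_R

print(f"age since purification: {t:.1f} +/- {u_t:.1f} yr")
```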

  8. Volcano deformation source parameters estimated from InSAR: Sensitivities to uncertainties in seismic tomography

    Science.gov (United States)

    Masterlark, Timothy; Donovan, Theodore; Feigl, Kurt L.; Haney, Matt; Thurber, Clifford H.; Tung, Sui

    2016-01-01

    The eruption cycle of a volcano is controlled in part by the upward migration of magma. The characteristics of the magma flux produce a deformation signature at the Earth's surface. Inverse analyses use geodetic data to estimate strategic controlling parameters that describe the position and pressurization of a magma chamber at depth. The specific distribution of material properties controls how observed surface deformation translates to source parameter estimates. Seismic tomography models describe the spatial distributions of material properties that are necessary for accurate models of volcano deformation. This study investigates how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. We conduct finite element model-based nonlinear inverse analyses of interferometric synthetic aperture radar (InSAR) data for Okmok volcano, Alaska, as an example. We then analyze the estimated parameters and their uncertainties to characterize the magma chamber. Analyses are performed separately for models simulating a pressurized chamber embedded in a homogeneous domain as well as for a domain having a heterogeneous distribution of material properties according to seismic tomography. The estimated depth of the source is sensitive to the distribution of material properties. The estimated depths for the homogeneous and heterogeneous domains are 2666 ± 42 and 3527 ± 56 m below mean sea level, respectively (99% confidence). A Monte Carlo analysis indicates that uncertainties of the seismic tomography cannot account for this discrepancy at the 99% confidence level. Accounting for the spatial distribution of elastic properties according to seismic tomography significantly improves the fit of the deformation model predictions and significantly influences estimates for parameters that describe the location of a pressurized magma chamber.

  9. Uncertainty of soil erosion modelling using open source high resolution and aggregated DEMs

    Directory of Open Access Journals (Sweden)

    Arun Mondal

    2017-05-01

    Full Text Available Digital Elevation Models (DEMs) are one of the important inputs for soil erosion assessment, and notable uncertainties are observed in this study when three high-resolution open-source DEMs are used. The Revised Universal Soil Loss Equation (RUSLE) model has been applied to analyse the uncertainty of soil erosion assessment using open-source DEMs (SRTM, ASTER and CARTOSAT) and versions of them aggregated to coarser grid spacing (pixel size). The study area is part of the Narmada river basin in Madhya Pradesh state, in the central part of India, and covers 20,558 km2. The native resolution of the DEMs is 30 m, and grid spacings of 90, 150, 210, 270 and 330 m are used in this study. The vertical accuracy of the DEMs has been assessed using the actual heights of sample points taken from a planimetric-survey-based map (toposheet). Elevations of the DEMs are converted to the same vertical datum, from WGS 84 to MSL (Mean Sea Level), before the accuracy assessment and modelling. Results indicate that the accuracy of the SRTM DEM, with RMSEs of 13.31, 14.51, and 18.19 m at 30, 150 and 330 m resolution respectively, is better than that of the ASTER and CARTOSAT DEMs. As the grid spacing of the DEMs increases, the accuracy of the elevation and of the calculated soil erosion decreases. This study presents the potential uncertainty introduced by open-source high-resolution DEMs into the accuracy of soil erosion assessment models, and provides an analysis of the errors involved in selecting DEMs at original and increased grid spacings for soil erosion modelling.
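
    The vertical-accuracy check described above reduces to a root-mean-square error between DEM elevations and reference survey heights. A minimal sketch with made-up sample points, assuming both sets are already on the same vertical datum:

```python
import numpy as np

def vertical_rmse(dem_heights, survey_heights):
    """Root-mean-square error (m) between DEM and reference elevations."""
    d = np.asarray(dem_heights, float) - np.asarray(survey_heights, float)
    return float(np.sqrt(np.mean(d ** 2)))

# Hypothetical check points (m above MSL) for two DEMs at one grid size.
survey = np.array([312.4, 298.7, 305.1, 321.9, 290.2])
for name, dem in {"SRTM":  [318.0, 290.1, 310.4, 330.2, 275.9],
                  "ASTER": [325.1, 285.3, 314.8, 335.0, 270.1]}.items():
    print(name, f"RMSE = {vertical_rmse(dem, survey):.2f} m")
```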

  10. An inexact fuzzy two-stage stochastic model for quantifying the efficiency of nonpoint source effluent trading under uncertainty

    International Nuclear Information System (INIS)

    Luo, B.; Maqsood, I.; Huang, G.H.; Yin, Y.Y.; Han, D.J.

    2005-01-01

    Reduction of nonpoint source (NPS) pollution from agricultural lands is a major concern in most countries. One method to reduce NPS pollution is through land retirement programs. This method, however, may result in enormous economic costs, especially when large areas of cropland need to be retired. To reduce the cost, effluent trading can be coupled with land retirement programs. However, trading efforts can also become inefficient due to the various uncertainties existing in stochastic, interval, and fuzzy formats in agricultural systems. Thus, improved methods are desired to effectively quantify the efficiency of potential trading efforts by considering those uncertainties. In this respect, this paper presents an inexact fuzzy two-stage stochastic programming model to tackle such problems. The proposed model can facilitate decision-making to implement trading efforts for agricultural NPS pollution reduction through land retirement programs. The applicability of the model is demonstrated through a hypothetical effluent trading program within a subcatchment of the Lake Tai Basin in China. The study results indicate that the efficiency of the trading program is significantly influenced by the precipitation amount, agricultural activities, and the level of discharge limits of pollutants. The results also show that the trading program is more effective for low-precipitation years and with stricter discharge limits.

  11. Sources of patient uncertainty when reviewing medical disclosure and consent documentation.

    Science.gov (United States)

    Donovan-Kicken, Erin; Mackert, Michael; Guinn, Trey D; Tollison, Andrew C; Breckinridge, Barbara

    2013-02-01

    Despite evidence that medical disclosure and consent forms are ineffective at communicating the risks and hazards of treatment and diagnostic procedures, little is known about exactly why they are difficult for patients to understand. The objective of this research was to examine what features of the forms increase people's uncertainty. Interviews were conducted with 254 individuals. After reading a sample consent form, participants described what they found confusing in the document. With uncertainty management as a theoretical framework, interview responses were analyzed for prominent themes. Four distinct sources of uncertainty emerged from participants' responses: (a) language, (b) risks and hazards, (c) the nature of the procedure, and (d) document composition and format. Findings indicate the value of simplifying medico-legal jargon, signposting definitions of terms, removing language that addresses multiple readers simultaneously, reorganizing bulleted lists of risks, and adding section breaks or negative space. These findings offer suggestions for providing more straightforward details about risks and hazards to patients, not necessarily through greater amounts of information but rather through more clear and sufficient material and better formatting. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. Uncertainties in the fate of nitrogen I: An overview of sources of uncertainty illustrated with a Dutch case study

    NARCIS (Netherlands)

    Kroeze, C.; Aerts, R.; Breemen, van N.; Dam, van D.; Hoek, van der K.; Hofschreuder, P.; Hoosbeek, M.R.; Klein, de J.; Kros, H.; Oene, van H.; Oenema, O.; Tietema, A.; Veeren, van der R.; Verhoeven, H.; Vries, de W.

    2003-01-01

    This study focuses on the uncertainties in the fate of nitrogen (N) in the Netherlands. Nitrogen inputs into the Netherlands in products, by rivers, and by atmospheric deposition, and microbial and industrial fixation of atmospheric N2 amount to about 4450 Gg N y⁻¹. About 60% of this N is

  13. Limitations of absolute activity determination of I-125 sources

    Energy Technology Data Exchange (ETDEWEB)

    Pelled, O; German, U; Kol, R; Levinson, S; Weinstein, M; Laichter, Y [Israel Atomic Energy Commission, Beersheba (Israel). Nuclear Research Center-Negev; Alphasy, Z [Ben-Gurion Univ. of the Negev, Beersheba (Israel)

    1996-12-01

    A method for the absolute determination of the activity of an I-125 source, based on the counting rates of the 27 keV photons and of the coincidence (sum) photon peak, is given in the literature. It rests on the principle that if a radionuclide emits two photons in coincidence, a measurement of its disintegration rate in the photopeak and in the sum-peak can determine its absolute activity. When using this method, the system calibration is simplified, and parameters such as source geometry or source position relative to the detector have no significant influence. However, when the coincidence rate is very low, the application of this method is limited by the counting statistics of the coincidence peak (authors).
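
    One standard form of the sum-peak relation makes the statistical limitation concrete: if each decay emits two coincident photons, each detected with efficiency ε, the singles-peak rate is N1 = 2Aε(1−ε) and the sum-peak rate is N12 = Aε², so the efficiency cancels and A = (N1 + 2N12)²/(4N12). A minimal sketch with invented counts, showing how a weak sum peak inflates the uncertainty:

```python
import math

def sum_peak_activity(n1, n12, t):
    """Activity (Bq) and counting-statistics uncertainty from the net counts
    n1 (single-photon peak) and n12 (sum peak) in live time t (s)."""
    N1, N12 = n1 / t, n12 / t
    S = N1 + 2.0 * N12
    A = S * S / (4.0 * N12)
    dA_dN1 = S / (2.0 * N12)                             # partial derivatives
    dA_dN12 = S * (2.0 * N12 - N1) / (4.0 * N12 ** 2)
    uA = math.hypot(dA_dN1 * math.sqrt(n1) / t,
                    dA_dN12 * math.sqrt(n12) / t)
    return A, uA

# Hypothetical spectra: a strong sum peak vs. a very weak one.
for n12 in (40_000, 400):
    A, uA = sum_peak_activity(n1=1_000_000, n12=n12, t=1000.0)
    print(f"n12 = {n12:>6}: A = {A:9.0f} +/- {uA:6.0f} Bq ({100 * uA / A:.2f}%)")
```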

  14. Stochastic scheduling of renewable micro-grids considering photovoltaic source uncertainties

    International Nuclear Information System (INIS)

    Najibi, Fatemeh; Niknam, Taher

    2015-01-01

    Highlights: • A complete model for PV panels is proposed. • A scenario-based method is suggested to capture the uncertainties of the problem. • A new optimization algorithm for solving the MG operation problem is introduced. • A modification of the proposed algorithm is added to improve its performance. - Abstract: This paper introduces a new electrical model of a PV array, simulates it, and tests it on a typical Micro-Grid (MG) to assess its performance with regard to the optimal energy management of Micro-Grids (MGs). In addition, it introduces a probabilistic framework based on a scenario-based method to capture all the uncertainties in the optimal energy management of MGs with different renewable power sources, such as Photovoltaic (PV), Wind Turbine (WT), Micro Turbine (MT), and storage devices. The uncertainty in WT and PV output power variations, load demand forecasting error and grid bid changes is therefore considered at the same time. It is hard to solve the MG problem with all its uncertainties over 24-h time intervals while satisfying several equality and inequality constraints simultaneously. To resolve this issue, the problem requires a powerful technique that converges very fast yet escapes local optima. As a result, a modern Dolphin Echolocation Optimization Algorithm (DEOA), which mimics the echolocation ability of dolphins to find the best location, is defined to explore the whole search space globally. A modification that makes the algorithm work better and find solutions faster is also introduced. The proposed method is implemented on a test grid-connected MG and yields satisfactory results.

  15. Sources of uncertainty in hydrological climate impact assessment: a cross-scale study

    Science.gov (United States)

    Hattermann, F. F.; Vetter, T.; Breuer, L.; Su, Buda; Daggupati, P.; Donnelly, C.; Fekete, B.; Flörke, F.; Gosling, S. N.; Hoffmann, P.; Liersch, S.; Masaki, Y.; Motovilov, Y.; Müller, C.; Samaniego, L.; Stacke, T.; Wada, Y.; Yang, T.; Krysnaova, V.

    2018-01-01

    Climate change impacts on water availability and hydrological extremes are major concerns as regards the Sustainable Development Goals. Impacts on hydrology are normally investigated as part of a modelling chain, in which climate projections from multiple climate models are used as inputs to multiple impact models, under different greenhouse gas emissions scenarios, which result in different amounts of global temperature rise. While the goal is generally to investigate the relevance of changes in climate for the water cycle, water resources or hydrological extremes, it is often the case that variations in other components of the model chain obscure the effect of climate scenario variation. This is particularly important when assessing the impacts of relatively lower magnitudes of global warming, such as those associated with the aspirational goals of the Paris Agreement. In our study, we use ANOVA (analyses of variance) to allocate and quantify the main sources of uncertainty in the hydrological impact modelling chain. In turn we determine the statistical significance of different sources of uncertainty. We achieve this by using a set of five climate models and up to 13 hydrological models, for nine large scale river basins across the globe, under four emissions scenarios. The impact variable we consider in our analysis is daily river discharge. We analyze overall water availability and flow regime, including seasonality, high flows and low flows. Scaling effects are investigated by separately looking at discharge generated by global and regional hydrological models respectively. Finally, we compare our results with other recently published studies. We find that small differences in global temperature rise associated with some emissions scenarios have mostly significant impacts on river discharge—however, climate model related uncertainty is so large that it obscures the sensitivity of the hydrological system.
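
    The variance allocation step can be illustrated with a two-way ANOVA over a small ensemble: rows as climate models, columns as hydrological models, and each cell a projected discharge change. All numbers below are invented; the study itself uses five climate models, up to 13 hydrological models, nine basins and four emissions scenarios.

```python
import numpy as np

# Two-way ANOVA sketch: allocate ensemble variance to GCMs, hydrological
# models, and their interaction/residual. Cell values are invented
# projected changes in mean discharge (%).
change = np.array([[ 5.0,  7.5,  4.2],
                   [-1.0,  0.8, -2.1],
                   [ 9.3, 11.0,  8.1],
                   [ 2.4,  4.0,  1.5]])

grand = change.mean()
gcm_eff = change.mean(axis=1) - grand          # climate-model main effects
hm_eff = change.mean(axis=0) - grand           # hydrological-model main effects
resid = change - grand - gcm_eff[:, None] - hm_eff[None, :]

ss_gcm = change.shape[1] * np.sum(gcm_eff ** 2)
ss_hm = change.shape[0] * np.sum(hm_eff ** 2)
ss_resid = np.sum(resid ** 2)
ss_tot = ss_gcm + ss_hm + ss_resid

for name, ss in [("GCM", ss_gcm), ("hydrological model", ss_hm),
                 ("interaction/residual", ss_resid)]:
    print(f"{name:>22}: {100 * ss / ss_tot:5.1f}% of total variance")
```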

  16. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    Science.gov (United States)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the questions of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterizations and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over beyond which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
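
    A toy Metropolis sampler conveys the core of such an MCMC scheme: propose a perturbed parameter, accept or reject by the likelihood ratio, and read uncertainties off the resulting samples. The one-parameter forward model below is a stand-in, not the waveform solver used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

def g(theta, t):                    # stand-in forward model
    return np.exp(-t / theta)

# Synthetic "data" with known true parameter (e.g. a rise time, in s).
t = np.linspace(0.1, 10.0, 50)
theta_true, sigma = 2.5, 0.02
d = g(theta_true, t) + rng.normal(0.0, sigma, t.size)

def log_post(theta):                # flat prior on (0, 10]
    if not 0.0 < theta <= 10.0:
        return -np.inf
    return -0.5 * np.sum((d - g(theta, t)) ** 2) / sigma ** 2

samples, theta = [], 1.0
lp = log_post(theta)
for _ in range(20000):              # Metropolis random walk
    prop = theta + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])     # discard burn-in
print(f"posterior mean = {post.mean():.2f}, std = {post.std():.3f}")
```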

  17. Uncertainty analysis of constant amplitude fatigue test data employing the six parameters random fatigue limit model

    Directory of Open Access Journals (Sweden)

    Leonetti Davide

    2018-01-01

    Full Text Available Estimating and reducing uncertainty in fatigue test data analysis is a relevant task when assessing the reliability of a structural connection with respect to fatigue. Several statistical models have been proposed in the literature with the aim of representing the stress range vs. endurance trend of fatigue test data under constant amplitude loading and the scatter in the finite and infinite life regions. In order to estimate the safety level of the connection, the uncertainty related to the amount of information available also needs to be estimated using the methods provided by statistical theory. Bayesian analysis is employed to reduce the uncertainty due to the often small amount of test data by introducing prior information on the parameters of the statistical model. In this work, the inference of fatigue test data belonging to cover-plated steel beams is presented. The uncertainty is estimated by making use of Bayesian and frequentist methods. The 5% quantile of the fatigue life is estimated by taking into account the uncertainty related to the sample size, both for a dataset containing few samples and for one containing more data. The S-N curves resulting from the application of the employed methods are compared, and the effect of the reduction of uncertainty in the infinite life region is quantified.

  18. Analysis of uncertainties and detection limits for the double measurement method of 90Sr and 89Sr

    Energy Technology Data Exchange (ETDEWEB)

    Herranz, M., E-mail: m.herranz@ehu.es [Department of Nuclear Engineering and Fluid Mechanics, University of the Basque Country (UPV/EHU), Alameda de Urquijo s/n 48013 Bilbao (Spain); Idoeta, R.; Legarda, F. [Department of Nuclear Engineering and Fluid Mechanics, University of the Basque Country (UPV/EHU), Alameda de Urquijo s/n 48013 Bilbao (Spain)

    2011-08-15

    The determination of the 90Sr and 89Sr contents of a sample, although it involves their radiochemical isolation, always results in a complex measurement process due to the interferences among their respective beta emissions and also that of the daughter of 90Sr, 90Y, a beta emitter as well. In this paper, the process consisting of a double measurement after the radiochemical isolation of Sr is analyzed, and the formulae for activity concentrations, uncertainties and detection limits are developed. A study is also carried out of how the uncertainties and detection limits vary with the time at which the first measurement after isolation is made, the delay between the two measurements, and the activity concentration of each strontium isotope in the sample. Results show that, with a very precise determination of the times involved in the whole process (isolation, measurement and duration of measurements), this method permits a reliable assessment of both strontium radioisotopes. The sooner the first measurement after isolation is made and the longer the delay chosen between the two measurements, the lower the detection limits and the uncertainties of the activities obtained. - Highlights: > The double measurement method for 90Sr and 89Sr determination is analysed. > Uncertainties and detection limits are determined and their dependences studied. > Proposals for the optimization of the method are given.
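
    The essence of the double measurement is linear algebra: the count rate at time t after separation is a combination of the two activities whose coefficients evolve differently (89Sr decays with a 50.6 d half-life while 90Y grows back in with a 64 h half-life), so two measurements give a solvable 2x2 system. The sketch below is a simplified illustration of that idea, not the paper's full formulation; efficiencies, times and rates are invented:

```python
import numpy as np

T12_SR89_D, T12_Y90_D = 50.56, 2.669          # half-lives, days
l89 = np.log(2) / T12_SR89_D
lY = np.log(2) / T12_Y90_D
e89, e90, eY = 0.40, 0.35, 0.55               # hypothetical efficiencies

def coeffs(t):
    """Count-rate coefficients (cps per Bq) at t days after separation;
    90Sr decay is negligible on this time scale."""
    return np.array([e89 * np.exp(-l89 * t),              # 89Sr decay
                     e90 + eY * (1.0 - np.exp(-lY * t))]) # 90Y ingrowth

t1, t2 = 0.2, 3.0                             # days since separation
M = np.vstack([coeffs(t1), coeffs(t2)])
rates = np.array([5.10, 6.47])                # measured net rates, cps
a89, a90 = np.linalg.solve(M, rates)
print(f"A(89Sr) = {a89:.2f} Bq, A(90Sr) = {a90:.2f} Bq")
```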

  19. How phosphorus limitation can control climatic gas sources and sinks

    Science.gov (United States)

    Gypens, Nathalie; Borges, Alberto V.; Ghyoot, Caroline

    2017-04-01

    Since the 1950s, anthropogenic activities have severely increased river nutrient loads in European coastal areas. The subsequent implementation of nutrient reduction policies has considerably reduced phosphorus (P) loads since the mid-1980s, while nitrogen (N) loads were maintained, inducing a P limitation of phytoplankton growth in many eutrophied coastal areas such as the Southern Bight of the North Sea (SBNS). When dissolved inorganic phosphorus (DIP) is limiting, most phytoplankton organisms are able to indirectly acquire P from dissolved organic P (DOP). We investigate the impact of DOP use on phytoplankton production and on the atmospheric fluxes of CO2 and dimethylsulfide (DMS) in the SBNS from 1951 to 2007 using an extended version of the R-MIRO-BIOGAS model. This model includes a description of the ability of phytoplankton organisms to use DOP as a source of P. Results show that primary production can increase by up to 70% due to DOP uptake under limiting DIP conditions. Consequently, simulated DMS emissions double while CO2 emissions to the atmosphere decrease, relative to the reference simulation without DOP uptake. At the end of the simulated period (late 2000s), the net direction of the annual air-sea CO2 flux changed from a source to a sink for atmospheric CO2 in response to the use of DOP and the increase of primary production.

  20. The Role of Type and Source of Uncertainty on the Processing of Climate Models Projections.

    Science.gov (United States)

    Benjamin, Daniel M; Budescu, David V

    2018-01-01

    Scientists agree that the climate is changing due to human activities, but there is less agreement about the specific consequences and their timeline. Disagreement among climate projections is attributable to the complexity of climate models that differ in their structure, parameters, initial conditions, etc. We examine how different sources of uncertainty affect people's interpretation of, and reaction to, information about climate change by presenting participants with forecasts from multiple experts. Participants viewed three types of sets of sea-level rise projections: (1) precise but conflicting; (2) imprecise but agreeing; and (3) hybrid sets that were both conflicting and imprecise. They estimated the most likely sea-level rise, provided a range of possible values, and rated the sets on several features - ambiguity, credibility, completeness, etc. In Study 1, everyone saw the same hybrid set. We found that participants were sensitive to uncertainty between sources, but not to uncertainty about which model was used. The impacts of conflict and imprecision were combined for estimation tasks and compromised for feature ratings. Estimates were closer to the experts' original projections, and sets were rated more favorably, under imprecision. Estimates were least consistent with (narrower than) the experts in the hybrid condition, but participants rated the conflicting set least favorably. In Study 2, we investigated the hybrid case in more detail by creating several distinct interval sets that combine conflict and imprecision. Two factors drive perceptual differences: overlap - the structure of the forecast set (whether intersecting, nested, tangent, or disjoint) - and asymmetry - the balance of the set. Estimates were primarily driven by asymmetry, and preferences were primarily driven by overlap. Asymmetric sets were least consistent with the experts: estimated ranges were narrower, and estimates of the most likely value were shifted further below the set mean.

  1. Evaluation of uncertainties in the calibration of radiation personal monitor with Cesium-137 source

    International Nuclear Information System (INIS)

    Mirapalheta, Tatiane; Alexandre, Anderson; Costa, Camila; Batista, Gilmar; Paulino, Thyago; Albuquerque, Marcos; Universidade do Estado do Rio de Janeiro

    2016-01-01

    This work presents the entire calibration process of an individual monitor used for radiation protection in health care, together with the uncertainties associated with these measurements. The results show an expanded uncertainty of 5.81% for dose rate measurements and an expanded uncertainty of 5.61% for integrated dose measurements; these uncertainties were evaluated from their Type A and Type B components. (author)
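
    The standard GUM recipe behind such figures is to combine the Type A and Type B standard uncertainties in quadrature and multiply by a coverage factor, typically k = 2 for roughly 95% coverage. A minimal sketch with invented component values:

```python
import math

# GUM-style combination: standard uncertainties added in quadrature,
# then expanded with coverage factor k = 2 (~95% coverage).
components = {
    "repeatability (Type A)":        1.2,   # all in % of the dose rate
    "reference chamber calibration": 1.8,
    "source-detector positioning":   1.1,
    "air density correction":        0.7,
}
u_c = math.sqrt(sum(u * u for u in components.values()))
U = 2.0 * u_c
print(f"combined standard uncertainty = {u_c:.2f}%")
print(f"expanded uncertainty (k=2)    = {U:.2f}%")
```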

  2. Uncertainty and Detection Limit in Determination of 89,90Sr by Cherenkov Counting

    International Nuclear Information System (INIS)

    Grahek, Z.; Karanovic, G.; Nodilo, M.

    2013-01-01

    A methodology for the rapid determination of 89,90Sr in normal and emergency situations is given. The methodology is based on the simultaneous separation of strontium and yttrium from samples and quantitative 89,90Sr determination by Cherenkov counting within three days. The approach to quantitative determination by Cherenkov counting, based on following the change of sample activity over time, is described and discussed. It has been shown that 89,90Sr can be determined with acceptable accuracy when the 89Sr/90Sr ratio is over 10:1. The obtained results show that by using a low-level liquid scintillation counter it is possible to determine 89Sr and 90Sr over a broad range of activity concentrations (1-1000 Bq (kg, L)⁻¹) with uncertainties below 10% within 2-3 days. The results also show that the accuracy of the 89Sr (and 90Sr) determination depends on the determination of the time difference between separation and counting when the 89Sr/90Sr activity ratio is high. Analysis of the combined uncertainty shows that it mainly depends on the uncertainty of the efficiency and recovery determinations, the uncertainty of the activity determinations for both isotopes, and the level of background radiation. The contribution of each to the combined uncertainty depends on the activity level of each isotope and on their activity ratio. (author)

  3. Controlled source electromagnetic data analysis with seismic constraints and rigorous uncertainty estimation in the Black Sea

    Science.gov (United States)

    Gehrmann, R. A. S.; Schwalenberg, K.; Hölz, S.; Zander, T.; Dettmer, J.; Bialas, J.

    2016-12-01

    In 2014 an interdisciplinary survey was conducted as part of the German SUGAR project in the Western Black Sea, targeting gas hydrate occurrences in the Danube Delta. Marine controlled source electromagnetic (CSEM) data were acquired with an inline seafloor-towed array (BGR) and a two-polarization horizontal ocean-bottom source and receiver configuration (GEOMAR). The CSEM data are co-located with high-resolution 2-D and 3-D seismic reflection data (GEOMAR). We present results from 2-D regularized inversion (MARE2DEM by Kerry Key), which provides a smooth model of the electrical resistivity distribution beneath the source and multiple receivers. The 2-D approach includes seafloor topography and structural constraints from seismic data. We estimate uncertainties from the regularized inversion and compare them to 1-D Bayesian inversion results. The probabilistic inversion for a layered subsurface treats the parameter values and the number of layers as unknown by applying reversible-jump Markov-chain Monte Carlo sampling. A non-diagonal data covariance matrix obtained from residual error analysis accounts for correlated errors. The resulting resistivity models show generally high resistivity values, between 3 and 10 Ωm on average, which can be partly attributed to pore-water salinities depleted by sea-level low stands in the past, and locally up to 30 Ωm, which is likely caused by gas hydrates. At the base of the gas hydrate stability zone, resistivities rise to more than 100 Ωm, which could be due to gas hydrate as well as a layer of free gas underneath. However, the deeper parts also show the largest model parameter uncertainties. Archie's Law is used to derive estimates of the gas hydrate saturation, which vary between 30 and 80% within the anomalous layers, considering salinity and porosity profiles from a distant DSDP borehole.
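
    The final saturation step is a direct application of Archie's Law, rho_t = a * rho_w * phi^(-m) * Sw^(-n), inverted for the water saturation Sw and hence the hydrate saturation Sh = 1 − Sw. The sketch below uses invented parameter values (a, m, n, porosity and pore-water resistivity), not those of the study:

```python
# Archie's Law inverted for hydrate saturation: Sw = (a*rho_w / (phi**m *
# rho_t))**(1/n), with Sh = 1 - Sw. All parameter values are illustrative.
def hydrate_saturation(rho_t, rho_w, phi, a=1.0, m=2.0, n=2.0):
    sw = (a * rho_w / (phi ** m * rho_t)) ** (1.0 / n)
    return 1.0 - min(sw, 1.0)

# Hypothetical samples: background vs. anomalous resistive layers (ohm-m).
for rho_t in (3.0, 10.0, 30.0):
    sh = hydrate_saturation(rho_t, rho_w=0.35, phi=0.55)
    print(f"rho_t = {rho_t:5.1f} ohm-m -> Sh = {100 * sh:4.1f}%")
```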

  4. Updating Environmental Media Concentration Limits and Uncertainty factors in the ERICA Tool

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J.E.; Hosseini, A. [Norwegian Radiation Protection Authority, P.O. Box 55, N-1332 Oesteraas (Norway); Alfonso, B.; Avila, R. [Facilia AB, S-167 51 Bromma (Sweden); Beresford, N.A. [Centre for Ecology and Hydrology, CEH-Lancaster, Lancaster Environment Centre, Library Avenue, Bailrigg, Lancaster LA 1 4AP (United Kingdom); Copplestone, D. [Dept. Biological and Environmental Sciences, University of Stirling, Stirling, FK9 4LA (United Kingdom)

    2014-07-01

    Tiered approaches have become a standard means of structuring information in the process of conducting environmental risk assessments. For cases involving the assessment of impacts on wildlife from ionising radiation, the ERICA integrated approach and its supporting software (the ERICA Tool) provide such a structure, splitting the system into two generic screening tiers and a third site-specific tier. The first tier is very simple, based around Environmental Media Concentration Limits (EMCLs), and requires minimal input from the assessor. The second tier, although still a screening tier, calculates dose rates and requires more detailed input from the assessor, allowing for scrutiny and editing of default parameters in the process. A key element of Tier 2 involves the application of Uncertainty Factors (UFs). Such factors reflect our knowledge concerning probability distribution functions and provide a way of incorporating conservatism into the assessment by considering high percentile values in underlying parameters. Since its launch in 2007, there have been significant developments regarding certain components of the ERICA integrated approach. Most notably, an extended international collation of concentration ratio data has precipitated the need to update parameter values in the Tool's databases. In addition, more considered guidance has been developed with regard to filling knowledge gaps in the absence of transfer data. Furthermore, the efficacy of the methods used in assigning probability distribution functions has been questioned, leading to an acknowledgement from the developers that the methods were neither described in enough detail nor justified in a convincing way. This has implications for the EMCL values, which are derived probabilistically using parameters including concentration ratios. Furthermore, there are implications for UF derivation that relies upon a robust consideration of underlying

  5. Sources of uncertainties in modelling black carbon at the global scale

    Directory of Open Access Journals (Sweden)

    E. Vignati

    2010-03-01

    Full Text Available Our understanding of the global black carbon (BC) cycle is essentially qualitative due to uncertainties in our knowledge of its properties. This work investigates two sources of uncertainty in modelling black carbon: those due to the use of different schemes for BC ageing and its removal rate in the global transport-chemistry model TM5, and those due to the uncertainties in the definition and quantification of the observations, which propagate through to both the emission inventories and the measurements used for the model evaluation.

    The schemes for the atmospheric processing of black carbon that have been tested with the model are (i) a simple approach considering BC as bulk aerosol with a simple treatment of removal, in which a fixed 70% of in-cloud black carbon is scavenged by clouds and removed when rain is present, and (ii) a more complete description of microphysical ageing within an aerosol dynamics model, where removal is coupled to the microphysical properties of the aerosol, resulting in a global average of 40% of in-cloud black carbon being scavenged in clouds and subsequently removed by rain, and thus in a longer atmospheric lifetime. This difference is reflected in comparisons between both sets of modelled results and the measurements. Close to the sources, in both anthropogenic and vegetation fire source regions, the model results do not differ significantly, indicating that the emissions are the prevailing mechanism determining the concentrations and the choice of the aerosol scheme does not influence the levels. In more remote areas such as oceanic and polar regions the differences can be orders of magnitude, due to the differences between the two schemes. The more complete description reproduces the seasonal trend of the black carbon observations in those areas, although not always the magnitude of the signal, while the more simplified approach underestimates black carbon concentrations by orders of

  6. Effects of Heterogeneity and Uncertainties in Sources and Initial and Boundary Conditions on Spatiotemporal Variations of Groundwater Levels

    Science.gov (United States)

    Zhang, Y. K.; Liang, X.

    2014-12-01

    The effects of aquifer heterogeneity and of uncertainties in the source/sink term and the initial and boundary conditions of a groundwater flow model on the spatiotemporal variations of the groundwater level, h(x,t), were investigated. Analytical solutions for the variance and covariance of h(x,t) in an unconfined aquifer described by a linearized Boussinesq equation with a white-noise source/sink and a random transmissivity field were derived. It was found that in a typical aquifer the error in h(x,t) at early times is mainly caused by the random initial condition, and that this error decays over time toward a constant level. The period during which the effect of the random initial condition is significant may last a few hundred days in most aquifers. The constant error at later times is due to the combined effects of the uncertain source/sink and flux boundary: the closer to the flux boundary, the larger the error. The error caused by the uncertain head boundary is confined to a narrow zone near that boundary but remains more or less constant over time. The effect of heterogeneity is to increase the variation of the groundwater level, with the maximum effect occurring close to the constant head boundary because of the linear mean hydraulic gradient. The correlation of the groundwater level decreases with the time interval and spatial distance; in addition, heterogeneity enhances this correlation, especially at larger time intervals and small spatial distances.

  7. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Full Text Available Water and energy processes in frozen soils are important for a better understanding of hydrologic processes and for water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE) method was used. Simulation work on water and heat processes in frozen soil in northern China during the 2012/2013 winter was conducted. Ensemble simulations generated through Monte Carlo sampling were used for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance index criteria with respect to simulated soil water content and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm). Posterior distributions of parameters related to soil hydraulics, radiation processes, and heat transport indicated that uncertainties in both the inputs and the model structure could influence model performance in modeling water and heat processes in seasonally frozen soils. Seasonal courses in water and energy partitioning were evident during the winter. Within the daily cycle, soil evaporation/condensation and energy distributions were well captured, and identified as an important phenomenon in the dynamics of the energy balance system. The combination of the CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing-thawing.
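
    The GLUE procedure itself is compact: sample parameters from priors via Monte Carlo, score each run with an informal likelihood, keep the runs above a behavioral threshold, and weight them to obtain posterior parameter estimates. A bare-bones sketch with a toy surrogate model standing in for CoupModel:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(50, dtype=float)
obs = 5.0 + 3.0 * np.sin(t / 8.0) + rng.normal(0, 0.3, t.size)

def model(amp, shift):                       # toy surrogate model
    return 5.0 + amp * np.sin(t / 8.0 + shift)

def nse(sim):                                # Nash-Sutcliffe efficiency
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo sampling from uniform priors
amps = rng.uniform(0.0, 6.0, 5000)
shifts = rng.uniform(-0.5, 0.5, 5000)
scores = np.array([nse(model(a, s)) for a, s in zip(amps, shifts)])

behavioral = scores > 0.7                    # behavioral threshold
w = scores[behavioral] - 0.7                 # informal likelihood weights
w /= w.sum()

a_b = amps[behavioral]
mean = np.average(a_b, weights=w)
std = np.sqrt(np.average((a_b - mean) ** 2, weights=w))
print(f"{behavioral.sum()} behavioral runs of {scores.size}")
print(f"posterior amplitude estimate: {mean:.2f} +/- {std:.2f}")
```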

  8. Limitations of acceptability curves for presenting uncertainty in cost-effectiveness analysis

    NARCIS (Netherlands)

    Groot Koerkamp, Bas; Hunink, M. G. Myriam; Stijnen, Theo; Hammitt, James K.; Kuntz, Karen M.; Weinstein, Milton C.

    2007-01-01

    Clinical journals increasingly illustrate uncertainty about the cost and effect of health care interventions using cost-effectiveness acceptability curves (CEACs). CEACs present the probability that each competing alternative is optimal for a range of values of the cost-effectiveness threshold. The
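
    A CEAC is straightforward to compute from simulated or bootstrapped incremental costs and effects: for each threshold value, it is the fraction of samples whose net monetary benefit is positive. A minimal sketch with invented distributions:

```python
import numpy as np

# CEAC: P(intervention is cost-effective) = fraction of sampled
# incremental cost/effect pairs with positive net monetary benefit,
# NB = lambda * dE - dC, over a range of thresholds lambda.
rng = np.random.default_rng(0)
dE = rng.normal(0.05, 0.04, 10_000)     # incremental QALYs per patient
dC = rng.normal(1500.0, 900.0, 10_000)  # incremental cost per patient

for lam in (10_000, 30_000, 50_000, 80_000):
    p = np.mean(lam * dE - dC > 0.0)
    print(f"threshold {lam:>6} per QALY: P(cost-effective) = {p:.2f}")
```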

  9. Proportional Odds Logistic Regression - Effective Means of Dealing with Limited Uncertainty in Dichotomizing Clinical Outcomes

    Czech Academy of Sciences Publication Activity Database

    Valenta, Zdeněk; Pitha, J.; Poledne, R.

    2006-01-01

    Roč. 25, č. 24 (2006), s. 4227-4234 ISSN 0277-6715 R&D Projects: GA MZd NA7512 Institutional research plan: CEZ:AV0Z10300504 Keywords : proportional odds logistic regression * dichotomized outcomes * uncertainty Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.737, year: 2006

  10. Limits and Prospects of Renewable Energy Sources in Italy

    International Nuclear Information System (INIS)

    Coiante, D.

    2008-01-01

    The Italian energy balance for the year 2005 is discussed with particular attention to renewable energy production. The potentials of renewable sources are evaluated in terms of the energy density that can be obtained from the occupied plant area. About 20,000 km2 of sunny barren land, particularly suitable for photovoltaic plants, is present in the south of Italy, corresponding to a potential production of 144 Mtoe of primary energy. Therefore, in theory, the photovoltaic energy potential is comparable with the national energy balance. The grid connection limit due to the intermittent power generation of photovoltaic and wind energy systems is considered in relation to the stability of the grid power level. Assuming a maximum grid penetration of intermittent power of 25% with respect to the capacity of active thermoelectric generators, the renewable energy contribution amounts to about 2% of the annual energy balance. Against expectations of a larger contribution, the practical result is that the renewable energy production of present systems is marginal and unsuitable for counteracting the global climate crisis. The conclusion is that, to exploit the large renewable energy potential, it is necessary to equip the plants with an energy storage system able to overcome the source intermittency. Without this improvement, the expectations for renewable energy sources could be disappointed. [it]

  11. Uncertainty Quantification in Earthquake Source Characterization with Probabilistic Centroid Moment Tensor Inversion

    Science.gov (United States)

    Dettmer, J.; Benavente, R. F.; Cummins, P. R.

    2017-12-01

    This work considers probabilistic, non-linear centroid moment tensor inversion of data from earthquakes at teleseismic distances. The moment tensor is treated as deviatoric, and the centroid location is parametrized with fully unknown latitude, longitude, depth and time delay. The inverse problem is treated as fully non-linear in a Bayesian framework, and the posterior density is estimated with interacting Markov chain Monte Carlo methods, which are implemented in parallel and allow for chain interaction. The source mechanism and location, including uncertainties, are fully described by the posterior probability density, and complex trade-offs between various metrics are studied. These include the percentage of double-couple component as well as fault orientation, and the probabilistic results are compared to results from earthquake catalogs. Additional focus is on the analysis of complex events, which are commonly not well described by a single point source. These events are studied by jointly inverting for multiple centroid moment tensor solutions. The optimal number of sources is estimated by the Bayesian information criterion to ensure parsimonious solutions. [Supported by NSERC.]

  12. 40 CFR 401.12 - Law authorizing establishment of effluent limitations guidelines for existing sources, standards...

    Science.gov (United States)

    2010-07-01

    ... effluent limitations guidelines for existing sources, standards of performance for new sources and... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS GENERAL PROVISIONS § 401.12 Law authorizing establishment of effluent limitations guidelines for existing sources, standards of performance...

  13. Reduced dose uncertainty in MRI-based polymer gel dosimetry using parallel RF transmission with multiple RF sources

    International Nuclear Information System (INIS)

    Sang-Young Kim; Jung-Hoon Lee; Jin-Young Jung; Do-Wan Lee; Seu-Ran Lee; Bo-Young Choe; Hyeon-Man Baek; Korea University of Science and Technology, Daejeon; Dae-Hyun Kim; Jung-Whan Min; Ji-Yeon Park

    2014-01-01

    In this work, we present the feasibility of using a parallel RF transmission imaging method with multiple RF sources (MultiTransmit imaging) in polymer gel dosimetry. Image quality and B1 field homogeneity were statistically better with the MultiTransmit imaging method than with the conventional single-source RF transmission imaging method. In particular, the standard uncertainty of R2 was lower in the MultiTransmit images than in the conventional images. Furthermore, the MultiTransmit measurement showed improved dose resolution. The improved image quality and B1 homogeneity result in reduced dose uncertainty, thereby suggesting the feasibility of MultiTransmit MR imaging in gel dosimetry. (author)

  14. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    International Nuclear Information System (INIS)

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories, based on a methodology provided by the Intergovernmental Panel on Climate Change, together with special software linking input data, inventory models, and a means for visualization, is proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in selected regions are carried out. Uncertainties in activity data have a considerable influence on the overall inventory uncertainty; for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty of the national energy sector inventory.
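
    The Monte Carlo step amounts to sampling activity data and emission factors around their reported values and reading the spread off the simulated totals. A minimal sketch with invented subsector figures and independent normal errors:

```python
import numpy as np

rng = np.random.default_rng(11)
A = np.array([120.0, 45.0, 30.0])      # activity data by subsector (PJ)
EF = np.array([95.0, 74.0, 56.0])      # emission factors (kt CO2 per PJ)

def total_uncertainty(u_act, u_ef, n=100_000):
    """Relative 1-sigma uncertainty (%) of total emissions E = sum(A*EF)."""
    a = A * rng.normal(1.0, u_act, (n, A.size))
    f = EF * rng.normal(1.0, u_ef, (n, EF.size))
    E = (a * f).sum(axis=1)
    return 100.0 * E.std() / E.mean()

# Halving the activity-data uncertainty shrinks the sector uncertainty.
for u_act in (0.10, 0.05):
    print(f"activity-data uncertainty {u_act:.0%}: "
          f"sector uncertainty ~ {total_uncertainty(u_act, 0.03):.1f}%")
```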

  15. Limits, discovery and cut optimization for a Poisson process with uncertainty in background and signal efficiency: TRolke 2.0

    Science.gov (United States)

    Lundberg, J.; Conrad, J.; Rolke, W.; Lopez, A.

    2010-03-01

    A C++ class was written for the calculation of frequentist confidence intervals using the profile likelihood method. Seven combinations of Poissonian, Gaussian and Binomial uncertainties are implemented. The package provides routines for the calculation of upper and lower limits, sensitivity and related properties. It also supports hypothesis tests which take uncertainties into account. It can be used in compiled C++ code, in Python or interactively via the ROOT analysis framework. Program summary: Program title: TRolke version 2.0. Catalogue identifier: AEFT_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFT_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: MIT license. No. of lines in distributed program, including test data, etc.: 3431. No. of bytes in distributed program, including test data, etc.: 21 789. Distribution format: tar.gz. Programming language: ISO C++. Computer: Unix, GNU/Linux, Mac. Operating system: Linux 2.6 (Scientific Linux 4 and 5, Ubuntu 8.10), Darwin 9.0 (Mac-OS X 10.5.8). RAM: ~20 MB. Classification: 14.13. External routines: ROOT (http://root.cern.ch/drupal/). Nature of problem: To calculate a frequentist confidence interval on the parameter of a Poisson process with statistical or systematic uncertainties in signal efficiency or background. Solution method: Profile likelihood method, analytical. Running time: <10 seconds per extracted limit.
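
    The construction can be sketched for one of the supported models: observed counts x ~ Poisson(e*mu + b) with a sideband y ~ Poisson(tau*b) constraining the background and known efficiency e. This is a hedged Python illustration of the profile likelihood method, not the TRolke implementation; all numbers are invented:

```python
import numpy as np
from scipy.optimize import brentq, minimize_scalar

x, y, tau, e = 8, 14, 5.0, 0.8          # signal counts, sideband, ratio, eff.

def logL(mu, b):
    s = e * mu + b
    return (x * np.log(s) - s + y * np.log(tau * b) - tau * b
            if s > 0 and b > 0 else -np.inf)

def profile(mu):
    """Log-likelihood at mu, maximized over the nuisance parameter b."""
    res = minimize_scalar(lambda b: -logL(mu, b), bounds=(1e-9, 50.0),
                          method="bounded")
    return -res.fun

mu_hat = max(0.0, (x - y / tau) / e)     # unconditional MLE of the signal
L_max = profile(mu_hat)

def q(mu):                               # profile likelihood-ratio statistic
    return 2.0 * (L_max - profile(mu))

# 90% CL upper limit: q(mu) crosses 2.706 (one-sided chi2, 1 dof)
upper = brentq(lambda mu: q(mu) - 2.706, mu_hat + 1e-6, 60.0)
print(f"mu_hat = {mu_hat:.2f}, 90% CL upper limit = {upper:.2f}")
```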

  16. INAA - application and limitation

    International Nuclear Information System (INIS)

    Heydorn, K.

    1990-01-01

    The uncertainties associated with performing Instrumental Neutron Activation Analysis (INAA) are discussed in relation to their category. The Comité International des Poids et Mesures (CIPM) distinguishes between uncertainties according to how their contribution to the overall uncertainty is evaluated. INAA is a potentially definitive method if all sources of uncertainty can be accounted for. The limitation of the method is reached when uncertainties that cannot be accounted for become significant and the method cannot be brought into statistical control. (orig.)

  17. PREFACE: Diagnostics for electrical discharge light sources: pushing the limits

    Science.gov (United States)

    Zissis, Georges; Haverlag, Marco

    2010-06-01

    Light sources play an indispensable role in the daily life of any human being. Quality of life, health and urban security related to traffic and crime prevention depend on light and on its quality. In fact, every day approximately 30 billion electric light sources operate worldwide. These electric light sources consume almost 19% of worldwide electricity production. Finding new ways to light lamps is a challenge where the stakes are scientific, technological, economic and environmental. The production of more efficient light sources is a sustainable solution for humanity. There are many opportunities for not only enhancing the efficiency and reliability of lighting systems but also for improving the quality of light as seen by the end user. This is possible through intelligent use of new technologies, deep scientific understanding of the operating principles of light sources and knowledge of the varied human requirements for different types of lighting in different settings. A revolution in the domain of light source technology is on the way: high brightness light emitting diodes arriving in the general lighting market, together with organic LEDs (OLEDs), are producing spectacular advances. However, unlike incandescence, electrical discharge lamps are far from disappearing from the market. In addition, new generations of discharge lamps based on molecular radiators are becoming a reality. There are still many scientific and technological challenges to be raised in this direction. Diagnostics are important for understanding the fundamental mechanisms taking place in the discharge plasma. This understanding is an absolute necessity for system optimization leading to more efficient and high quality light sources. The studied medium is rather complex, but new diagnostic techniques coupled to innovative ideas and powerful tools have been developed in recent years. This cluster issue of seven papers illustrates these efforts. The selected papers cover all domains, from

  18. ACCOUNTING, ENGINEERING, OR ADVERTISING? LIMITED LIABILITY, THE COMPANY PROSPECTUS, AND THE LANGUAGE OF UNCERTAINTY IN VICTORIAN BRITAIN

    Directory of Open Access Journals (Sweden)

    Wade E. Shilts

    2004-01-01

    Full Text Available This paper looks at a particularly puzzling historical example of delay in the use of the law: the under-use in Victorian Britain of the general incorporation statutes passed between 1844 and 1862. Comparison of the rhetoric of company prospectuses of 1824-1862 and 1898 suggests that uncertainty about the meaning of "incorporation with limited liability" among those who might have benefited from it may have persisted for decades following the statutes' passage. Continuing uncertainty meant continuing interpretation costs, and continuing interpretation costs meant insufficient interpretation: until each law user involved with an enterprise had interpreted and come to understand the rule's meaning, its benefits remained less than fully realized.

  19. New entropic uncertainty relations and tests of PMD-SQS-optimal limits in pion-nucleus scattering

    International Nuclear Information System (INIS)

    Ion, D.B.; Ion, M.L.

    2002-01-01

    In this paper we define a new kind of quantum entropy, namely, the nonextensivity-conjugated entropy S̄_Jθ(p,q). We then prove the optimal nonextensivity-conjugated entropic uncertainty relations (ONC-EUR) as well as the optimal nonextensivity-conjugated entropic uncertainty bands (ONC-EUB). The results of the first experimental test of ONC-EUB in pion-nucleus scattering, obtained by using 49 sets of experimental phase shift analyses, are presented. Strong evidence for the saturation of the PMD-SQS-optimum limit is thus obtained with high accuracy (confidence level > 99%) for the nonextensivities 1/2 ≤ p ≤ 2/3 and q = p/(2p-1). (authors)

  20. Validation and uncertainty quantification of detector response functions for a 1″×2″ NaI collimated detector intended for inverse radioisotope source mapping applications

    Science.gov (United States)

    Nelson, N.; Azmy, Y.; Gardner, R. P.; Mattingly, J.; Smith, R.; Worrall, L. G.; Dewji, S.

    2017-11-01

    deviation in most cases and good reduced chi-square values. The highest sources of uncertainty were identified as the energy calibration polynomial factor (due to limited source availability and NaI resolution) and the Ba-133 peak fit (only a very weak source was available), which were 20% and 10%, respectively.

  1. Evaluation of uncertainty sources and propagation from irradiance sensors to PV yield

    OpenAIRE

    Mariottini, Francesco; Gottschalg, Ralph; Betts, Tom; Zhu, Jiang

    2018-01-01

    This work quantifies the uncertainties of a pyranometer. The sensitivity to errors is analysed with regard to the effects generated by adopting different time resolutions. Estimation of the irradiance measurand and its error is extended across an annual data set. This study represents an attempt to provide a more exhaustive overview of both systematic (i.e. physical) and random uncertainties in the evaluation of pyranometer measurements. Starting from expanded uncertainty in a monitored ...

  2. Quantifying the uncertainties of China's emission inventory for industrial sources: From national to provincial and city scales

    Science.gov (United States)

    Zhao, Yu; Zhou, Yaduan; Qiu, Liping; Zhang, Jie

    2017-09-01

    A comprehensive uncertainty analysis was conducted on emission inventories for industrial sources at the national (China), provincial (Jiangsu), and city (Nanjing) scales for 2012. Based on various methods and data sources, Monte Carlo simulation was applied at the sector level for the national inventory, and at the plant level (whenever possible) for the provincial and city inventories. The uncertainties of the national inventory were estimated at -17-37% (expressed as 95% confidence intervals, CIs), -21-35%, -19-34%, -29-40%, -22-47%, -21-54%, -33-84%, and -32-92% for SO2, NOX, CO, TSP (total suspended particles), PM10, PM2.5, black carbon (BC), and organic carbon (OC) emissions, respectively, for the whole country. At the provincial and city levels, the uncertainties of the corresponding pollutant emissions were estimated at -15-18%, -18-33%, -16-37%, -20-30%, -23-45%, -26-50%, -33-79%, and -33-71% for Jiangsu, and -17-22%, -10-33%, -23-75%, -19-36%, -23-41%, -28-48%, -45-82%, and -34-96% for Nanjing, respectively. Emission factors (or associated parameters) were identified as the biggest contributors to the uncertainties of emissions for most source categories, except iron & steel production in the national inventory. Compared to the national inventory, the uncertainties of total emissions in the provincial and city-scale inventories were not significantly reduced for most species, with the exception of SO2. For power plants and other industrial boilers, the uncertainties were reduced, and plant-specific parameters played a more important role in the uncertainties. Much larger PM10 and PM2.5 emissions were estimated for Jiangsu in this provincial inventory than in other studies, implying big discrepancies in the data sources of emission factors and activity data between local and national inventories. Although the uncertainty analysis of bottom-up emission inventories at national and local scales partly supported the 'top-down' estimates using observation and/or chemistry transport models, detailed investigations and

  3. Automated gauge block pair length difference calibration and associated uncertainty sources

    International Nuclear Information System (INIS)

    Oliveira, W Jr; França, R S

    2015-01-01

    A reduction of interferometric uncertainties in the length difference of gauge block pairs is presented. An automated processing scheme designed to compensate for geometric fringe visualization effects and a four-alternate wringing technique are used to achieve small combined uncertainties for length difference calibrations, maintaining good compliance with the EAL-G21 determinations. (paper)

  4. Supersymmetry Breaking as a new source for the Generalized Uncertainty Principle

    OpenAIRE

    Faizal, Mir

    2016-01-01

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  5. Supersymmetry breaking as a new source for the generalized uncertainty principle

    Energy Technology Data Exchange (ETDEWEB)

    Faizal, Mir, E-mail: mirfaizalmir@gmail.com

    2016-06-10

    In this letter, we will demonstrate that the breaking of supersymmetry by a non-anticommutative deformation can be used to generate the generalized uncertainty principle. We will analyze the physical reasons for this observation, in the framework of string theory. We also discuss the relation between the generalized uncertainty principle and the Lee–Wick field theories.

  6. Statistically based uncertainty analysis for ranking of component importance in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Wilson, G.E.

    1992-01-01

    The Analytic Hierarchy Process (AHP) has been used to help determine the importance of components and phenomena in thermal-hydraulic safety analyses of nuclear reactors. The AHP results are based, in part, on expert opinion. Therefore, it is prudent to evaluate the uncertainty of the AHP ranks of importance. Prior applications have addressed uncertainty with experimental data comparisons and bounding sensitivity calculations. These methods work well when a sufficient experimental data base exists to justify the comparisons. However, in the case of limited or no experimental data, the size of the uncertainty is normally made conservatively large. Accordingly, the author has taken another approach, that of performing a statistically based uncertainty analysis. The new work is based on prior evaluations of the importance of components and phenomena in the thermal-hydraulic safety analysis of the Advanced Neutron Source Reactor (ANSR), a new facility now in the design phase. The uncertainty during large-break loss-of-coolant and decay heat removal scenarios is estimated by assigning a probability distribution function (pdf) to the potential error in the initial expert estimates of pair-wise importance between the components. Using a Monte Carlo sampling technique, the error pdfs are propagated through the AHP software solutions to determine a pdf of uncertainty in the system-wide importance of each component. To enhance the generality of the results, a study of one other problem having a different number of elements is reported, as are the effects of a larger assumed pdf error in the expert ranks. Validation of the Monte Carlo sample size and repeatability are also documented.
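
    The propagation step can be illustrated in a few lines: perturb the expert pairwise-comparison judgments of an AHP matrix, recompute the principal eigenvector each time, and examine the spread of the resulting importance weights. The 3x3 matrix and error model below are invented, not the ANSR study's inputs:

```python
import numpy as np

rng = np.random.default_rng(7)
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])          # expert pairwise comparisons

def weights(M):
    """Importance weights = normalized principal eigenvector."""
    vals, vecs = np.linalg.eig(M)
    v = np.abs(vecs[:, np.argmax(vals.real)].real)
    return v / v.sum()

draws = []
for _ in range(5000):
    P = A.copy()
    for i in range(3):                     # perturb upper triangle,
        for j in range(i + 1, 3):          # keep reciprocal symmetry
            P[i, j] = A[i, j] * rng.lognormal(0.0, 0.2)
            P[j, i] = 1.0 / P[i, j]
    draws.append(weights(P))
draws = np.array(draws)

for k, (m, s) in enumerate(zip(draws.mean(0), draws.std(0))):
    print(f"component {k}: weight = {m:.3f} +/- {s:.3f}")
```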

  7. Diagnostics for electrical discharge light sources : pushing the limits

    NARCIS (Netherlands)

    Zissis, G.; Haverlag, M.

    2010-01-01

    Light sources play an indispensable role in the daily life of any human being. Quality of life, health and urban security related to traffic and crime prevention depend on light and on its quality. In fact, every day approximately 30 billion electric light sources operate worldwide. These electric

  8. Communicating Uncertain Science to the Public: How Amount and Source of Uncertainty Impact Fatalism, Backlash, and Overload

    Science.gov (United States)

    Jensen, Jakob D.; Pokharel, Manusheela; Scherr, Courtney L.; King, Andy J.; Brown, Natasha; Jones, Christina

    2016-01-01

    Public dissemination of scientific research often focuses on the finding (e.g., nanobombs kill lung cancer) rather than the uncertainty/limitations (e.g., in mice). Adults (N = 880) participated in an experiment in which they read a manipulated news report about cancer research (a) that contained either low or high uncertainty and (b) in which the uncertainty was attributed either to the scientists responsible for the research (disclosure condition) or to an unaffiliated scientist (dueling condition). Compared to the dueling condition, the disclosure condition triggered less prevention-focused cancer fatalism and nutritional backlash. PMID:26973157

  9. Strategies for source space limitation in tomographic inverse procedures

    International Nuclear Information System (INIS)

    George, J.S.; Lewis, P.S.; Schlitt, H.A.; Kaplan, L.; Gorodnitsky, I.; Wood, C.C.

    1994-01-01

    The use of magnetic recordings for localization of neural activity requires the solution of an ill-posed inverse problem: i.e. the determination of the spatial configuration, orientation, and timecourse of the currents that give rise to a particular observed field distribution. In its general form, this inverse problem has no unique solution; due to superposition and the existence of silent source configurations, a particular magnetic field distribution at the head surface could be produced by any number of possible source configurations. However, by making assumptions concerning the number and properties of neural sources, it is possible to use numerical minimization techniques to determine the source model parameters that best account for the experimental observations while satisfying numerical or physical criteria. In this paper the authors describe progress on the development and validation of inverse procedures that produce distributed estimates of neuronal currents. The goal is to produce a temporal sequence of 3-D tomographic reconstructions of the spatial patterns of neural activation. Such approaches have a number of advantages, in principle. Because they do not require estimates of model order and parameter values (beyond specification of the source space), they minimize the influence of investigator decisions and are suitable for automated analyses. These techniques also allow localization of sources that are not point-like; experimental studies of cognitive processes and of spontaneous brain activity are likely to require distributed source models
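
    One standard distributed-source estimator of the kind discussed above is the Tikhonov-regularized minimum-norm solution: with a leadfield L mapping many candidate source currents to few sensors, j = Lᵀ(LLᵀ + λI)⁻¹b is the smallest-norm current pattern consistent with the measured fields b. A toy sketch with a random stand-in leadfield, not a physical head model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 32, 400
L = rng.normal(0.0, 1.0, (n_sensors, n_sources))   # stand-in leadfield

j_true = np.zeros(n_sources)
j_true[50] = 1.0                                   # one active source
b = L @ j_true + rng.normal(0.0, 0.05, n_sensors)  # noisy field map

lam = 1.0                                          # regularization strength
G = L @ L.T + lam * np.eye(n_sensors)
j_est = L.T @ np.linalg.solve(G, b)                # minimum-norm estimate

# The estimate is spread over many sources but typically peaks at or near
# the simulated source index.
top = np.argsort(np.abs(j_est))[::-1][:3]
print("true source index: 50 | largest |j| at:", top)
```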

  10. 75 FR 10438 - Effluent Limitations Guidelines and Standards for the Construction and Development Point Source...

    Science.gov (United States)

    2010-03-08

    ... Effluent Limitations Guidelines and Standards for the Construction and Development Point Source Category... technology-based Effluent Limitations Guidelines and New Source Performance Standards for the Construction... technology-based Effluent Limitations Guidelines and New Source Performance Standards for the Construction...

  11. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    Science.gov (United States)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase of flood damages in the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents crucial information within the overall flood risk management process. The present paper proposes an open-source software, called FloodRisk, that is able to operatively support stakeholders in decision-making processes with a what-if approach by carrying out a rapid assessment of the flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of the damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that the quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. By the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study have exhibited a high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because no depth-damage functions have been developed specifically for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In the light of the obtained results, the need to produce and disseminate (open) data to develop micro-scale vulnerability curves is evident. Moreover, the urgent need to push
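
    A schematic of the kind of sensitivity analysis described above: epistemic uncertainty in a depth-damage curve is propagated by Monte Carlo to the total direct damage. The curve form, its parameter distribution and the exposure value are invented for illustration and are not FloodRisk's actual functions:

```python
import numpy as np

rng = np.random.default_rng(2)
depths = rng.gamma(2.0, 0.5, size=1000)      # simulated inundation depths (m)
exposure = 150_000.0                         # assumed value per building (EUR)

def damage_fraction(depth, d_half):
    """Simple saturating depth-damage curve; d_half is the depth at 50% damage."""
    return depth / (depth + d_half)

totals = []
for _ in range(5000):
    d_half = rng.normal(1.5, 0.4)            # epistemic uncertainty in the curve
    if d_half <= 0:
        continue                             # discard non-physical draws
    totals.append(damage_fraction(depths, d_half).sum() * exposure)

lo, hi = np.percentile(totals, [5, 95])
print(f"total damage 90% interval: {lo:,.0f} - {hi:,.0f} EUR")
```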

  12. Sources for high frequency heating. Performance and limitations

    International Nuclear Information System (INIS)

    Le Gardeur, R.

    1976-01-01

    The various problems encountered in high frequency heating of plasmas can be decomposed into three spheres of action: theoretical development, antenna design, and utilization of power sources. By classifying heating into three spectral domains, present and future needs are enumerated. Several specific antenna designs are treated. High frequency power sources are reviewed. The current state of gyrotron development is discussed in view of future needs in very high frequency heating of plasmas [fr

  13. Application of Interval Arithmetic in the Evaluation of Transfer Capabilities by Considering the Sources of Uncertainty

    Directory of Open Access Journals (Sweden)

    Prabha Umapathy

    2009-01-01

    Full Text Available Total transfer capability (TTC) is an important index in a power system with large volumes of inter-area power exchanges. This paper proposes a novel technique to determine the TTC and its confidence intervals in the system by considering the uncertainties in the load and line parameters. The optimal power flow (OPF) method is used to obtain the TTC. Variations in the load and line parameters are incorporated using the interval arithmetic (IA) method. The IEEE 30-bus test system is used to illustrate the proposed methodology. Various uncertainties in the line, the load, and both line and load are incorporated in the evaluation of total transfer capability. From the results, it is observed that the solutions obtained through the proposed method provide much wider information in closed-interval form, which is more useful in ensuring secure operation of the interconnected system in the presence of uncertainties in load and line parameters.
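
    The core of the IA method can be illustrated with a bare-bones interval class; the toy transfer expression below is only a stand-in for the OPF-based TTC computation in the paper:

```python
# Bare-bones interval arithmetic: each operation returns the tightest
# interval guaranteed to contain all possible results.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))
    def __repr__(self):
        return f"[{self.lo:.3f}, {self.hi:.3f}]"

# Toy example: transfer P = V1*V2*B with uncertain voltages and susceptance
# (per-unit values assumed for illustration).
V1 = Interval(0.98, 1.02)
V2 = Interval(0.95, 1.00)
B  = Interval(4.8, 5.2)
print("transfer capability bounds:", V1 * V2 * B)
```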

  14. Uncertainty quantification and error analysis

    Energy Technology Data Exchange (ETDEWEB)

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  15. Advanced numerical methods for uncertainty reduction when predicting heat exchanger dynamic stability limits: Review and perspectives

    International Nuclear Information System (INIS)

    Longatte, E.; Baj, F.; Hoarau, Y.; Braza, M.; Ruiz, D.; Canteneur, C.

    2013-01-01

    Highlights: ► Proposal of hybrid computational methods for investigating dynamical system stability. ► Modeling turbulence disequilibrium due to interaction with moving solid boundaries. ► Providing computational procedure for large size system solution approximation through model reduction. -- Abstract: This article proposes a review of recent and current developments in the modeling and advanced numerical methods used to simulate large-size systems involving multi-physics in the field of mechanics. It addresses the complex issue of stability analysis of dynamical systems submitted to external turbulent flows and aims to establish accurate stability maps applicable to heat exchanger design. The purpose is to provide dimensionless stability limit modeling that is suitable for a variety of configurations and is as accurate as possible in spite of the large scale of the systems to be considered. The challenge lies in predicting local effects that may impact global systems. A combination of several strategies that are suited concurrently to multi-physics, multi-scale and large-size system computation is therefore required. Based on empirical concepts, the heuristic models currently used in the framework of standard stability analysis suffer from a lack of predictive capabilities. On the other hand, numerical approaches based on fully-coupled fluid–solid dynamics system computation remain expensive due to the multi-physics nature of the problem and the large number of degrees of freedom involved. In this context, since experimentation cannot be achieved and numerical simulation is unavoidable but prohibitive, a hybrid strategy is proposed in order to take advantage of both numerical local solutions and empirical global solutions.

  16. Alpine grassland soil organic carbon stock and its uncertainty in the three rivers source region of the Tibetan Plateau.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Chang

    Full Text Available Alpine grassland of the Tibetan Plateau is an important component of global soil organic carbon (SOC) stocks, but insufficient field observations and large spatial heterogeneity lead to great uncertainty in their estimation. In the Three Rivers Source Region (TRSR), alpine grasslands account for more than 75% of the total area. However, the regional carbon (C) stock estimate and its uncertainty have seldom been tested. Here we quantified the regional SOC stock and its uncertainty using 298 soil profiles surveyed from 35 sites across the TRSR during 2006-2008. We showed that the upper soil (0-30 cm depth) in alpine grasslands of the TRSR stores 2.03 Pg C, with a 95% confidence interval ranging from 1.25 to 2.81 Pg C. Alpine meadow soils comprised 73% (i.e. 1.48 Pg C) of the regional SOC estimate, but had the greatest uncertainty at 51%. The statistical power to detect a deviation of 10% uncertainty in grassland C stock was less than 0.50. The required sample size to detect this deviation at a power of 90% was about 6-7 times greater than the number of sample sites surveyed. Comparison of our observed SOC density with the corresponding values from the dataset of Yang et al. indicates that the two datasets are comparable. The combined dataset did not reduce the uncertainty in the estimate of the regional grassland soil C stock. This result can mainly be explained by the underrepresentation of sampling sites in large areas with poor accessibility. Further research to improve the regional SOC stock estimate should optimize the sampling strategy by considering the number of samples and their spatial distribution.
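
    A back-of-envelope version of the sample-size question raised above, using the standard normal-approximation formula; the coefficient of variation is an assumed value, not the TRSR estimate:

```python
from scipy import stats

cv = 0.6            # assumed coefficient of variation of SOC density
delta = 0.10        # relative change to detect
alpha, power = 0.05, 0.90

# Two-sided one-sample approximation: n = ((z_a + z_b) * cv / delta)^2
z_a = stats.norm.ppf(1 - alpha / 2)
z_b = stats.norm.ppf(power)
n = ((z_a + z_b) * cv / delta) ** 2
print(f"required sample size: about {n:.0f} sites")
```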

  17. A study on the assessment of safety culture impacts on risk of nuclear power plants using common uncertainty source model

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Bang, Young Suk; Chung, Chang Hyun; Jeong, Ji Hwan

    2004-01-01

    Since the International Nuclear Safety Advisory Group (INSAG) introduced the term 'safety culture', it has been widely recognized that safety culture plays an important role in the safety of nuclear power plants. Research on safety culture can be divided into the following two parts: 1) assessment of safety culture (by interview, questionnaire, etc.); 2) assessment of the link between safety culture and the safety of nuclear power plants. There is a substantial body of literature that addresses the first part, but there is much less work that addresses the second. To address the second part, most work has focused on the development of models incorporating safety culture into Probabilistic Safety Assessment (PSA). One of the most advanced methodologies in the area of incorporating safety culture quantitatively into PSA is the System Dynamics (SD) model developed by Kwak et al. It can show interactions among various factors which affect employees' productivity and job quality. Various situations in a nuclear power plant can also be simulated and time-dependent risk recalculated with this model. But this model does not consider minimal cut set (MCS) dependency or the uncertainty of risk. Another well-known methodology is the Work Process Analysis Model (WPAM) developed by Davoudian. It considers MCS dependency by modifying conditional probability values using the SLI methodology. But we found that the modified conditional probability values in WPAM are somewhat artificial and have no sound basis. WPAM tends to overestimate the conditional probability of hardware failure, because it uses the SLI methodology, which is normally used in Human Reliability Analysis (HRA). WPAM also does not consider the uncertainty of risk. In this study, we propose a methodology to incorporate safety culture into PSA quantitatively that can deal with MCS dependency and the uncertainty of risk by applying the Common Uncertainty Source (CUS) model developed by Zhang. A CUS is an uncertainty source that is common to basic events, and this can be physical

  18. Limitation of population's irradiation by natural sources of ionizing radiation

    International Nuclear Information System (INIS)

    Krisyuk, Eh.M.

    1989-01-01

    A review of works devoted to evaluating human irradiation doses from the main sources of ionizing radiation is given. It is shown that human irradiation doses from DDP can be reduced by a factor of 10 or more. However, to realize such measures it is necessary to study the efficiency and determine the cost of various protective activities, as well as to develop criteria for deciding when their realization is necessary

  19. Quantitative identification of nitrate pollution sources and uncertainty analysis based on dual isotope approach in an agricultural watershed.

    Science.gov (United States)

    Ji, Xiaoliang; Xie, Runting; Hao, Yun; Lu, Jun

    2017-10-01

    Quantitative identification of nitrate (NO3⁻-N) sources is critical to the control of nonpoint source nitrogen pollution in an agricultural watershed. Combined with water quality monitoring, we adopted the environmental isotope (δD-H2O, δ18O-H2O, δ15N-NO3⁻, and δ18O-NO3⁻) analysis and the Markov Chain Monte Carlo (MCMC) mixing model to determine the proportions of riverine NO3⁻-N inputs from four potential NO3⁻-N sources, namely, atmospheric deposition (AD), chemical nitrogen fertilizer (NF), soil nitrogen (SN), and manure and sewage (M&S), in the ChangLe River watershed of eastern China. Results showed that NO3⁻-N was the main form of nitrogen in this watershed, accounting for approximately 74% of the total nitrogen concentration. A strong hydraulic interaction existed between the surface and groundwater for NO3⁻-N pollution. The variations of the isotopic composition in NO3⁻-N suggested that microbial nitrification was the dominant nitrogen transformation process in surface water, whereas significant denitrification was observed in groundwater. MCMC mixing model outputs revealed that M&S was the predominant contributor to riverine NO3⁻-N pollution (contributing 41.8% on average), followed by SN (34.0%), NF (21.9%), and AD (2.3%) sources. Finally, we constructed an uncertainty index, UI90, to quantitatively characterize the uncertainties inherent in NO3⁻-N source apportionment and discussed the reasons behind the uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
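
    To make the mixing calculation concrete, the sketch below solves a four-source, two-tracer problem by simple rejection (ABC) sampling in place of the paper's MCMC model; all end-member signatures and the river sample are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed end-member signatures: rows = AD, NF, SN, M&S; cols = d15N, d18O.
sources = np.array([[ 2.0, 60.0],
                    [ 0.0,  0.0],
                    [ 5.0,  5.0],
                    [12.0,  3.0]])
mixture = np.array([7.0, 6.0])         # hypothetical river sample
tol = 1.0                              # acceptance tolerance (per mil)

accepted = []
for _ in range(200_000):
    f = rng.dirichlet(np.ones(4))      # uninformative prior on proportions
    if np.all(np.abs(f @ sources - mixture) < tol):
        accepted.append(f)

post = np.array(accepted)
print(f"accepted {len(post)} samples")
for name, m in zip(["AD", "NF", "SN", "M&S"], post.mean(0)):
    print(f"{name}: mean proportion {m:.2f}")
```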

  20. The potential and limitations of third generation light sources

    International Nuclear Information System (INIS)

    Hormes, Josef

    2011-01-01

    To date, 3rd generation Light Sources, i.e. electron storage rings where mainly radiation from insertion devices (wigglers and undulators) is used for synchrotron radiation experiments, are the 'workhorses' for basic and applied VUV/X-ray research. Several machine parameters, i.e. the energy of the electrons, the emittance and the circumference of the machine, together with the specifications of the corresponding insertion devices, determine the 'quality' of a facility and a specific beamline. In this talk, several of these aspects are discussed mainly from a user's point of view, i.e. what are the required specifications to carry out 'state-of-the-art' experiments in various areas, e.g. protein crystallography, Resonant Elastic and Inelastic X-ray Scattering (REIXS), micro-/nanospectroscopy, and time-resolved experiments in the femtosecond time domain. (author)

  1. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also supports the decision of whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies that have been developed for this purpose are briefly described in this chapter. In addition, the different possibilities to take the uncertainty into account in compliance assessment are explained.
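
    A minimal numerical example of the two decisions mentioned above, combining standard uncertainties by root-sum-of-squares and comparing the expanded uncertainty against a specification limit; all values are invented:

```python
import math

x = 49.2                          # measured concentration (mg/L)
u_components = [0.4, 0.3, 0.15]   # e.g. repeatability, calibration, recovery

# Combined standard uncertainty (root-sum-of-squares) and expanded
# uncertainty with coverage factor k = 2.
u_c = math.sqrt(sum(u**2 for u in u_components))
U = 2 * u_c

limit = 50.0
print(f"result: {x} +/- {U:.2f} mg/L (k=2)")
print("limit exceeded beyond doubt" if x - U > limit else
      "compliance decision must account for uncertainty" if x + U > limit else
      "clearly below limit")
```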

  2. Autonomous choices among deterministic evolution-laws as source of uncertainty

    Science.gov (United States)

    Trujillo, Leonardo; Meyroneinc, Arnaud; Campos, Kilver; Rendón, Otto; Sigalotti, Leonardo Di G.

    2018-03-01

    We provide evidence of an extreme form of sensitivity to initial conditions in a family of one-dimensional self-ruling dynamical systems. We prove that some hyperchaotic sequences are closed-form expressions of the orbits of these pseudo-random dynamical systems. Each chaotic system in this family exhibits a sensitivity to initial conditions that encompasses the sequence of choices of the evolution rule in some collection of maps. This opens a possibility to extend current theories of complex behaviors on the basis of intrinsic uncertainty in deterministic chaos.

  3. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John

    and the hydraulic gradient across the control plane and are consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox...... transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty...
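
    The final combination step can be sketched as follows: flux and concentration realizations over a discretized control plane are multiplied and summed, and the ensemble yields a mass-discharge distribution. The lognormal random fields below are simplistic stand-ins for the geostatistical flow and concentration simulations:

```python
import numpy as np

rng = np.random.default_rng(4)
n_cells, cell_area = 200, 0.25        # control-plane discretization (m^2 per cell)

md = []
for _ in range(5000):
    q = rng.lognormal(-11.0, 0.8, n_cells)   # Darcy flux per cell (m/s), assumed
    c = rng.lognormal(0.5, 1.2, n_cells)     # concentration per cell (g/m^3), assumed
    md.append(np.sum(q * c) * cell_area * 86400.0)   # mass discharge in g/day

md = np.array(md)
print(f"mass discharge: median {np.median(md):.2f} g/d, "
      f"90% interval [{np.percentile(md, 5):.2f}, {np.percentile(md, 95):.2f}] g/d")
```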

  4. Uncertainty sources in radiopharmaceuticals clinical studies; Fontes de incertezas em estudos clinicos com radiofarmacos

    Energy Technology Data Exchange (ETDEWEB)

    Degenhardt, Aemilie Louize; Oliveira, Silvia Maria Velasques de, E-mail: silvia@cnen.gov.br, E-mail: amilie@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    The radiopharmaceuticals should be approved for consumption by evaluating their quality, safety and efficacy. Clinical studies are designed to verify the pharmacodynamics, pharmacological and clinical effects in humans and are required for assuring safety and efficacy. The Bayesian analysis has been used for clinical studies effectiveness evaluation. This work aims to identify uncertainties associated with the process of production of the radionuclide and radiopharmaceutical labelling as well as the radiopharmaceutical administration and scintigraphy images acquisition and processing. For the development of clinical studies in the country, the metrological chain shall assure the traceability of the surveys performed in all phases. (author)

  5. Application of probability distributions for quantifying uncertainty in radionuclide source terms for Seabrook risk assessment

    International Nuclear Information System (INIS)

    Walker, D.H.; Savin, N.L.

    1985-01-01

    The calculational models developed for the Reactor Safety Study (RSS) have traditionally been used to generate 'point estimate values' for radionuclide release to the environment for nuclear power plant risk assessments. The point estimate values so calculated are acknowledged by most knowledgeable individuals to be conservatively high. Further, recent evaluations of the overall uncertainties in the various components that make up risk estimates for nuclear electric generating stations show that one of the large uncertainties is associated with the magnitude of the radionuclide release to the environment. In the approach developed for the RSS, values for fission product release from the fuel are derived from data obtained from small experiments. A reappraisal of the RSS release fractions was published in 1981 in NUREG-0772. Estimates of fractional releases from fuel are similar to those of the RSS. In the RSS approach, depletion during transport from the core (where the fission products are released) to the containment is assumed to be zero for calculation purposes. In the containment, the CORRAL code is applied to calculate radioactivity depletion by containment processes and to calculate the quantity and timing of release to the environment

  6. Renewable energy sources project appraisal under uncertainty: the case of wind energy exploitation within a changing energy market environment

    International Nuclear Information System (INIS)

    Venetsanos, K.; Angelopoulou, P.; Tsoutsos, T.

    2002-01-01

    There are four elements which contribute to the coming increase of electricity demand: climate change, the expected growth rates of EU Member State economies, changes in consumption patterns and the introduction of new technologies. The new deregulated electricity market is expected to respond to this challenge, and the energy supply will be adequate and cost effective within this new environment, which offers promising opportunities for power producers, both existing and newcomers. In this paper a framework for the appraisal of power projects under uncertainty within a competitive market environment is identified, focusing on electricity from renewable energy sources. To this end, wind energy-to-electricity production in Greece serves as a case study. The subject matter is centred on the following areas: the uncertainties within the new deregulated energy market; the evaluation methods, including an analysis of the uncertainties introduced after deregulation and a new approach to project evaluation using real options; and a comparison of the valuation methodologies within the new environment, drawing on the case of Greece. (author)

  7. Sources of errors and uncertainties in the assessment of forest soil carbon stocks at different scales

    DEFF Research Database (Denmark)

    Vanguelova, E. I.; Bonifacio, E.; De Vos, B.

    2016-01-01

    temporal changes and spatial differences in SOC. This requires sufficiently detailed data to predict SOC stocks at appropriate scales within the required accuracy so that only significant changes are accounted for. When designing sampling campaigns, taking into account factors influencing SOC spatial...... and temporal distribution (such as soil type, topography, climate and vegetation) are needed to optimise sampling depths and numbers of samples, thereby ensuring that samples accurately reflect the distribution of SOC at a site. Furthermore, the appropriate scales related to the research question need...... to be defined: profile, plot, forests, catchment, national or wider. Scaling up SOC stocks from point sample to landscape unit is challenging, and thus requires reliable baseline data. Knowledge of the associated uncertainties related to SOC measures at each particular scale and how to reduce them is crucial...

  8. Uncertainty about the true source. A note on the likelihood ratio at the activity level.

    Science.gov (United States)

    Taroni, Franco; Biedermann, Alex; Bozza, Silvia; Comte, Jennifer; Garbolino, Paolo

    2012-07-10

    This paper focuses on likelihood ratio based evaluations of fibre evidence in cases in which there is uncertainty about whether or not the reference item available for analysis - that is, an item typically taken from the suspect or seized at his home - is the item actually worn at the time of the offence. A likelihood ratio approach is proposed that, for situations in which certain categorical assumptions can be made about additionally introduced parameters, converges to formulae described in the existing literature. The properties of the proposed likelihood ratio approach are analysed through sensitivity analyses and discussed with respect to possible argumentative implications that arise in practice. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  9. Reliability of Coulomb stress changes inferred from correlated uncertainties of finite-fault source models

    KAUST Repository

    Woessner, J.; Jonsson, Sigurjon; Sudhaus, H.; Baumann, C.

    2012-01-01

    Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due

  10. FIRST EXPERIMENTAL RESULTS FROM DEGAS, THE QUANTUM LIMITED BRIGHTNESS ELECTRON SOURCE

    International Nuclear Information System (INIS)

    Zolotorev, Max S.; Commins, Eugene D.; Oneill, James; Sannibale, Fernando; Tremsin, Anton; Wan, Weishi

    2008-01-01

    The construction of DEGAS (DEGenerate Advanced Source), a proof of principle for a quantum limited brightness electron source, has been completed at the Lawrence Berkeley National Laboratory. The commissioning and the characterization of this source, designed to generate coherent single electron 'bunches' with brightness approaching the quantum limit at a repetition rate of few MHz, has been started. In this paper the first experimental results are described

  11. Carbon source-sink limitations differ between two species with contrasting growth strategies.

    Science.gov (United States)

    Burnett, Angela C; Rogers, Alistair; Rees, Mark; Osborne, Colin P

    2016-11-01

    Understanding how carbon source and sink strengths limit plant growth is a critical knowledge gap that hinders efforts to maximize crop yield. We investigated how differences in growth rate arise from source-sink limitations, using a model system comparing a fast-growing domesticated annual barley (Hordeum vulgare cv. NFC Tipple) with a slow-growing wild perennial relative (Hordeum bulbosum). Source strength was manipulated by growing plants at sub-ambient and elevated CO 2 concentrations ([CO 2 ]). Limitations on vegetative growth imposed by source and sink were diagnosed by measuring relative growth rate, developmental plasticity, photosynthesis and major carbon and nitrogen metabolite pools. Growth was sink limited in the annual but source limited in the perennial. RGR and carbon acquisition were higher in the annual, but photosynthesis responded weakly to elevated [CO 2 ] indicating that source strength was near maximal at current [CO 2 ]. In contrast, photosynthetic rate and sink development responded strongly to elevated [CO 2 ] in the perennial, indicating significant source limitation. Sink limitation was avoided in the perennial by high sink plasticity: a marked increase in tillering and root:shoot ratio at elevated [CO 2 ], and lower non-structural carbohydrate accumulation. Alleviating sink limitation during vegetative development could be important for maximizing growth of elite cereals under future elevated [CO 2 ]. © 2016 John Wiley & Sons Ltd.

  12. Upper limits on the total cosmic-ray luminosity of individual sources

    Energy Technology Data Exchange (ETDEWEB)

    Anjos, R.C.; De Souza, V. [Instituto de Física de São Carlos, Universidade de São Paulo, São Paulo (Brazil); Supanitsky, A.D., E-mail: rita@ifsc.usp.br, E-mail: vitor@ifsc.usp.br, E-mail: supanitsky@iafe.uba.ar [Instituto de Astronomía y Física del Espacio (IAFE), CONICET-UBA, Buenos Aires (Argentina)

    2014-07-01

    In this paper, upper limits on the total luminosity of ultra-high-energy cosmic rays (UHECR, E > 10^18 eV) are determined for five individual sources. The upper limit on the integral flux of GeV–TeV gamma-rays is used to extract the upper limit on the total UHECR luminosity of individual sources. The correlation between the upper limit on the integral GeV–TeV gamma-ray flux and the upper limit on the UHECR luminosity is established through the cascading process that takes place during propagation of the cosmic rays in the background radiation fields, as explained in reference [1]. Twenty-eight sources measured by the FERMI-LAT, VERITAS and MAGIC observatories have been studied. The measured upper limit on the GeV–TeV gamma-ray flux is restrictive enough to allow the calculation of an upper limit on the total UHECR luminosity of five sources. The upper limit on the UHECR luminosity of these sources is shown for several assumptions on the emission mechanism. For all studied sources an upper limit on the ultra-high-energy proton luminosity is also set.

  13. Upper limits on the total cosmic-ray luminosity of individual sources

    International Nuclear Information System (INIS)

    Anjos, R.C.; De Souza, V.; Supanitsky, A.D.

    2014-01-01

    In this paper, upper limits on the total luminosity of ultra-high-energy cosmic rays (UHECR, E > 10^18 eV) are determined for five individual sources. The upper limit on the integral flux of GeV–TeV gamma-rays is used to extract the upper limit on the total UHECR luminosity of individual sources. The correlation between the upper limit on the integral GeV–TeV gamma-ray flux and the upper limit on the UHECR luminosity is established through the cascading process that takes place during propagation of the cosmic rays in the background radiation fields, as explained in reference [1]. Twenty-eight sources measured by the FERMI-LAT, VERITAS and MAGIC observatories have been studied. The measured upper limit on the GeV–TeV gamma-ray flux is restrictive enough to allow the calculation of an upper limit on the total UHECR luminosity of five sources. The upper limit on the UHECR luminosity of these sources is shown for several assumptions on the emission mechanism. For all studied sources an upper limit on the ultra-high-energy proton luminosity is also set

  14. Precipitation observations for operational flood forecasting in Scotland: Data availability, limitations and the impact of observational uncertainty

    Science.gov (United States)

    Parry, Louise; Neely, Ryan, III; Bennett, Lindsay; Collier, Chris; Dufton, David

    2017-04-01

    The Scottish Environment Protection Agency (SEPA) has a statutory responsibility to provide flood warning across Scotland. It achieves this through an operational partnership with the UK Met Office wherein meteorological forecasts are applied to a national distributed hydrological model, Grid-to-Grid (G2G), and catchment-specific lumped PDM models. Both of these model types rely on observed precipitation input for model development and calibration, and operationally for historical runs to generate initial conditions. Scotland has an average annual precipitation of 1430 mm per annum (1971-2000), but the spatial variability in totals is high, predominantly in relation to the topography and prevailing winds, which poses different challenges to both radar and point-measurement methods of observation. In addition, the high elevations mean that in winter a significant proportion of precipitation falls as snow. For the operational forecasting models, observed rainfall data is provided in Near Real Time (NRT) from SEPA's network of approximately 260 telemetered TBR gauges and 4 UK Met Office C-band radars. Both data sources have their strengths and weaknesses, particularly in relation to the orography and spatial representativeness, but estimates of rainfall from the two methods can vary greatly. Northern Scotland, particularly near Inverness, is a comparatively sparsely covered part of the radar network. Rainfall totals and distribution in this area are determined by the North West Highlands and Cairngorms mountain ranges, which also have a negative impact on radar observations. In recognition of this issue, the NCAS mobile X-band weather radar (MXWR) was deployed in this area between February and August 2016. This study presents a comparison of rainfall estimates for the Inverness and Moray Firth region generated from the operational radar network, the TBR network, and the MXWR. Quantitative precipitation estimates (QPEs) from both sources of radar data were compared to

  15. Projecting the potential evapotranspiration by coupling different formulations and input data reliabilities: The possible uncertainty source for climate change impacts on hydrological regime

    Science.gov (United States)

    Wang, Weiguang; Li, Changni; Xing, Wanqiu; Fu, Jianyu

    2017-12-01

    Representing the atmospheric evaporating capability for a hypothetical reference surface, potential evapotranspiration (PET) determines the upper limit of actual evapotranspiration and is an important input to hydrological models. Because present climate models do not give direct estimates of PET when simulating the hydrological response to future climate change, PET must be estimated first and is subject to uncertainty on account of the many existing formulae and different input data reliabilities. Using four different PET estimation approaches, i.e., the more physically based Penman (PN) equation with less reliable input variables, the more empirical radiation-based Priestley-Taylor (PT) equation with relatively dependable downscaled data, the simplest temperature-based Hamon (HM) equation with the most reliable downscaled variable, and downscaling PET directly by the statistical downscaling model, this paper investigates the differences in runoff projection caused by the alternative PET methods using a well-calibrated abcd monthly hydrological model. Three catchments, i.e., the Luanhe River Basin, the Source Region of the Yellow River and the Ganjiang River Basin, representing a large climatic diversity, were chosen as examples to illustrate this issue. The results indicate that although the four methods provide similar monthly patterns of PET over the period 2021-2050 for each catchment, the magnitudes of PET still differ slightly, especially for spring and summer months in the Luanhe River Basin and the Source Region of the Yellow River with their relatively dry climate. The apparent discrepancy in the magnitude of change in future runoff, and even the diverse change directions for summer months in the Luanhe River Basin and spring months in the Source Region of the Yellow River, indicate that PET-method-related uncertainty occurs, especially in the Luanhe River Basin and the Source Region of the Yellow River with smaller aridity index. Moreover, the
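
    For reference, one common form of the simplest of the four approaches, the temperature-based Hamon equation, is sketched below; this illustrates the general formulation, not necessarily the exact variant used in the paper:

```python
import math

def hamon_pet(t_mean_c, daylength_h):
    """Hamon PET (mm/day): 29.8 * daylength * e_sat / (T + 273.2)."""
    # Saturation vapour pressure in kPa (Magnus-type formula).
    e_sat = 0.611 * math.exp(17.27 * t_mean_c / (t_mean_c + 237.3))
    return 29.8 * daylength_h * e_sat / (t_mean_c + 273.2)

print(f"PET at 20 C, 12 h daylight: {hamon_pet(20.0, 12.0):.2f} mm/day")
```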

  16. Uncertainty Source of Modeled Ecosystem Productivity in East Asian Monsoon Region: A Traceability Analysis

    Science.gov (United States)

    Cui, E.; Xia, J.; Huang, K.; Ito, A.; Arain, M. A.; Jain, A. K.; Poulter, B.; Peng, C.; Hayes, D. J.; Ricciuto, D. M.; Huntzinger, D. N.; Tian, H.; Mao, J.; Fisher, J.; Schaefer, K. M.; Huang, M.; Peng, S.; Wang, W.

    2017-12-01

    The East Asian monsoon region, which benefits from sufficient water-heat availability and increasing nitrogen deposition, exhibits significantly higher net ecosystem productivity than the same latitudes of Europe-Africa and North America. A better understanding of the major contributions to the uncertainties of the terrestrial carbon cycle in this region is of great importance for evaluating the global carbon balance. This study analyzed the key carbon processes and parameters derived from a series of terrestrial biosphere models. A wide range of inter-model disagreement on GPP was found in China's subtropical regions. This large difference was then traced to a few traceable components of the terrestrial carbon cycle. The increase in ensemble-mean GPP over 1901-2010 resulted predominantly from increasing atmospheric CO2 concentration and nitrogen deposition, while the frequent land-use change over this region showed a slightly negative effect on GPP. However, inter-model differences in GPP were mainly attributed to the baseline simulations without changes in external forcing. According to the variance decomposition, the large spread in simulated GPP was well explained by the differences in leaf area index (LAI) and specific leaf area (SLA) among models. In addition, the underlying errors in simulated GPP propagate through the model and introduce additional errors to the simulation of NPP and biomass. By comparing the simulations with satellite-derived, data-oriented and observation-based datasets, we further found that GPP, vegetation carbon turnover time, aboveground biomass, LAI and SLA were all overestimated in most of the models, while biomass distribution in leaves was significantly underestimated. The results of this study indicate that model performance on ecosystem productivity in the East Asian monsoon region can be improved by a more realistic representation of leaf functional traits.

  17. Methane Flux Estimation from Point Sources using GOSAT Target Observation: Detection Limit and Improvements with Next Generation Instruments

    Science.gov (United States)

    Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.

    2017-12-01

    Atmospheric methane (CH4) plays an important role in global radiative forcing of climate, but its emission estimates have larger uncertainties than those of carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sunlight reflected from the earth's surface. It has an agile pointing system and its footprint can cover 87 km2 with a single detector. By specifying pointing angles and observation time for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 days over years. We selected a reference point that represents the CH4 background density before or after targeting a point source. By combining the satellite-measured enhancement of the CH4 column density with surface-measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4 emission amounts. Here, we selected two sites on the US West Coast, where clear-sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present a time series of flux estimates assuming the source is a single point without influx. The observation of the cattle feedlot in Chino, California has a weather station within the TANSO-FTS footprint. The wind speed is monitored continuously and the wind direction is stable at the time of the GOSAT overpass. The large TANSO-FTS footprint and strong wind decrease the enhancement below the noise level. Weak wind shows enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using time series of satellite data. We will propose that the next generation instruments for accurate anthropogenic CO2 and CH
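
    The single-point-source flux estimate described above reduces, in its simplest mass-balance form, to the product of the column enhancement, the wind speed and a cross-wind length scale; the numbers below are hypothetical, not the Aliso Canyon or Chino retrievals:

```python
# Simplified mass-balance flux estimate from a column enhancement and wind.
delta_xch4 = 0.020      # column enhancement over background (g CH4 per m^2), assumed
wind_speed = 3.0        # wind speed at the time of overpass (m/s), assumed
plume_width = 9000.0    # effective cross-wind footprint dimension (m), assumed

flux_g_per_s = delta_xch4 * wind_speed * plume_width
print(f"estimated source strength: {flux_g_per_s * 86400 / 1e6:.1f} t CH4/day")
```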

  18. Design and Optimisation Strategies of Nonlinear Dynamics for Diffraction Limited Synchrotron Light Source

    CERN Document Server

    Bartolini, R.

    2016-01-01

    This paper introduces the most recent achievements in the control of nonlinear dynamics in electron synchrotron light sources, with special attention to diffraction limited storage rings. Guidelines for the design and optimization of the magnetic lattice are reviewed and discussed.

  19. Are gas exchange responses to resource limitation and defoliation linked to source:sink relationships?

    Science.gov (United States)

    Pinkard, E A; Eyles, A; O'Grady, A P

    2011-10-01

    Productivity of trees can be affected by limitations in resources such as water and nutrients, and herbivory. However, there is little understanding of their interactive effects on carbon uptake and growth. We hypothesized that: (1) in the absence of defoliation, photosynthetic rate and leaf respiration would be governed by limiting resource(s) and their impact on sink limitation; (2) photosynthetic responses to defoliation would be a consequence of changing source:sink relationships and increased availability of limiting resources; and (3) photosynthesis and leaf respiration would be adjusted in response to limiting resources and defoliation so that growth could be maintained. We tested these hypotheses by examining how leaf photosynthetic processes, respiration, carbohydrate concentrations and growth rates of Eucalyptus globulus were influenced by high or low water and nitrogen (N) availability, and/or defoliation. Photosynthesis of saplings grown with low water was primarily sink limited, whereas photosynthetic responses of saplings grown with low N were suggestive of source limitation. Defoliation resulted in source limitation. Net photosynthetic responses to defoliation were linked to the degree of resource availability, with the largest responses measured in treatments where saplings were ultimately source rather than sink limited. There was good evidence of acclimation to stress, enabling higher rates of C uptake than might otherwise have occurred. © 2011 Blackwell Publishing Ltd.

  20. Using a Bayesian Probabilistic Forecasting Model to Analyze the Uncertainty in Real-Time Dynamic Control of the Flood Limiting Water Level for Reservoir Operation

    DEFF Research Database (Denmark)

    Liu, Dedi; Li, Xiang; Guo, Shenglian

    2015-01-01

    Dynamic control of the flood limiting water level (FLWL) is a valuable and effective way to maximize the benefits from reservoir operation without exceeding the design risk. In order to analyze the impacts of input uncertainty, a Bayesian forecasting system (BFS) is adopted. Applying quantile water...... inflow values and their uncertainties obtained from the BFS, the reservoir operation results from different schemes can be analyzed in terms of benefits, dam safety, and downstream impacts during the flood season. When the reservoir FLWL dynamic control operation is implemented, there are two fundamental......, also deterministic water inflow was tested. The proposed model in the paper emphasizes the importance of analyzing the uncertainties of the water inflow forecasting system for real-time dynamic control of the FLWL for reservoir operation. For the case study, the selected quantile inflow from...

  1. Analysis of Paralleling Limited Capacity Voltage Sources by Projective Geometry Method

    Directory of Open Access Journals (Sweden)

    Alexandr Penin

    2014-01-01

    Full Text Available The droop current-sharing method for voltage sources of limited capacity is considered. The influence of equalizing resistors and the load resistor on the uniform distribution of the relative values of the currents is investigated for the case when the actual loading corresponds to the capacity of a particular source. Novel concepts for the quantitative representation of the operating regimes of the sources are introduced using the projective geometry method.

  2. Uncertainties in source term estimates for a station blackout accident in a BWR with Mark I containment

    International Nuclear Information System (INIS)

    Lee, M.; Cazzoli, E.; Liu, Y.; Davis, R.; Nourbakhsh, H.; Schmidt, E.; Unwin, S.; Khatib-Rahbar, M.

    1988-01-01

    In this paper, attention is limited to a single accident progression sequence, namely a station blackout accident in a BWR with a Mark I containment building. Identified as an important accident in the draft version of NUREG-1150, a station blackout involves the loss of both off-site power and dc power, resulting in failure of the diesels to start and in the unavailability of the high pressure injection and core isolation cooling systems. This paper illustrates the calculated uncertainties (probability density functions) associated with the radiological releases into the environment for the nine fission product groups at 10 hours following the initiation of core-concrete interactions. Also shown are the results of the STCP base case simulation. 5 refs., 1 fig., 1 tab

  3. Confusion-limited extragalactic source survey at 4.755 GHz. I. Source list and areal distributions

    International Nuclear Information System (INIS)

    Ledden, J.E.; Broderick, J.J.; Condon, J.J.; Brown, R.L.

    1980-01-01

    A confusion-limited 4.755-GHz survey covering 0.00956 sr between right ascensions 07h05m and 18h near declination +35° has been made with the NRAO 91-m telescope. The survey found 237 sources and is complete above 15 mJy. Source counts between 15 and 100 mJy were obtained directly. The P(D) distribution was used to determine the number counts between 0.5 and 13.2 mJy, to search for anisotropy in the density of faint extragalactic sources, and to set a 99%-confidence upper limit of 1.83 mK to the rms temperature fluctuation of the 2.7-K cosmic microwave background on angular scales smaller than 7.3 arcmin. The discrete-source density, normalized to the static Euclidean slope, falls off sufficiently rapidly below 100 mJy that no new population of faint flat-spectrum sources is required to explain the 4.755-GHz source counts

  4. 40 CFR Table 1 to Subpart Xxxx of... - Emission Limits for Tire Production Affected Sources

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Emission Limits for Tire Production Affected Sources 1 Table 1 to Subpart XXXX of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION.... 63, Subpt. XXXX, Table 1 Table 1 to Subpart XXXX of Part 63—Emission Limits for Tire Production...

  5. 40 CFR Table 3 to Subpart Xxxx of... - Emission Limits for Puncture Sealant Application Affected Sources

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Emission Limits for Puncture Sealant Application Affected Sources 3 Table 3 to Subpart XXXX of Part 63 Protection of Environment ENVIRONMENTAL... Manufacturing Pt. 63, Subpt. XXXX, Table 3 Table 3 to Subpart XXXX of Part 63—Emission Limits for Puncture...

  6. 40 CFR Table 2 to Subpart Xxxx of... - Emission Limits for Tire Cord Production Affected Sources

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Emission Limits for Tire Cord Production Affected Sources 2 Table 2 to Subpart XXXX of Part 63 Protection of Environment ENVIRONMENTAL... Manufacturing Pt. 63, Subpt. XXXX, Table 2 Table 2 to Subpart XXXX of Part 63—Emission Limits for Tire Cord...

  7. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
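
    The single-loop idea can be illustrated with a toy limit state: epistemic uncertainty in a distribution parameter is sampled in the same Monte Carlo loop as the aleatory variability. This is a simplified stand-in for the paper's auxiliary-variable formulation, with all distributions assumed:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Epistemic: the mean load is only known from sparse data, so it gets a pdf.
mu_load = rng.normal(10.0, 0.5, n)       # assumed statistical uncertainty
# Aleatory: load scatters about its (uncertain) mean; capacity has a fixed pdf.
load = rng.normal(mu_load, 1.5)
capacity = rng.normal(15.0, 1.0, n)

g = capacity - load                      # limit state: failure when g < 0
print(f"failure probability: {np.mean(g < 0):.4f}")
```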

  8. Uncertainty quantification and experimental design based on unsupervised machine learning identification of contaminant sources and groundwater types using hydrogeochemical data

    Science.gov (United States)

    Vesselinov, V. V.

    2017-12-01

    Identification of the original groundwater types present in geochemical mixtures observed in an aquifer is a challenging but very important task. Frequently, some of the groundwater types are related to different infiltration and/or contamination sources associated with various geochemical signatures and origins. The characterization of groundwater mixing processes typically requires solving complex inverse models representing groundwater flow and geochemical transport in the aquifer, where the inverse analysis accounts for available site data. Usually, the model is calibrated against the available data characterizing the spatial and temporal distribution of the observed geochemical species. Numerous geochemical constituents and processes may need to be simulated in these models, which further complicates the analyses. As a result, these types of model analyses are typically extremely challenging. Here, we demonstrate a new contaminant source identification approach that performs decomposition of the observed mixtures based on the Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a custom semi-supervised clustering algorithm. Our methodology, called NMFk, is capable of identifying (a) the number of groundwater types and (b) the original geochemical concentrations of the contaminant sources from measured geochemical mixtures with unknown mixing ratios, without any additional site information. We also demonstrate how NMFk can be extended to perform uncertainty quantification and experimental design related to real-world site characterization. The NMFk algorithm works with geochemical data represented in the form of concentrations, ratios (of two constituents; for example, isotope ratios), and delta notations (standard normalized stable isotope ratios). The NMFk algorithm has been extensively tested on synthetic datasets; NMFk analyses have been actively performed on real-world data collected at the Los Alamos National
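
    A toy demonstration of the NMF-based unmixing step (without NMFk's clustering and model-selection machinery), on synthetic mixtures of two invented source waters:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(6)

# Two hidden source waters, five geochemical constituents (values invented).
sources = np.array([[5.0, 0.1, 2.0, 0.3, 1.0],
                    [0.2, 4.0, 0.5, 3.0, 0.1]])
ratios = rng.dirichlet(np.ones(2), size=30)        # unknown mixing ratios
X = ratios @ sources + 0.01 * rng.random((30, 5))  # observed mixtures (non-negative)

model = NMF(n_components=2, init="nndsvda", max_iter=2000, random_state=0)
W = model.fit_transform(X)    # estimated mixing ratios (up to scaling)
H = model.components_         # estimated source signatures
print("recovered source signatures:\n", np.round(H, 2))
```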

  9. Comparative risk-benefit-cost effectiveness in nuclear and alternate power sources: methodology, perspective, limitations

    International Nuclear Information System (INIS)

    Vinck, W.; Van Reijen, G.; Maurer, H.; Volta, G.

    1980-01-01

    A critical survey is given of the use of quantitative risk assessment in defining acceptable limits of safety, and of its use together with cost-benefit analyses for decision making. The paper indicates uncertainties and even unknowns in risk assessment, in particular when the whole fuel cycle for energy production is considered. It is made clear that decisions on the acceptance of risk must also take the risk perception factor into account. A difficult issue here is the potential for low-probability/large-consequence accidents. Examples are given, suggestions for improvement are made and perspectives are outlined

  10. Limited Impact of Setup and Range Uncertainties, Breathing Motion, and Interplay Effects in Robustly Optimized Intensity Modulated Proton Therapy for Stage III Non-small Cell Lung Cancer

    International Nuclear Information System (INIS)

    Inoue, Tatsuya; Widder, Joachim; Dijk, Lisanne V. van; Takegawa, Hideki; Koizumi, Masahiko; Takashina, Masaaki; Usui, Keisuke; Kurokawa, Chie; Sugimoto, Satoru; Saito, Anneyuko I.; Sasai, Keisuke; Veld, Aart A. van't; Langendijk, Johannes A.; Korevaar, Erik W.

    2016-01-01

    Purpose: To investigate the impact of setup and range uncertainties, breathing motion, and interplay effects using scanning pencil beams in robustly optimized intensity modulated proton therapy (IMPT) for stage III non-small cell lung cancer (NSCLC). Methods and Materials: Three-field IMPT plans were created using a minimax robust optimization technique for 10 NSCLC patients. The plans accounted for 5- or 7-mm setup errors with ±3% range uncertainties. The robustness of the IMPT nominal plans was evaluated considering (1) isotropic 5-mm setup errors with ±3% range uncertainties; (2) breathing motion; (3) interplay effects; and (4) a combination of items 1 and 2. The plans were calculated using 4-dimensional and average intensity projection computed tomography images. The target coverage (TC, volume receiving 95% of prescribed dose) and homogeneity index (D2 − D98, where D2 and D98 are the least doses received by 2% and 98% of the volume) for the internal clinical target volume, and dose indexes for lung, esophagus, heart and spinal cord, were compared with those of clinical volumetric modulated arc therapy plans. Results: The TC and homogeneity index for all plans were within clinical limits when considering the breathing motion and interplay effects independently. The setup and range uncertainties had a larger effect when considering their combined effect. The TC decreased to 98% for robust 7-mm evaluations for all patients. The organ at risk dose parameters did not significantly vary between the respective robust 5-mm and robust 7-mm evaluations for the 4 error types. Compared with the volumetric modulated arc therapy plans, the IMPT plans showed better target homogeneity and mean lung and heart dose parameters reduced by about 40% and 60%, respectively. Conclusions: In robustly optimized IMPT for stage III NSCLC, the setup and range uncertainties, breathing motion, and interplay effects have limited impact on target coverage, dose homogeneity, and

  11. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  12. Limits on the space density of gamma-ray burst sources

    International Nuclear Information System (INIS)

    Epstein, R.I.

    1985-01-01

    Gamma-ray burst spectra which extend to several MeV without significant steepening indicate that there is negligible degradation due to two-photon pair production. The inferred low rate of photon-photon reactions is used to give upper limits to the distances to the sources and to the intensity of the radiation from the sources. These limits are calculated under the assumptions that the bursters are neutron stars which emit uncollimated gamma rays. The principal results are that the space density of the gamma-ray burst sources exceeds ≈10^-6 pc^-3 if the entire surface of the neutron star radiates and exceeds ≈10^-3 pc^-3 if only a small cap or thin strip in the stellar surface radiates. In the former case the density of gamma-ray bursters is ≈1% of the inferred density of extinct pulsars, and in the latter case the mean mass density of burster sources is a few percent of the density of unidentified dark matter in the solar neighborhood. In both cases the X-ray intensity of the sources is far below the Rayleigh-Jeans limit, and the total flux is at most comparable to the Eddington limit. This implies that low-energy self-absorption near 10 keV is entirely negligible and that radiation-driven explosions are just barely possible

  13. Long-term chemical analysis and organic aerosol source apportionment at nine sites in central Europe: source identification and uncertainty assessment

    Science.gov (United States)

    Daellenbach, Kaspar R.; Stefenelli, Giulia; Bozzetti, Carlo; Vlachou, Athanasia; Fermo, Paola; Gonzalez, Raquel; Piazzalunga, Andrea; Colombi, Cristina; Canonaco, Francesco; Hueglin, Christoph; Kasper-Giebl, Anne; Jaffrezo, Jean-Luc; Bianchi, Federico; Slowik, Jay G.; Baltensperger, Urs; El-Haddad, Imad; Prévôt, André S. H.

    2017-11-01

    Long-term monitoring of organic aerosol is important for epidemiological studies, validation of atmospheric models, and air quality management. In this study, we apply a recently developed filter-based offline methodology using an aerosol mass spectrometer (AMS) to investigate the regional and seasonal differences of contributing organic aerosol sources. We present offline AMS measurements for particulate matter smaller than 10 µm at nine stations in central Europe with different exposure characteristics for the entire year of 2013 (819 samples). The focus of this study is a detailed source apportionment analysis (using positive matrix factorization, PMF) including in-depth assessment of the related uncertainties. Primary organic aerosol (POA) is separated in three components: hydrocarbon-like OA related to traffic emissions (HOA), cooking OA (COA), and biomass burning OA (BBOA). We observe enhanced production of secondary organic aerosol (SOA) in summer, following the increase in biogenic emissions with temperature (summer oxygenated OA, SOOA). In addition, a SOA component was extracted that correlated with an anthropogenic secondary inorganic species that is dominant in winter (winter oxygenated OA, WOOA). A factor (sulfur-containing organic, SC-OA) explaining sulfur-containing fragments (CH3SO2+), which has an event-driven temporal behaviour, was also identified. The relative yearly average factor contributions range from 4 to 14 % for HOA, from 3 to 11 % for COA, from 11 to 59 % for BBOA, from 5 to 23 % for SC-OA, from 14 to 27 % for WOOA, and from 15 to 38 % for SOOA. The uncertainty of the relative average factor contribution lies between 2 and 12 % of OA. At the sites north of the alpine crest, the sum of HOA, COA, and BBOA (POA) contributes less to OA (POA / OA = 0.3) than at the southern alpine valley sites (0.6). BBOA is the main contributor to POA with 87 % in alpine valleys and 42 % north of the alpine crest. Furthermore, the influence of primary

  14. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source

    International Nuclear Information System (INIS)

    Sandoval R, G.E.; Nikolaev, V.A.

    2003-01-01

    The main objective of this work is to establish how the characteristics of the fiber optic gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare-earth elements (superluminescent fiber source, SFS), and to justify the choice of the SFS pumping configuration that yields the limiting sensitivity characteristics of the FOG. When this type of emission source is used in a FOG, it is recommended to operate with the pumping signal propagating in the same direction as the superluminescent signal. The main results are the proposal and justification of the SFS as the emission source for a phase-type FOG. This choice improves the sensitivity characteristics of the FOG compared with the semiconductor luminescent sources that are widely used at present. An SFS-type emission source allows the instrument to approach the attainable sensitivity limit (detection limit), which is determined by shot noise. (Author)

  15. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source

    CERN Document Server

    Sandoval, G E

    2003-01-01

    The main objective of this work is to establish how the characteristics of the fiber optic gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare-earth elements (superluminescent fiber source, SFS), and to justify the choice of the SFS pumping configuration that yields the limiting sensitivity characteristics of the FOG. When this type of emission source is used in a FOG, it is recommended to operate with the pumping signal propagating in the same direction as the superluminescent signal. The main results are the proposal and justification of the SFS as the emission source for a phase-type FOG. This choice improves the sensitivity characteristics of the FOG compared with the semiconductor luminescent sources that are widely used at present. An SFS-type emission source allows the instrument to approach the attainable sensitivity limit (detection limit) which i...

  16. Far-Field Superresolution of Thermal Electromagnetic Sources at the Quantum Limit.

    Science.gov (United States)

    Nair, Ranjith; Tsang, Mankei

    2016-11-04

    We obtain the ultimate quantum limit for estimating the transverse separation of two thermal point sources using a given imaging system with limited spatial bandwidth. We show via the quantum Cramér-Rao bound that, contrary to the Rayleigh limit in conventional direct imaging, quantum mechanics does not mandate any loss of precision in estimating even deep sub-Rayleigh separations. We propose two coherent measurement techniques, easily implementable using current linear-optics technology, that approach the quantum limit over an arbitrarily large range of separations. Our bound is valid for arbitrary source strengths, all regions of the electromagnetic spectrum, and for any imaging system with an inversion-symmetric point-spread function. The measurement schemes can be applied to microscopy, optical sensing, and astrometry at all wavelengths.

  17. Structural Optimization of a High-Speed Press Considering Multi-Source Uncertainties Based on a New Heterogeneous TOPSIS

    Directory of Open Access Journals (Sweden)

    Jin Cheng

    2018-01-01

    Full Text Available In order to achieve high punching precision, good operational reliability, and low manufacturing cost, the structural optimization of a high-speed press in the presence of a set of available alternatives constitutes a heterogeneous multiple-attribute decision-making (HMADM) problem involving deviation, fixation, cost, and benefit attributes that can be described in various mathematical forms due to the existence of multi-source uncertainties. Such a HMADM problem cannot be easily resolved by existing methods. To overcome this difficulty, a new heterogeneous technique for order preference by similarity to an ideal solution (HTOPSIS) is proposed. A new approach to the normalization of heterogeneous attributes is developed by integrating the possibility degree method, the relative preference relation, and the attribute transformation technique. Expressions for determining the positive and negative ideal solutions corresponding to heterogeneous attributes are also developed. Finally, alternative structural configurations are ranked according to their relative closeness coefficients, and the optimal structural configuration can be determined. The validity and effectiveness of the proposed HTOPSIS are demonstrated by a numerical example. The proposed HTOPSIS can also be applied to the structural optimization of other complex equipment, because its application has no prerequisite of independence among the various attributes.
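
    For the final ranking step, the classical crisp TOPSIS computation of relative closeness coefficients can be sketched as below; the decision matrix, weights, and attribute types are illustrative assumptions, and the paper's heterogeneous normalization (possibility degrees, relative preference relations) is not reproduced:

        import numpy as np

        # Hypothetical crisp decision matrix: 4 alternative structural
        # configurations x 3 attributes; 'benefit' attributes are better
        # when larger, 'cost' attributes when smaller.
        X = np.array([[250.0, 0.12, 7.5],
                      [270.0, 0.10, 8.1],
                      [240.0, 0.15, 6.9],
                      [260.0, 0.11, 7.8]])
        w = np.array([0.5, 0.3, 0.2])            # attribute weights
        benefit = np.array([True, False, True])  # attribute types

        # Vector normalization and weighting.
        V = w * X / np.linalg.norm(X, axis=0)

        # Positive/negative ideal solutions depend on the attribute type.
        pis = np.where(benefit, V.max(axis=0), V.min(axis=0))
        nis = np.where(benefit, V.min(axis=0), V.max(axis=0))

        # Relative closeness coefficient: higher is better.
        d_pos = np.linalg.norm(V - pis, axis=1)
        d_neg = np.linalg.norm(V - nis, axis=1)
        closeness = d_neg / (d_pos + d_neg)
        print(np.argsort(-closeness))  # ranking of the alternatives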

  18. Calibrate the aerial surveying instrument by the limited surface source and the single point source that replace the unlimited surface source

    CERN Document Server

    Lu Cun Heng

    1999-01-01

    The calculating formula and surveying results are derived on the basis of the superposition principle of gamma rays and the geometry of a hexagonal surface source when a limited surface source replaces the unlimited surface source to calibrate the aerial survey instrument on the ground, and on the basis of the reciprocity principle of gamma rays when a single point source replaces the unlimited surface source to calibrate the aerial surveying instrument in the air. Meanwhile, through theoretical analysis, the receiving rate of the crystal bottom and side surfaces is calculated for the gamma rays received by the aerial surveying instrument. The mathematical expression of gamma-ray attenuation with height, following the Jinge function regularity, is obtained. According to this regularity, the absorption coefficient of air for gamma rays and the detection efficiency coefficient of the crystal are calculated based on the ground and air measurements of the bottom-surface receiving cou...

  19. Limitations of a convolution method for modeling geometric uncertainties in radiation therapy. I. The effect of shift invariance

    International Nuclear Information System (INIS)

    Craig, Tim; Battista, Jerry; Van Dyk, Jake

    2003-01-01

    Convolution methods have been used to model the effect of geometric uncertainties on dose delivery in radiation therapy. Convolution assumes shift invariance of the dose distribution. Internal inhomogeneities and surface curvature lead to violations of this assumption. The magnitude of the error resulting from violation of shift invariance is not well documented. This issue is addressed by comparing dose distributions calculated using the Convolution method with dose distributions obtained by Direct Simulation. A comparison of conventional Static dose distributions was also made with Direct Simulation. This analysis was performed for phantom geometries and several clinical tumor sites. A modification to the Convolution method to correct for some of the inherent errors is proposed and tested using example phantoms and patients. We refer to this modified method as the Corrected Convolution. The average maximum dose error in the calculated volume (averaged over different beam arrangements in the various phantom examples) was 21% with the Static dose calculation, 9% with Convolution, and reduced to 5% with the Corrected Convolution. The average maximum dose error in the calculated volume (averaged over four clinical examples) was 9% for the Static method, 13% for Convolution, and 3% for Corrected Convolution. While Convolution can provide a superior estimate of the dose delivered when geometric uncertainties are present, the violation of shift invariance can result in substantial errors near the surface of the patient. The proposed Corrected Convolution modification reduces errors near the surface to 3% or less.
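
    The core of the convolution approach, blurring the static dose distribution with the probability density of the random geometric offsets, is only exact under shift invariance. A sketch on an illustrative 2-D dose grid with a Gaussian offset distribution (not the authors' treatment-planning implementation) is:

        import numpy as np
        from scipy.ndimage import gaussian_filter

        # Hypothetical static dose distribution on a 2 mm grid (Gy).
        dose_static = np.zeros((100, 100))
        dose_static[40:60, 40:60] = 2.0   # idealized flat target dose

        # Random setup error: isotropic Gaussian, sigma = 3 mm = 1.5 voxels.
        # Convolving with the offset PDF assumes the dose cloud is shift
        # invariant (unchanged near inhomogeneities or curved surfaces).
        sigma_vox = 3.0 / 2.0
        dose_conv = gaussian_filter(dose_static, sigma=sigma_vox, mode="nearest")

        # Direct-simulation analogue: evaluate the dose for sampled offsets
        # and average. Here the dose is simply shifted; in a real patient it
        # would be recomputed per offset, which is where convolution errs.
        rng = np.random.default_rng(1)
        acc = np.zeros_like(dose_static)
        n = 2000
        for dx, dy in rng.normal(0.0, sigma_vox, size=(n, 2)):
            acc += np.roll(np.roll(dose_static, round(dx), 0), round(dy), 1)
        dose_direct = acc / n

        print(np.abs(dose_conv - dose_direct).max())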

  20. 40 CFR 63.5985 - What are my alternatives for meeting the emission limits for tire production affected sources?

    Science.gov (United States)

    2010-07-01

    ... the emission limits for tire production affected sources? 63.5985 Section 63.5985 Protection of... Pollutants: Rubber Tire Manufacturing Emission Limits for Tire Production Affected Sources § 63.5985 What are my alternatives for meeting the emission limits for tire production affected sources? You must use...

  1. Itô-SDE MCMC method for Bayesian characterization of errors associated with data limitations in stochastic expansion methods for uncertainty quantification

    Science.gov (United States)

    Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.

    2017-11-01

    This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
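
    As a toy illustration of an SDE-based MCMC sampler, the following sketch integrates an overdamped Langevin SDE whose invariant distribution is a given posterior, using an explicit Euler-Maruyama step rather than the implicit Euler scheme of the paper; the one-dimensional Gaussian target is an assumption chosen for brevity:

        import numpy as np

        # Log-density gradient of a toy 1-D posterior (standard normal).
        def grad_log_post(x):
            return -x

        # Overdamped Langevin SDE: dX = grad log p(X) dt + sqrt(2) dW,
        # which is ergodic for p. Explicit Euler-Maruyama discretization
        # (the paper instead uses an implicit Euler scheme).
        rng = np.random.default_rng(0)
        dt, n_steps = 1e-2, 200_000
        x = 0.0
        samples = np.empty(n_steps)
        for i in range(n_steps):
            x += grad_log_post(x) * dt + np.sqrt(2.0 * dt) * rng.standard_normal()
            samples[i] = x

        # Ergodic averages approximate posterior expectations.
        print(samples.mean(), samples.var())  # close to 0 and 1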

  2. SU-D-210-03: Limited-View Multi-Source Quantitative Photoacoustic Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Feng, J; Gao, H [Shanghai Jiao Tong University, Shanghai, Shanghai (China)

    2015-06-15

    Purpose: This work investigates a novel limited-view multi-source acquisition scheme for the direct and simultaneous reconstruction of optical coefficients in quantitative photoacoustic tomography (QPAT), which offers potentially improved signal-to-noise ratio and reduced data acquisition time. Methods: Conventional QPAT is often considered in two steps: first, reconstruct the initial acoustic pressure from the full-view ultrasonic data after each optical illumination, and then quantitatively reconstruct the optical coefficients (e.g., absorption and scattering coefficients) from the initial acoustic pressure, using a multi-source or multi-wavelength scheme. With the novel limited-view multi-source scheme proposed here, the optical coefficients must be reconstructed directly from the ultrasonic data, since the initial acoustic pressure can no longer be reconstructed as an intermediate variable due to the incomplete acoustic data. In this work, based on a coupled photoacoustic forward model combining the diffusion approximation and the wave equation, we develop a limited-memory quasi-Newton method (LBFGS) for image reconstruction that utilizes the adjoint forward problem for fast computation of gradients. Furthermore, tensor framelet sparsity is utilized to improve the image reconstruction, which is solved by the Alternating Direction Method of Multipliers (ADMM). Results: The simulation was performed on a modified Shepp-Logan phantom to validate the feasibility of the proposed limited-view scheme and its corresponding image reconstruction algorithms. Conclusion: A limited-view multi-source QPAT scheme is proposed, i.e., partial-view acoustic data acquisition accompanying each optical illumination, followed by simultaneous rotations of both the optical sources and the ultrasonic detectors for the next optical illumination. Moreover, LBFGS and ADMM algorithms are developed for the direct reconstruction of optical coefficients from the

  3. Limitations of a convolution method for modeling geometric uncertainties in radiation therapy: the radiobiological dose-per-fraction effect

    International Nuclear Information System (INIS)

    Song, William; Battista, Jerry; Van Dyk, Jake

    2004-01-01

    The convolution method can be used to incorporate the effect of random geometric uncertainties into planned dose distributions used in radiation treatment planning. This is effectively done by linearly adding infinitesimally small doses, each with a particular geometric offset, over an assumed infinite number of fractions. However, this process inherently ignores the radiobiological dose-per-fraction effect since only the summed physical dose distribution is generated. The resultant potential error on predicted radiobiological outcome [quantified in this work with tumor control probability (TCP), equivalent uniform dose (EUD), normal tissue complication probability (NTCP), and generalized equivalent uniform dose (gEUD)] has yet to be thoroughly quantified. In this work, the results of a Monte Carlo simulation of geometric displacements are compared to those of the convolution method for random geometric uncertainties of 0, 1, 2, 3, 4, and 5 mm (standard deviation). CTV α/β ratios of 0.8, 1.5, 3, 5, and 10 Gy are used to represent the range of radiation responses for different tumors, whereas a single OAR α/β ratio of 3 Gy is used to represent all the organs at risk (OAR). The analysis is performed on a four-field prostate treatment plan of 18 MV x rays. The fraction numbers are varied from 1-50, with isoeffective adjustments of the corresponding dose-per-fractions to maintain a constant tumor control, using the linear-quadratic cell survival model. The average differences in TCP and EUD of the target, and in NTCP and gEUD of the OAR, calculated from the convolution and Monte Carlo methods reduced asymptotically as the total fraction number increased, with the differences reaching negligible levels beyond a treatment fraction number of ≥20. The convolution method generally overestimates the radiobiological indices, as compared to the Monte Carlo method, for the target volume, and underestimates those for the OAR. These effects are interconnected and attributed

  4. "Anomalous" air showers from point sources: Mass limits and light curves

    International Nuclear Information System (INIS)

    Domokos, G.; Elliott, B.; Kovesi-Domokos, S.

    1993-01-01

    We describe a method to obtain upper limits on the mass of the primaries of air showers associated with point sources. One also obtains the UHE pulse shape of a pulsar if its period is observed in the signal. As an example, we analyze the data obtained during a recent burst of Hercules X-1.

  5. Analysis of coupled model uncertainties in source-to-dose modeling of human exposures to ambient air pollution: A PM2.5 case study

    Science.gov (United States)

    Özkaynak, Halûk; Frey, H. Christopher; Burke, Janet; Pinder, Robert W.

    Quantitative assessment of human exposures and health effects due to air pollution involves detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into account linkages and feedbacks. The current state-of-practice for such assessments is to exercise emission, meteorology, air quality, exposure, and dose models separately, and to link them together by using the output of one model as input to the subsequent downstream model. Quantification of variability and uncertainty has been an important topic in the exposure assessment community for a number of years. Variability refers to differences in the value of a quantity (e.g., exposure) over time, space, or among individuals. Uncertainty refers to lack of knowledge regarding the true value of a quantity. An emerging challenge is how to quantify variability and uncertainty in integrated assessments over the source-to-dose continuum by considering contributions from individual as well as linked components. For a case study of fine particulate matter (PM2.5) in North Carolina during July 2002, we characterize variability and uncertainty associated with each of the individual concentration, exposure and dose models that are linked, and use a conceptual framework to quantify and evaluate the implications of coupled model uncertainties. We find that the resulting overall uncertainties due to combined effects of both variability and uncertainty are smaller (usually by a factor of 3-4) than the crudely multiplied model-specific overall uncertainty ratios. Future research will need to examine the impact of potential dependencies among the model components by conducting a truly coupled modeling analysis.
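
    One standard way to keep variability and uncertainty separate when propagating them through linked models is a nested (two-dimensional) Monte Carlo, sketched below with purely hypothetical distributions standing in for the air-quality, exposure, and dose components; this illustrates the general technique, not the study's conceptual framework:

        import numpy as np

        rng = np.random.default_rng(0)
        n_unc, n_var = 200, 1000  # outer loop: uncertainty; inner: variability

        results = np.empty((n_unc, n_var))
        for i in range(n_unc):
            # Outer loop: sample uncertain model parameters once per
            # realization (e.g., a multiplicative bias of modeled PM2.5).
            conc_bias = rng.lognormal(mean=0.0, sigma=0.2)
            dose_coeff = rng.normal(loc=0.7, scale=0.05)

            # Inner loop: sample inter-individual variability given those
            # parameters (e.g., time outdoors, indoor infiltration).
            conc = conc_bias * rng.lognormal(mean=2.5, sigma=0.4, size=n_var)
            f_out = rng.beta(2.0, 5.0, size=n_var)  # fraction of time outdoors
            exposure = conc * (f_out + 0.6 * (1.0 - f_out))
            results[i] = dose_coeff * exposure

        # Variability: percentiles across individuals; uncertainty: spread
        # of each percentile across the outer realizations.
        p95 = np.percentile(results, 95, axis=1)
        print(p95.mean(), np.percentile(p95, [2.5, 97.5]))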

  6. An export coefficient based inexact fuzzy bi-level multi-objective programming model for the management of agricultural nonpoint source pollution under uncertainty

    Science.gov (United States)

    Cai, Yanpeng; Rong, Qiangqiang; Yang, Zhifeng; Yue, Wencong; Tan, Qian

    2018-02-01

    In this research, an export coefficient based inexact fuzzy bi-level multi-objective programming (EC-IFBLMOP) model was developed by integrating an export coefficient model (ECM), interval parameter programming (IPP), and fuzzy parameter programming (FPP) within a bi-level multi-objective programming framework. The proposed EC-IFBLMOP model can effectively deal with multiple uncertainties expressed as discrete intervals and fuzzy membership functions. Also, the complexities in agricultural systems, such as the cooperation and gaming relationship between the decision makers at different levels, can be fully considered in the model. The developed model was then applied to identify the optimal land use patterns and BMP implementation levels for agricultural nonpoint source (NPS) pollution management in a subcatchment in the upper stream watershed of the Miyun Reservoir in north China. The results of the model showed that the desired optimal land use patterns and implementation levels of best management practices (BMPs) would be obtained; these represent the gaming result between the upper- and lower-level decision makers when the allowable discharge amounts of NPS pollutants are limited. Moreover, results corresponding to different decision scenarios could provide a set of decision alternatives for the upper- and lower-level decision makers to identify the most appropriate management strategy. The model has good applicability and can be effectively utilized for agricultural NPS pollution management.

  7. Spatial resolution limits for the localization of noise sources using direct sound mapping

    DEFF Research Database (Denmark)

    Comesana, D. Fernandez; Holland, K. R.; Fernandez Grande, Efren

    2016-01-01

    One of the main challenges arising from noise and vibration problems is how to identify the areas of a device, machine or structure that produce significant acoustic excitation, i.e. the localization of main noise sources. The direct visualization of sound, in particular sound intensity, has extensively been used for many years to locate sound sources. However, it is not yet well defined when two sources should be regarded as resolved by means of direct sound mapping. This paper derives the limits of the direct representation of sound pressure, particle velocity and sound intensity by exploring the relationship between spatial resolution, noise level and geometry. The proposed expressions are validated via simulations and experiments. It is shown that particle velocity mapping yields better results for identifying closely spaced sound sources than sound pressure or sound intensity, especially...

  8. Limits to source counts and cosmic microwave background fluctuations at 10.6 GHz

    International Nuclear Information System (INIS)

    Seielstad, G.A.; Masson, C.R.; Berge, G.L.

    1981-01-01

    We have determined the distribution of deflections due to sky temperature fluctuations at 10.6 GHz. If all the deflections are due to fine structure in the cosmic microwave background, we limit these fluctuations to ΔT/T ≲ 10⁻⁴ on an angular scale of 11 arcmin. If, on the other hand, all the deflections are due to confusion among discrete radio sources, the areal density of these sources is calculated for various slopes of the differential source count relationship and for various cutoff flux densities. If, for example, the slope is 2.1 and the cutoff is 10 mJy, we find (0.25–3.3)×10⁶ sources sr⁻¹ Jy⁻¹

  9. Synthesis of Directional Sources Using Wave Field Synthesis, Possibilities, and Limitations

    Directory of Open Access Journals (Sweden)

    Corteel E

    2007-01-01

    Full Text Available The synthesis of directional sources using wave field synthesis is described. The proposed formulation relies on an ensemble of elementary directivity functions based on a subset of spherical harmonics. These can be combined to create and manipulate the directivity characteristics of the synthesized virtual sources. The WFS formulation introduces artifacts in the synthesized sound field for both ideal and real loudspeakers. These artifacts can be partly compensated for using dedicated equalization techniques. A multichannel equalization technique is shown to provide accurate results, thus enabling the manipulation of directional sources with limited reconstruction artifacts. Applications of directional sources to the control of the direct sound field and the interaction with the listening room are discussed.

  10. Beamspace fast fully adaptive brain source localization for limited data sequences

    International Nuclear Information System (INIS)

    Ravan, Maryam

    2017-01-01

    In the electroencephalogram (EEG) or magnetoencephalogram (MEG) context, brain source localization methods that rely on estimating second order statistics often fail when the observations are taken over a short time interval, especially when the number of electrodes is large. To address this issue, in a previous study, we developed a multistage adaptive processing approach called fast fully adaptive (FFA) that can significantly reduce the required sample support while still processing all available degrees of freedom (DOFs). This approach processes the observed data in stages through a decimation procedure. In this study, we introduce a new form of the FFA approach called beamspace FFA. We first divide the brain into smaller regions and transform the measured data from the source space to the beamspace in each region. The FFA approach is then applied to the beamspaced data of each region. The goal of this modification is to benefit from the reduced correlation sensitivity between sources in different brain regions. To demonstrate the performance of the beamspace FFA approach in the limited data scenario, simulation results with multiple deep and cortical sources as well as experimental results are compared with the regular FFA and the widely used FINE approaches. Both simulation and experimental results demonstrate that the beamspace FFA method can localize different types of multiple correlated brain sources more accurately at low signal-to-noise ratios with limited data. (paper)

  11. Limited Impact of Setup and Range Uncertainties, Breathing Motion, and Interplay Effects in Robustly Optimized Intensity Modulated Proton Therapy for Stage III Non-small Cell Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, Tatsuya [Department of Radiology, Juntendo University Urayasu Hospital, Chiba (Japan); Widder, Joachim; Dijk, Lisanne V. van [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Takegawa, Hideki [Department of Radiation Oncology, Kansai Medical University Hirakata Hospital, Osaka (Japan); Koizumi, Masahiko; Takashina, Masaaki [Department of Medical Physics and Engineering, Osaka University Graduate School of Medicine, Osaka (Japan); Usui, Keisuke; Kurokawa, Chie; Sugimoto, Satoru [Department of Radiation Oncology, Juntendo University Graduate School of Medicine, Tokyo (Japan); Saito, Anneyuko I. [Department of Radiology, Juntendo University Urayasu Hospital, Chiba (Japan); Department of Radiation Oncology, Juntendo University Graduate School of Medicine, Tokyo (Japan); Sasai, Keisuke [Department of Radiation Oncology, Juntendo University Graduate School of Medicine, Tokyo (Japan); Veld, Aart A. van 't; Langendijk, Johannes A. [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands); Korevaar, Erik W., E-mail: e.w.korevaar@umcg.nl [Department of Radiation Oncology, University of Groningen, University Medical Center Groningen, Groningen (Netherlands)

    2016-11-01

    Purpose: To investigate the impact of setup and range uncertainties, breathing motion, and interplay effects using scanning pencil beams in robustly optimized intensity modulated proton therapy (IMPT) for stage III non-small cell lung cancer (NSCLC). Methods and Materials: Three-field IMPT plans were created using a minimax robust optimization technique for 10 NSCLC patients. The plans accounted for 5- or 7-mm setup errors with ±3% range uncertainties. The robustness of the IMPT nominal plans was evaluated considering (1) isotropic 5-mm setup errors with ±3% range uncertainties; (2) breathing motion; (3) interplay effects; and (4) a combination of items 1 and 2. The plans were calculated using 4-dimensional and average intensity projection computed tomography images. The target coverage (TC, volume receiving 95% of prescribed dose) and homogeneity index (D2 − D98, where D2 and D98 are the least doses received by 2% and 98% of the volume) for the internal clinical target volume, and dose indexes for lung, esophagus, heart, and spinal cord were compared with those of clinical volumetric modulated arc therapy plans. Results: The TC and homogeneity index for all plans were within clinical limits when considering the breathing motion and interplay effects independently. The setup and range uncertainties had a larger effect when considering their combined effect. The TC decreased to <98% (clinical threshold) in 3 of 10 patients for robust 5-mm evaluations. However, the TC remained >98% for robust 7-mm evaluations for all patients. The organ at risk dose parameters did not significantly vary between the respective robust 5-mm and robust 7-mm evaluations for the 4 error types. Compared with the volumetric modulated arc therapy plans, the IMPT plans showed better target homogeneity and mean lung and heart dose parameters reduced by about 40% and 60%, respectively. Conclusions: In robustly optimized IMPT for stage III NSCLC, the setup and range

  12. Mass Transfer Limited Enhanced Bioremediation at Dnapl Source Zones: a Numerical Study

    Science.gov (United States)

    Kokkinaki, A.; Sleep, B. E.

    2011-12-01

    The success of enhanced bioremediation of dense non-aqueous phase liquids (DNAPLs) relies on accelerating contaminant mass transfer from the organic to the aqueous phase, thus enhancing the depletion of DNAPL source zones compared to natural dissolution. This is achieved by promoting biological activity that reduces the contaminant's aqueous phase concentration. Although laboratory studies have demonstrated that high reaction rates are attainable by specialized microbial cultures in DNAPL source zones, field applications of the technology report lower reaction rates and prolonged remediation times. One possible explanation for this phenomenon is that the reaction rates are limited by the rate at which the contaminant partitions from the DNAPL to the aqueous phase. In such cases, slow mass transfer to the aqueous phase reduces the bioavailability of the contaminant and consequently decreases the potential source zone depletion enhancement. In this work, the effect of rate limited mass transfer on bio-enhanced dissolution of DNAPL chlorinated ethenes is investigated through a numerical study. A multi-phase, multi-component groundwater transport model is employed to simulate DNAPL mass depletion for a range of source zone scenarios. Rate limited mass transfer is modeled by a linear driving force model, employing a thermodynamic approach for the calculation of the DNAPL - water interfacial area. Metabolic reductive dechlorination is modeled by Monod kinetics, considering microbial growth and self-inhibition. The model was utilized to identify conditions in which mass transfer, rather than reaction, is the limiting process, as indicated by the bioavailability number. In such cases, reaction is slower than expected, and further increase in the reaction rate does not enhance mass depletion. Mass transfer rate limitations were shown to affect both dechlorination and microbial growth kinetics. The complex dynamics between mass transfer, DNAPL transport and distribution, and
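
    The competition between dissolution and reaction that defines the bioavailability number can be illustrated with a lumped two-state model: a linear-driving-force term transfers mass from the DNAPL to the water (assuming a unit water volume), and Monod kinetics consumes it. All parameter values below are invented for illustration and are not taken from the study:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters (not from the study).
        k_la   = 0.05    # lumped mass-transfer rate coefficient (1/d)
        c_sat  = 1100.0  # aqueous solubility of the DNAPL compound (mg/L)
        mu_max = 0.4     # maximum substrate utilization rate (mg/L/d)
        k_s    = 5.0     # half-saturation constant (mg/L)

        def rhs(t, y):
            m_dnapl, c = y  # DNAPL mass proxy (mg), aqueous conc. (mg/L)
            # Linear driving force: transfer proceeds while DNAPL remains
            # and scales with the departure from solubility.
            transfer = k_la * (m_dnapl > 0.0) * (c_sat - c)
            reaction = mu_max * c / (k_s + c)   # Monod utilization
            return [-transfer, transfer - reaction]

        sol = solve_ivp(rhs, (0.0, 400.0), [5000.0, 0.0], max_step=1.0)
        m_end, c_end = sol.y[:, -1]
        # Bioavailability-number flavor: ratio of the maximum reaction rate
        # to the maximum mass-transfer rate; >> 1 means transfer-limited.
        print(mu_max / (k_la * c_sat), m_end, c_end)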

  13. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Several studies have shown that a number of factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing sophisticated models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models in order to solve these fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. More serious problems stem from the limitations of the analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested by qualitative nuclear proliferation studies

  14. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.

  15. Detection limits of pollutants in water for PGNAA using Am-Be source

    International Nuclear Information System (INIS)

    Khelifi, R.; Amokrane, A.; Bode, P.

    2007-01-01

    A basic PGNAA facility with an Am-Be neutron source is described for analyzing pollutants in water. The properties of the neutron flux were determined by MCNP calculations. In order to determine the efficiency curve of an HPGe detector, the prompt-gamma rays from chlorine were used and an exponential curve was fitted. The detection limits for a typical water sample are also estimated using the statistical fluctuations of the background level in selected regions of the recorded prompt-gamma spectrum
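
    Detection limits derived from background fluctuations are commonly computed with Currie's expressions; the sketch below assumes that standard approach (the abstract does not name it) and uses invented count values and sensitivity:

        import numpy as np

        def currie_limits(background_counts):
            """Currie critical level L_C and detection limit L_D (counts),
            for paired background subtraction at 95% confidence."""
            b = np.asarray(background_counts, dtype=float)
            l_c = 2.33 * np.sqrt(b)          # decision threshold
            l_d = 2.71 + 4.65 * np.sqrt(b)   # a priori detection limit
            return l_c, l_d

        # Hypothetical background counts under a prompt-gamma peak region.
        b = 1500.0
        l_c, l_d = currie_limits(b)

        # Convert to a concentration via an assumed sensitivity
        # (counts per mg/L per measurement), purely illustrative.
        sensitivity = 12.0
        print(l_d / sensitivity, "mg/L (illustrative detection limit)")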

  16. Comparison of open source database systems(characteristics, limits of usage)

    OpenAIRE

    Husárik, Braňko

    2008-01-01

    The goal of this work is to compare selected open source database systems (Ingres, PostgreSQL, Firebird, MySQL). The first part of the work focuses on the history and present situation of the companies developing these products. The second part compares a chosen group of specific features and limits. A benchmark of selected operations forms its own part. The possibilities of using the mentioned database systems are summarized at the end of the work.

  17. Carbon source from the toroidal pumped limiter during long discharge operation in Tore Supra

    International Nuclear Information System (INIS)

    Dufour, E.; Brosset, C.; Lowry, C.; Bucalossi, J.; Chappuis, P.; Corre, Y.; Desgranges, C.; Guirlet, R.; Gunn, J.; Loarer, T.; Mitteau, R.; Monier-Garbet, P.; Pegourie, B.; Reichle, R.; Thomas, P.; Tsitrone, E.; Hogan, J.; Roubin, P.; Martin, C.; Arnas, C.

    2005-01-01

    A better understanding of deuterium retention mechanisms requires knowledge of the carbon sources in Tore Supra. The main source of carbon in the vacuum vessel during long discharges is the toroidal pumped limiter (TPL). This work is devoted to the experimental characterisation of the carbon source from the TPL surface during long discharges using a visible spectroscopy diagnostic. Moreover, we present an attempt to perform a carbon balance over a typical campaign and discuss it with regard to the deuterium in-vessel inventory deduced from particle balance and the deuterium content of the deposited layers. The study shows that only a third of the estimated deuterium trapped in the vessel is trapped in the carbon deposits. Thus, in the present state of our knowledge and characterisation of the permanent retention, one has to search for mechanisms other than co-deposition to explain the deuterium retention in Tore Supra. (A.C.)

  18. Operational limit of a planar DC magnetron cluster source due to target erosion

    International Nuclear Information System (INIS)

    Rai, A.; Mutzke, A.; Bandelow, G.; Schneider, R.; Ganeva, M.; Pipa, A.V.; Hippler, R.

    2013-01-01

    The binary collision-based two-dimensional SDTrimSP-2D model has been used to simulate the erosion process of a Cu target and its influence on the operational limit of a planar DC magnetron nanocluster source. The density of free metal atoms in the aggregation region influences the cluster formation and cluster intensity during the target lifetime. The density of the free metal atoms in the aggregation region can only be predicted by taking into account (i) the angular distribution of the sputtered flux from the primary target source and (ii) the relative downward shift of the primary source of sputtered atoms during the erosion process. It is shown that the flux of the sputtered atoms decreases smoothly with the target erosion

  19. Experimental investigation of thermal limits in parallel plate configuration for the Advanced Neutron Source Reactor

    International Nuclear Information System (INIS)

    Siman-Tov, M.; Felde, D.K.; Kaminaga, M.; Yoder, G.L.

    1993-01-01

    The Advanced Neutron Source Reactor (ANSR) is currently being designed to become the world's highest-flux, steady-state, thermal neutron source for scientific experiments. Highly subcooled, heavy-water coolant flows vertically upward at a very high velocity of 25 m/s through parallel aluminum fuel plates. The core has average and peak heat fluxes of 5.9 and 12 MW/m², respectively. In this configuration, both flow excursion (FE) and true critical heat flux (CHF) represent potential thermal limitations. The availability of experimental data for both FE and true CHF at the conditions applicable to the ANSR is very limited. A Thermal Hydraulic Test Loop (THTL) facility was designed and built to simulate a full-length coolant subchannel of the core, allowing experimental determination of both thermal limits under the expected ANSR T/H conditions. A series of FE tests with water flowing vertically upward was completed over a nominal heat flux range of 6 to 14 MW/m² and a corresponding velocity range of 8 to 21 m/s. Both the exit pressure (1.7 MPa) and inlet temperature (45 degrees C) were maintained constant for these tests, while the loop was operated in a 'stiff' (constant flow) mode. Limited experiments were also conducted at 12 MW/m² using a 'soft' mode (near constant pressure drop) for actual FE burnout tests and using a 'stiff' mode for true CHF tests, to compare with the original FE experiments

  20. 49 CFR Appendix B to Part 564 - Information To Be Submitted for Long Life Replaceable Light Sources of Limited Definition

    Science.gov (United States)

    2010-10-01

    ...—Information To Be Submitted for Long Life Replaceable Light Sources of Limited Definition I. Filament or... Source that Operates With a Ballast and Rated Life of the Light Source/Ballast Combination. A. Maximum power (in watts). B. Luminous Flux (in lumens). C. Rated laboratory life of the light source/ballast...

  1. Calibrate the aerial surveying instrument by the limited surface source and the single point source that replace the unlimited surface source

    International Nuclear Information System (INIS)

    Lu Cunheng

    1999-01-01

    The calculating formula and surveying results are derived on the basis of the superposition principle of gamma rays and the geometry of a hexagonal surface source when a limited surface source replaces the unlimited surface source to calibrate the aerial survey instrument on the ground, and on the basis of the reciprocity principle of gamma rays when a single point source replaces the unlimited surface source to calibrate the aerial surveying instrument in the air. Meanwhile, through theoretical analysis, the receiving rate of the crystal bottom and side surfaces is calculated for the gamma rays received by the aerial surveying instrument. The mathematical expression of gamma-ray attenuation with height, following the Jinge function regularity, is obtained. According to this regularity, the absorption coefficient of air for gamma rays and the detection efficiency coefficient of the crystal are calculated based on the ground and air measurements of the bottom-surface receiving count rate (derived from the total receiving count rate of the bottom and side surfaces). Finally, the measured values demonstrate that it is feasible to model, with this regularity, the variation of the total gamma-ray exposure rate received by the bottom and side surfaces over a given altitude range

  2. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in the understanding of the underlying phenomena. This report provides several methodologies for estimating an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty.
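
    A common building block of such predictions is first-order (root-sum-square) propagation of independent error sources through a measurement equation; the sketch below is generic, with made-up sensitivities and uncertainties, and is not the report's specific methodology:

        import numpy as np

        def combined_uncertainty(sensitivities, uncertainties):
            """First-order propagation for independent inputs:
            u_y = sqrt(sum_i (dy/dx_i * u_i)**2)."""
            s = np.asarray(sensitivities, dtype=float)
            u = np.asarray(uncertainties, dtype=float)
            return float(np.sqrt(np.sum((s * u) ** 2)))

        # Example: mass flow m = rho * A * v, evaluated at rho=1000 kg/m^3,
        # A=0.01 m^2, v=2 m/s; partial derivatives at that operating point.
        rho, A, v = 1000.0, 0.01, 2.0
        sens = [A * v, rho * v, rho * A]   # dm/drho, dm/dA, dm/dv
        u_in = [5.0, 1e-4, 0.05]           # standard uncertainties of inputs
        print(combined_uncertainty(sens, u_in), "kg/s")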

  3. An uncertainty analysis of the hydrogen source term for a station blackout accident in Sequoyah using MELCOR 1.8.5

    Energy Technology Data Exchange (ETDEWEB)

    Gauntt, Randall O.; Bixler, Nathan E.; Wagner, Kenneth Charles.

    2014-03-01

    A methodology for using the MELCOR code with the Latin Hypercube Sampling method was developed to estimate uncertainty in various predicted quantities such as hydrogen generation or release of fission products under severe accident conditions. In this case, the emphasis was on estimating the range of hydrogen sources in station blackout conditions in the Sequoyah Ice Condenser plant, taking into account uncertainties in the modeled physics known to affect hydrogen generation. The method uses user-specified likelihood distributions for uncertain model parameters, which may include uncertainties of a stochastic nature, to produce a collection of code calculations, or realizations, characterizing the range of possible outcomes. Forty MELCOR code realizations of Sequoyah were conducted that included 10 uncertain parameters, producing a range of in-vessel hydrogen quantities. The range of total hydrogen produced was approximately 583 kg ± 131 kg. Sensitivity analyses revealed expected trends with respect to the parameters of greatest importance; however, considerable scatter was observed in the results when plotted against any of the uncertain parameters, with no parameter manifesting a dominant effect on hydrogen generation. It is concluded that, with respect to the physics parameters investigated, in order to further reduce predicted hydrogen uncertainty, it would be necessary to reduce all physics parameter uncertainties similarly, bearing in mind that some parameters are inherently uncertain within a range. It is suspected that some residual uncertainty associated with modeling complex, coupled, and synergistic phenomena is an inherent aspect of complex systems and cannot be reduced to point value estimates. Probabilistic analyses such as the one demonstrated in this work are important to properly characterize the response of complex systems such as severe accident progression in nuclear power plants.
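
    The sampling step of such a methodology, drawing a modest number of stratified parameter vectors to drive the code realizations, can be sketched with SciPy's Latin hypercube generator; the parameter names and ranges below are placeholders, not the study's actual inputs:

        import numpy as np
        from scipy.stats import qmc

        # Latin hypercube design: 40 realizations over (here) 3 of the
        # 10 uncertain parameters; names and ranges are placeholders.
        sampler = qmc.LatinHypercube(d=3, seed=0)
        unit = sampler.random(n=40)  # stratified samples in [0, 1)^3

        lower = np.array([1000.0, 0.1, 0.5])
        upper = np.array([1500.0, 1.0, 2.0])
        X = qmc.scale(unit, lower, upper)

        # Each row defines one code realization; the severe accident code
        # is run once per row and the outputs (e.g., in-vessel hydrogen
        # mass) are collected for uncertainty and sensitivity statistics.
        for run_id, params in enumerate(X[:3]):
            print(run_id, params)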

  4. Uncertainty and risk in wildland fire management: A review

    Science.gov (United States)

    Matthew P. Thompson; Dave E. Calkin

    2011-01-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to...

  5. Source limitation of carbon gas emissions in high-elevation mountain streams and lakes

    Science.gov (United States)

    Crawford, John T.; Dornblaser, Mark M.; Stanley, Emily H.; Clow, David W.; Striegl, Robert G.

    2015-01-01

    Inland waters are an important component of the global carbon cycle through transport, storage, and direct emissions of CO2 and CH4 to the atmosphere. Despite predictions of high physical gas exchange rates due to turbulent flows and ubiquitous supersaturation of CO2 (and perhaps also CH4), patterns of gas emissions are essentially undocumented for high mountain ecosystems. Much like other headwater networks around the globe, we found that high-elevation streams in Rocky Mountain National Park, USA, were supersaturated with CO2 during the growing season and were net sources to the atmosphere. CO2 concentrations in lakes, on the other hand, tended to be less than atmospheric equilibrium during the open water season. CO2 and CH4 emissions from the aquatic conduit were relatively small compared to many parts of the globe. Irrespective of the physical template for high gas exchange (high k), we found evidence of CO2 source limitation to mountain streams during the growing season, which limits overall CO2 emissions. Our results suggest a reduced importance of aquatic ecosystems for carbon cycling in high-elevation landscapes having limited soil development and high CO2 consumption via mineral weathering.

  6. Approximate source conditions for nonlinear ill-posed problems—chances and limitations

    International Nuclear Information System (INIS)

    Hein, Torsten; Hofmann, Bernd

    2009-01-01

    In the recent past the authors, with collaborators, have published convergence rate results for regularized solutions of linear ill-posed operator equations by avoiding the usual assumption that the solutions satisfy prescribed source conditions. Instead, the degree of violation of such source conditions is expressed by distance functions d(R) depending on a radius R ≥ 0 which is an upper bound of the norm of source elements under consideration. If d(R) tends to zero as R → ∞, an appropriate balancing of the occurring regularization error terms yields convergence rate results. This approach was called the method of approximate source conditions, originally developed in a Hilbert space setting. The goal of this paper is to formulate chances and limitations of an application of this method to nonlinear ill-posed problems in reflexive Banach spaces and to complement the field of low-order convergence rate results in nonlinear regularization theory. In particular, we are going to establish convergence rates for a variant of Tikhonov regularization. To keep structural nonlinearity conditions simple, we update the concept of degree of nonlinearity in Hilbert spaces to a Bregman distance setting in Banach spaces

  7. Detection limits for real-time source water monitoring using indigenous freshwater microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Jr., Miguel [ORNL]; Greenbaum, Elias [ORNL]

    2009-01-01

    This research identified toxin detection limits using the variable fluorescence of naturally occurring microalgae in source drinking water for five chemical toxins with different molecular structures and modes of toxicity. The five chemicals investigated were atrazine, Diuron, paraquat, methyl parathion, and potassium cyanide. Absolute threshold sensitivities of the algae for detection of the toxins in unmodified source drinking water were measured. Differential kinetics between the rate of action of the toxins and natural changes in algal physiology, such as diurnal photoinhibition, are significant enough that effects of the toxin can be detected and distinguished from the natural variance. This is true even for physiologically impaired algae where diminished photosynthetic capacity may arise from uncontrollable external factors such as nutrient starvation. Photoinhibition induced by high levels of solar radiation is a predictable and reversible phenomenon that can be dealt with using a period of dark adaptation of 30 minutes or more.

  8. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs, or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns; they are estimates of some true value plus an uncertainty, and they are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties

  9. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies have decomposed the uncertainties of a geological model and analyzed them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into the posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process; the gradual integration of multi-source uncertainty is a kind of simulation of this uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior distribution is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
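
    The updating step of such a framework can be illustrated with the simplest conjugate case: a maximum-entropy (here Gaussian) prior on an uncertain horizon depth, updated by noisy borehole observations. The numbers are invented, and the real framework operates on full 3-D structural models rather than a single scalar:

        import numpy as np

        # Prior on a horizon depth (m): the Gaussian is the maximum-entropy
        # distribution under known mean and variance constraints.
        mu0, sigma0 = 120.0, 15.0

        # Noisy borehole picks of the same horizon (data errors ~ N(0, 4 m)).
        obs = np.array([131.0, 127.5, 129.8])
        sigma_e = 4.0

        # Conjugate normal-normal update -> posterior for the depth,
        # combining prior knowledge, data errors, and spatial randomness
        # into a single synthetical uncertainty.
        n = obs.size
        prec = 1.0 / sigma0**2 + n / sigma_e**2
        mu_post = (mu0 / sigma0**2 + obs.sum() / sigma_e**2) / prec
        sigma_post = np.sqrt(1.0 / prec)
        print(mu_post, sigma_post)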

  10. Preliminary limits on the flux of muon neutrinos from extraterrestrial point sources

    International Nuclear Information System (INIS)

    Bionta, R.M.; Blewitt, G.; Bratton, C.B.

    1985-01-01

    We present the arrival directions of 117 upward-going muon events collected with the IMB proton lifetime detector during 317 days of live detector operation. The rate of upward-going muons observed in our detector was found to be consistent with the rate expected from atmospheric neutrino production. The upper limit on the total flux of extraterrestrial neutrinos >1 GeV is 2 -sec. Using our data and a Monte Carlo simulation of high energy muon production in the earth surrounding the detector, we place limits on the flux of neutrinos from a point source in the Vela X-2 system of 2 -sec with E > 1 GeV. 6 refs., 5 figs

  11. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Brown, C.S., E-mail: csbrown3@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Raleigh, NC 27695-7909 (United States); Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3870 (United States); Kucukboyaci, V., E-mail: kucukbvn@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States); Sung, Y., E-mail: sungy@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)

    2016-12-01

    Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core
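
    The choice of 59 runs follows from Wilks' formula for a first-order one-sided nonparametric tolerance limit: the smallest n with 1 - 0.95^n ≥ 0.95, so that the extreme of n runs bounds the 95th percentile with 95% confidence. A quick check:

        import math

        def wilks_n(beta=0.95, gamma=0.95):
            """Smallest sample size n such that the extreme of n runs bounds
            the beta-quantile with confidence gamma: 1 - beta**n >= gamma
            (first-order, one-sided)."""
            return math.ceil(math.log(1.0 - gamma) / math.log(beta))

        print(wilks_n())           # -> 59
        print(1.0 - 0.95 ** 59)    # ~0.9514, i.e. >= 0.95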

  12. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, H.; Kucukboyaci, V.; Sung, Y.

    2016-01-01

    Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core

  13. Force Limited Vibration Testing: Computation C2 for Real Load and Probabilistic Source

    Science.gov (United States)

    Wijker, J. J.; de Boer, A.; Ellenbroek, M. H. M.

    2014-06-01

    To prevent over-testing of the test item during random vibration testing, Scharton proposed and discussed force limited random vibration testing (FLVT) in a number of publications, in which the factor C2 is, besides the random vibration specification, the total mass and the turnover frequency of the load (test item), a very important parameter. A number of computational methods to estimate C2 are described in the literature, i.e. the simple and the complex two-degrees-of-freedom system, STDFS and CTDFS, respectively. Both the STDFS and the CTDFS describe in a very reduced (simplified) manner the load and the source (the adjacent structure transferring the excitation forces to the test item, e.g. a spacecraft supporting an instrument). The motivation of this work is to establish a method for the computation of a realistic value of C2 to perform a representative random vibration test based on force limitation, when the adjacent structure (source) description is more or less unknown. Marchand formulated a conservative estimation of C2 based on the maximum modal effective mass and damping of the test item (load), when no description of the supporting structure (source) is available [13]. Marchand discussed the formal description of obtaining C2, using the maximum PSD of the acceleration and the maximum PSD of the force, both at the interface between load and source, in combination with the apparent mass and total mass of the load. This method is very convenient for computing the factor C2. However, finite element models are needed to compute the spectra of the PSD of both the acceleration and the force at the interface between load and source. Stevens presented the coupled systems modal approach (CSMA), where simplified asparagus patch models (parallel-oscillator representations) of load and source are connected, consisting of modal effective masses and the spring stiffnesses associated with the natural frequencies. When the random acceleration vibration specification is given, the CSMA
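
    The max-PSD definition attributed to Marchand above reduces to dividing the peak interface-force PSD by the squared total mass times the peak interface-acceleration PSD. A toy sketch of that arithmetic; the spectra and the one-mode apparent mass below are invented placeholders, not outputs of a real finite element model:

```python
import numpy as np

f = np.linspace(20.0, 2000.0, 2000)                   # frequency axis, Hz
M0 = 50.0                                             # total load mass, kg
# Placeholder interface acceleration PSD standing in for an FE result:
S_aa = 0.01 + 0.04 * np.exp(-((f - 120.0) / 40.0)**2)
# Crude one-mode apparent mass; low Q stands in for load-source coupling
# that limits the resonant peak in practice.
Q = 2.0
H = 1.0 + 1.0 / (1.0 - (f / 120.0)**2 + 1j * f / (120.0 * Q))
S_ff = (M0 * np.abs(H))**2 * S_aa                     # interface force PSD

C2 = S_ff.max() / (M0**2 * S_aa.max())                # max-PSD style C2
print(f"C2 ~ {C2:.1f}")
```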

  14. Potential of vehicle-to-grid ancillary services considering the uncertainties in plug-in electric vehicle availability and service/localization limitations in distribution grids

    International Nuclear Information System (INIS)

    Sarabi, Siyamak; Davigny, Arnaud; Courtecuisse, Vincent; Riffonneau, Yann; Robyns, Benoît

    2016-01-01

    Highlights: • The availability uncertainty of PEVs is modelled using a Gaussian mixture model. • The interdependency of stochastic variables is modelled using a copula function. • V2G bidding capacity is calculated using the Free Pattern Search optimization method. • The localization limitation is considered for V2G service potential assessment. • Competitive services for fleets of V2G-enabled PEVs are identified using fuzzy sets. - Abstract: The aim of the paper is to propose an approach for statistical assessment of the potential of plug-in electric vehicles (PEVs) for vehicle-to-grid (V2G) ancillary services, focusing on PEVs doing daily home-work commuting. In this approach, the possible ancillary services (A/S) for each PEV fleet are identified in terms of its available V2G power (AVP) and flexible intervals. The flexible interval is calculated using a powerful stochastic global optimization technique, the so-called "Free Pattern Search" (FPS). A probabilistic method is also proposed to quantify the impacts of PEV availability uncertainty on the AVP of each fleet using the Gaussian mixture model (GMM), and the interdependency of stochastic variables thanks to multivariate modeling with a copula function. Each fleet is analyzed based on its aggregated PEV numbers at different levels of the distribution grid, in order to satisfy the localization limitation of ancillary services. A case study using the proposed approach evaluates the real potential in Niort, a city in the west of France. In fact, by using the proposed approach an aggregator can analyze the V2G potential of PEVs under its contract.
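
    A compact sketch of the two modelling ingredients named above: a Gaussian mixture for a (here synthetic) bimodal arrival-time distribution, and a Gaussian copula coupling arrival with departure. All data and parameters are invented placeholders, not the Niort fleet data:

```python
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic commuter fleet: evening home-arrival and morning departure (h).
arrive = np.r_[rng.normal(18.0, 0.8, 700), rng.normal(20.0, 1.2, 300)]
depart = 7.5 + 0.3 * (arrive - 18.0) + rng.normal(0.0, 0.5, arrive.size)

# Availability uncertainty: bimodal arrival times fitted with a GMM.
gmm = GaussianMixture(n_components=2, random_state=0).fit(arrive.reshape(-1, 1))

# Interdependency: Gaussian copula estimated from rank pseudo-observations.
u = stats.norm.ppf(stats.rankdata(arrive) / (arrive.size + 1))
v = stats.norm.ppf(stats.rankdata(depart) / (depart.size + 1))
rho = np.corrcoef(u, v)[0, 1]

# Joint draw: correlated normals mapped through the empirical marginals.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=1000)
arrive_sim = np.quantile(arrive, stats.norm.cdf(z[:, 0]))
depart_sim = np.quantile(depart, stats.norm.cdf(z[:, 1]))
print(f"GMM means: {gmm.means_.ravel().round(2)}, copula rho = {rho:.2f}")
```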

  15. Solar, wind and waves: Natural limits to renewable sources of energy within the Earth system

    Energy Technology Data Exchange (ETDEWEB)

    Kleidon, Axel [Max-Planck-Institute for Biogeochemistry, Jena (Germany)

    2013-07-01

    Renewable sources of energy, such as solar, wind, wave, or hydropower, utilize energy that is continuously generated by natural processes within the Earth system from the planetary forcing. Here we estimate the limits of these natural energy conversions and the extent to which these can be used as renewable energy sources using the laws of thermodynamics. At most, wind power in the order of 1 000 TW (1 TW = 1E12 W) can be derived from the total flux of incoming solar radiation of 175 000 TW, which is consistent with estimates based on observations. Other generation rates that are derived from the kinetic energy of wind are in the order of 10-100 TW. In comparison, the human primary energy demand of about 17 TW constitutes a considerable fraction of these rates. We provide some further analysis on the limits of wind power using a combination of conceptual models, observational data, and numerical simulation models. We find that many current estimates of wind power substantially overestimate the potential of wind power because the effect of kinetic energy extraction on the air flow is neglected. We conclude that the only form of renewable energy that is available in substantial amounts and that is associated with minor climatic impacts is solar power.

  16. Time-limited effects of emotional arousal on item and source memory.

    Science.gov (United States)

    Wang, Bo; Sun, Bukuan

    2015-01-01

    Two experiments investigated the time-limited effects of emotional arousal on consolidation of item and source memory. In Experiment 1, participants memorized words (items) and the corresponding speakers (sources) and then took an immediate free recall test. They then watched a neutral, positive, or negative video 5, 35, or 50 min after learning, and 24 hours later took surprise memory tests. Experiment 2 was similar to Experiment 1 except that (a) a reality monitoring task was used; (b) elicitation delays of 5, 30, and 45 min were used; and (c) delayed memory tests were given 60 min after learning. First, both experiments showed that, regardless of elicitation delay, emotional arousal did not enhance item recall memory. Second, both experiments showed that negative arousal enhanced delayed item recognition memory only at the medium elicitation delay, not at the shorter or longer delays; positive arousal enhanced performance only in Experiment 1. Third, regardless of elicitation delay, emotional arousal had little effect on source memory. These findings have implications for theories of emotion and memory, suggesting that emotion effects are contingent upon the nature of the memory task and the elicitation delay.

  17. A multicenter study to quantify systematic variations and associated uncertainties in source positioning with commonly used HDR afterloaders and ring applicators for the treatment of cervical carcinomas

    Energy Technology Data Exchange (ETDEWEB)

    Awunor, O., E-mail: onuora.awunor@stees.nhs.uk [The Medical Physics Department, The James Cook University Hospital, Marton Road, Middlesbrough TS4 3BW, England (United Kingdom); Berger, D. [Department of Radiotherapy, General Hospital of Vienna, Vienna A-1090 (Austria); Kirisits, C. [Department of Radiotherapy, Comprehensive Cancer Center, Medical University of Vienna, Vienna A-1090 (Austria)

    2015-08-15

    Purpose: The reconstruction of radiation source position in the treatment planning system is a key part of the applicator reconstruction process in high dose rate (HDR) brachytherapy treatment of cervical carcinomas. The steep dose gradients, of as much as 12%/mm, associated with typical cervix treatments emphasize the importance of accurate and precise determination of source positions. However, a variety of methodologies with a range in associated measurement uncertainties, of up to ±2.5 mm, are currently employed by various centers to do this. In addition, a recent pilot study by Awunor et al. [“Direct reconstruction and associated uncertainties of 192Ir source dwell positions in ring applicators using gafchromic film in the treatment planning of HDR brachytherapy cervix patients,” Phys. Med. Biol. 58, 3207–3225 (2013)] reported source positional differences of up to 2.6 mm between ring sets of the same type and geometry. This suggests a need for a comprehensive study to assess and quantify systematic source position variations between commonly used ring applicators and HDR afterloaders across multiple centers. Methods: Eighty-six rings from 20 European brachytherapy centers were audited in the form of a postal audit, with each center collecting the data independently. The data were collected by setting up the rings using a bespoke jig and irradiating gafchromic films at predetermined dwell positions using four afterloader types, MicroSelectron, Flexitron, GammaMed, and MultiSource, from three manufacturers, Nucletron, Varian, and Eckert & Ziegler BEBIG. Five different ring types in six sizes (Ø25–Ø35 mm) and two angles (45° and 60°) were used. Coordinates of irradiated positions relative to the ring center were determined and collated, and source position differences quantified by ring type, size, and angle. Results: The mean expanded measurement uncertainty (k = 2) along the direction of source travel was ±1.4 mm. The standard deviation
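
    For readers unfamiliar with the "expanded measurement uncertainty (k = 2)" wording: component standard uncertainties are combined in quadrature and the result is multiplied by the coverage factor. The component names and values below are invented placeholders for illustration, not the audit's actual budget:

```python
import math

# Hypothetical budget for a film-derived dwell-position coordinate (mm).
components = {
    "film digitisation / pixel size":    0.20,
    "jig set-up and ring seating":       0.45,
    "film-to-applicator registration":   0.35,
    "afterloader dwell reproducibility": 0.30,
}
u_c = math.sqrt(sum(u * u for u in components.values()))  # combined std. unc.
U = 2.0 * u_c                                             # expanded, k = 2
print(f"u_c = {u_c:.2f} mm -> U(k=2) = +/-{U:.1f} mm")
```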

  18. Systematic uncertainties in direct reaction theories

    International Nuclear Information System (INIS)

    Lovell, A E; Nunes, F M

    2015-01-01

    Nuclear reactions are common probes to study nuclei and, in particular, nuclei at the limits of stability. The data from reaction measurements depend strongly on theory for a reliable interpretation. Even when using state-of-the-art reaction theories, there are a number of sources of systematic uncertainties. These uncertainties are often unquantified or estimated in a very crude manner. It is clear that for theory to be useful, a quantitative understanding of the uncertainties is critical. Here, we discuss major sources of uncertainties in a variety of reaction theories used to analyze (d,p) nuclear reactions in the energy range Ed = 10–20 MeV, and we provide a critical view on how these have been handled in the past and how estimates can be improved. (paper)

  19. How phosphorus limitation can control climate-active gas sources and sinks

    Science.gov (United States)

    Gypens, Nathalie; Borges, Alberto V.; Ghyoot, Caroline

    2017-06-01

    Since the 1950s, anthropogenic activities have increased nutrient river loads to European coastal areas. Subsequent implementation of nutrient reduction policies has led to a considerable reduction of phosphorus (P) loads from the mid-1980s, while nitrogen (N) loads were maintained, inducing P limitation of phytoplankton growth in many eutrophied coastal areas such as the Southern Bight of the North Sea (SBNS). When dissolved inorganic phosphorus (DIP) is limiting, most phytoplankton organisms are able to indirectly acquire P from dissolved organic P (DOP). We investigate the impact of DOP use on phytoplankton production and atmospheric fluxes of CO2 and dimethylsulfide (DMS) in the SBNS from 1951 to 2007 using an extended version of the R-MIRO-BIOGAS model. This model includes a description of the ability of phytoplankton organisms to use DOP as a source of P. Results show that primary production can increase by up to 30% due to DOP uptake under limiting DIP conditions. Consequently, simulated DMS emissions increase proportionally while CO2 emissions to the atmosphere decrease, relative to the reference simulation without DOP uptake.

  20. From the feasibility assessment to the licensing application: organisation of the data acquisition, how to deal with metrological limits, uncertainties and project milestones; how far must we go?

    International Nuclear Information System (INIS)

    Landais, P.; Labalette, T.

    2009-01-01

    The research work summarised in the Dossier 2005 Argile has provided detailed information on each of the repository components, as well as on the determination, analysis and assessment of the main phenomena occurring within the repository. Their detailed representation, associated with proposed repository architectures, allowed the data to be processed in order to assess the robustness of the repository and to see how it would meet safety requirements. Through various indicators, the analysis showed that the three main safety functions (preventing water circulation, limiting radionuclide release and immobilizing radionuclides in the repository, and delaying and attenuating radionuclide migration) were effectively fulfilled by the proposed system in both normal and much more penalizing situations. The PARS as well as the QSA already facilitated a systematic identification of uncertainties, allowing them to be covered either through cautious hypotheses, penalizing or conservative representations of some phenomena or components, sensitivity studies, or altered evolution scenarios. Subsequently, the safety analysis revealed some residual uncertainties and margins for potential progress which will provide useful orientations for future research developments. While the safety analysis conducted reveals that the repository appears to be robust in all the configurations envisaged with respect to its safety functions, both the CNE and safety authority evaluations focus on the necessity to provide more comprehensive and realistic modelling of the behavior of the repository (during both the exploitation and post-closure periods) and of the radionuclides. For example, it is requested not to consider the perturbed zone (EDZ) as a dead zone whose characteristics and properties are set to zero in terms of transport. Similarly, when ANDRA constructed its safety case, an envelope hypothesis for conducting calculations led to considering the repository as fully saturated as soon as its

  1. Quantification of uncertainty in photon source spot size inference during laser-driven radiography experiments at TRIDENT

    Energy Technology Data Exchange (ETDEWEB)

    Tobias, Benjamin John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Palaniyappan, Sasikumar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gautier, Donald Cort [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mendez, Jacob [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Burris-Mog, Trevor John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Huang, Chengkun K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Favalli, Andrea [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hunter, James F. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Espy, Michelle E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Schmidt, Derek William [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Nelson, Ronald Owen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sefkow, Adam [Univ. of Rochester, NY (United States); Shimada, Tsutomu [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Johnson, Randall Philip [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Fernandez, Juan Carlos [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-24

    Images of the R2DTO resolution target were obtained during laser-driven-radiography experiments performed at the TRIDENT laser facility, and analysis of these images using the Bayesian Inference Engine (BIE) determines a most probable full-width half maximum (FWHM) spot size of 78 μm. However, significant uncertainty prevails due to variation in the measured detector blur. Propagating this uncertainty in detector blur through the forward model results in an interval of probabilistic ambiguity spanning approximately 35-195 μm when the laser energy impinges on a thick (1 mm) tantalum target. In other phases of the experiment, laser energy is deposited on a thin (~100 nm) aluminum target placed 250 μm ahead of the tantalum converter. When the energetic electron beam is generated in this manner, upstream from the bremsstrahlung converter, the inferred spot size shifts to a range of much larger values, approximately 270-600 μm FWHM. This report discusses methods applied to obtain these intervals as well as concepts necessary for interpreting the result within a context of probabilistic quantitative inference.

  2. SWEPT-SOURCE OPTICAL COHERENCE TOMOGRAPHY ANGIOGRAPHY REVEALS INTERNAL LIMITING MEMBRANE PEELING ALTERS DEEP RETINAL VASCULATURE.

    Science.gov (United States)

    Michalewska, Zofia; Nawrocki, Jerzy

    2018-04-30

    To describe the morphology of retinal and choroidal vessels in swept-source optical coherence tomography angiography before and after vitrectomy with the temporal inverted internal limiting membrane (ILM) flap technique for full-thickness macular holes. Prospective, observational study of 36 eyes of 33 patients with full-thickness macular holes. Swept-source optical coherence tomography angiography was performed before and 1 month after vitrectomy. Vitrectomy with the temporal inverted ILM flap technique was performed; in this method, the ILM is peeled on only one side of the fovea and an ILM flap is created to cover the macular hole. Retinal vasculature in the areas with and without ILM peeling was compared at 1 and 3 months after successful vitrectomy. The study demonstrated lower vessel density in the deep retinal plexus in the area where the ILM was peeled, as compared with the rest of the fovea. Visual acuity and central retinal thickness 1 month after surgery correlate with the foveal avascular zone diameter in the deep retinal layers at the same time point (P = 0.001). This study confirmed that ILM peeling might alter blood flow in the deep retinal vessels below the peeled area in the early postoperative period. The area of the foveal avascular zone corresponds to functional results at the same time point.

  3. Direct reconstruction and associated uncertainties of 192Ir source dwell positions in ring applicators using gafchromic film in the treatment planning of HDR brachytherapy cervix patients

    Science.gov (United States)

    Awunor, O. A.; Dixon, B.; Walker, C.

    2013-05-01

    This paper details a practical method for the direct reconstruction of high dose rate 192Ir source dwell positions in ring applicators using gafchromic film in the treatment planning of brachytherapy cervix patients. It also details the uncertainties associated with such a process. Eight Nucletron interstitial ring applicators—Ø26 mm (×4), Ø30 mm (×3) and Ø34 mm (×1), and one 60 mm intrauterine tube were used in this study. RTQA2 and XRQA2 gafchromic films were irradiated at pre-programmed dwell positions with three successive 192Ir sources and used to derive the coordinates of the source dwell positions. The source was observed to deviate significantly from its expected position by up to 6.1 mm in all ring sizes. Significant inter applicator differences of up to 2.6 mm were observed between a subset of ring applicators. Also, the measured data were observed to differ significantly from commercially available source path models provided by Nucletron with differences of up to 3.7 mm across all ring applicator sizes. The total expanded uncertainty (k = 2) averaged over all measured dwell positions in the rings was observed to be 1.1 ± 0.1 mm (Ø26 mm and Ø30 mm rings) and 1.0 ± 0.3 mm (Ø34 mm ring) respectively, and when transferred to the treatment planning system, equated to maximum %dose changes of 1.9%, 13.2% and 1.5% at regions representative of the parametrium, lateral fornix and organs at risk respectively.

  4. The challenges of modelling phosphorus in a headwater catchment: Applying a 'limits of acceptability' uncertainty framework to a water quality model

    Science.gov (United States)

    Hollaway, M. J.; Beven, K. J.; Benskin, C. McW. H.; Collins, A. L.; Evans, R.; Falloon, P. D.; Forber, K. J.; Hiscock, K. M.; Kahana, R.; Macleod, C. J. A.; Ockenden, M. C.; Villamizar, M. L.; Wearing, C.; Withers, P. J. A.; Zhou, J. G.; Barber, N. J.; Haygarth, P. M.

    2018-03-01

    There is a need to model and predict the transfer of phosphorus (P) from land to water, but this is challenging because of the large number of complex physical and biogeochemical processes involved. This study presents, for the first time, a 'limits of acceptability' approach of the Generalized Likelihood Uncertainty Estimation (GLUE) framework applied to the Soil and Water Assessment Tool (SWAT), in an application to a water quality problem in the Newby Beck catchment (12.5 km2), Cumbria, United Kingdom (UK). Using high frequency outlet data (discharge and P), individual evaluation criteria (limits of acceptability) were assigned to observed discharge and P loads for all evaluation time steps, to identify where the model was performing well or poorly and to infer which processes required improvement in the model structure. The initial limits of acceptability had to be relaxed by a substantial amount (by factors of between 5.3 and 6.7 on a normalized scale, depending on the evaluation criteria used) in order to gain a set of behavioral simulations (1001 and 1016 out of 5,000,000, respectively). Of the 39 model parameters tested, the representation of subsurface processes, and the associated parameters, was consistently shown to be critical to the model not meeting the evaluation criteria, irrespective of the chosen evaluation metric. It is therefore concluded that SWAT is not an appropriate model to guide P management in this catchment. This approach highlights the importance of high frequency monitoring data for setting robust model evaluation criteria. It also raises the question as to whether it is possible to have sufficient input data available to drive such models so that we can have confidence in their predictions and their ability to inform catchment management strategies to tackle the problem of diffuse pollution from agriculture.
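
    The behavioural-selection mechanics of the limits-of-acceptability approach fit in a few lines: a run is kept only if it stays inside the (possibly relaxed) observation limits at every time step. The arrays below are synthetic stand-ins, not the Newby Beck data:

```python
import numpy as np

def behavioural(sim, lo, hi, relax=1.0):
    """Limits-of-acceptability test: keep a run only if every evaluation
    time step lies inside the (optionally relaxed) observation limits."""
    centre, half = 0.5 * (lo + hi), 0.5 * (hi - lo) * relax
    return bool(np.all((sim >= centre - half) & (sim <= centre + half)))

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 1.0, 500)                 # stand-in for observed P loads
lo, hi = 0.8 * obs, 1.2 * obs                  # per-time-step limits
sim = obs * rng.lognormal(0.0, 0.15, 500)      # a deliberately imperfect run

for relax in (1.0, 5.3, 6.7):                  # factors echoing the study
    print(f"limits x{relax}: behavioural = {behavioural(sim, lo, hi, relax)}")
```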

  5. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
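
    As a concrete illustration of the random-uncertainty propagation the guide describes, here is a Monte Carlo sketch; the toy model and input distributions are assumptions for illustration, not taken from the guide:

```python
import numpy as np

def model(k, q, t):
    """Stand-in for a calculated quantity, e.g. exponential depletion."""
    return q * np.exp(-k * t)

rng = np.random.default_rng(42)
n = 100_000
k = rng.normal(0.12, 0.01, n)     # uncertain rate constant, 1/h
q = rng.normal(250.0, 15.0, n)    # uncertain initial quantity
y = model(k, q, t=10.0)           # propagate input uncertainty by sampling

print(f"output: mean {y.mean():.1f}, random std {y.std():.1f}")
# A systematic (model-form) component, estimated separately, would then be
# combined with this random component to give the overall uncertainty.
```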

  6. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  7. Evaluation of uncertainty and detection limits in 210Pb and 210Po measurement in water by alpha spectrometry using 210Po spontaneous deposition onto a silver disk

    International Nuclear Information System (INIS)

    Fernández, Pedro L.; Gómez, José; Ródenas, Carmen

    2012-01-01

    An easy and accurate method for the determination of 210Pb and 210Po in water using 210Po spontaneous deposition onto a silver disk is proposed and assessed for its detection capabilities according to the ISO Guide to the expression of uncertainty in measurement (GUM) and ISO Standard 11929-7 concerning the evaluation of the characteristic limits for ionizing radiation measurements. The method makes no assumption on the initial values of the activity concentrations of 210Pb, 210Bi and 210Po in the sample to be analyzed, and is based on the alpha spectrometric measurement of 210Po in two different aliquots: the first measured five weeks after the sampling date to ensure radioactive equilibrium between 210Pb and 210Bi, and the second after a sufficient time for the ingrowth of 210Po from 210Pb to be significant. As shown, for a recommended time interval of seven months between 210Po measurements, the applicability of the proposed method is limited to water samples with a 226Ra to 210Pb activity ratio CRa/CPb ≤ 4, as is usual in natural waters. Using sample and background counting times of 24 h and 240 h, respectively, the detection limit of the activity concentration of each radionuclide at the sampling time for a 1 L sample typically varies between 0.7 and 16 mBq L−1 for 210Pb in water samples with an initial activity of 210Po in the range 0–200 mBq L−1, and between 0.6 and 8.5 mBq L−1 for 210Po in water samples with an initial activity of 210Pb in the same range. - Highlights: ► 210Pb and 210Po measurement in water by 210Po spontaneous deposition onto silver disks. ► 210Pb and 210Po determination based on 210Po measurement in two different aliquots. ► Evaluation of characteristic limits in radioactivity measurements using ISO 11929-7. ► 210Pb and 210Po detection limits decrease with the time elapsed between 210Po measurements.
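
    The seven-month wait between aliquots is what makes the 210Pb term observable: by then roughly two thirds of the 210Pb activity has grown in as fresh 210Po. A sketch of the underlying ingrowth relation (direct Pb-to-Po ingrowth; the 5-day 210Bi step is neglected, and the input activities are illustrative):

```python
import numpy as np

LAM_PB = np.log(2) / (22.3 * 365.25)   # 210Pb decay constant, 1/day
LAM_PO = np.log(2) / 138.4             # 210Po decay constant, 1/day

def po210(a_pb0, a_po0, t):
    """210Po activity after t days from initial 210Pb and 210Po activities
    (direct parent-daughter ingrowth; the short 210Bi step is neglected)."""
    grow = LAM_PO / (LAM_PO - LAM_PB) * (
        np.exp(-LAM_PB * t) - np.exp(-LAM_PO * t))
    return a_po0 * np.exp(-LAM_PO * t) + a_pb0 * grow

t = 7 * 30.44                          # the recommended ~7-month interval
print(f"{po210(100.0, 0.0, t):.0f}")   # ~65: % of 210Pb grown in as 210Po
```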

  8. Source contributions to carbonaceous species in PM2.5 and their uncertainty analysis at typical urban, peri-urban and background sites in southeast China

    International Nuclear Information System (INIS)

    Niu, Zhenchuan; Wang, Sen; Chen, Jinsheng; Zhang, Fuwang; Chen, Xiaoqiu; He, Chi; Lin, Lifeng; Yin, Liqian; Xu, Lingling

    2013-01-01

    Determination of 14C and levoglucosan can provide insights into the quantification of source contributions to carbonaceous aerosols, yet there is still uncertainty in the partitioning of organic carbon (OC) into biomass burning OC (OCbb) and biogenic emission OC (OCbio). Carbonaceous species, levoglucosan and 14C in PM2.5 were measured at three types of site in southeast China, combined with Latin hypercube sampling, with the objectives of studying source contributions to total carbon (TC) and their uncertainties, and of evaluating the influence of levoglucosan/OCbb ratios on the partitioning of OCbb and OCbio. It was reliably found that fossil fuel combustion is the main contributor (62.90–72.23%) to TC at the urban and peri-urban sites. Biogenic emissions contribute substantially (winter, 52.98%; summer, 45.71%) to TC at the background site. With increasing levoglucosan/OCbb ratios, the contribution of OCbio increases while that of OCbb decreases, following an approximately natural-logarithmic pattern over a given range. -- Highlights: •Source contributions to OC and EC were quantified by levoglucosan and 14C. •Fossil fuel combustion is the main contributor to TC at urban and peri-urban sites. •Biogenic emissions contribute substantially to TC at the background site. •Biomass burning is a minor contributor to TC, with a higher contribution in winter. •The ratios of OCbio and OCbb to TC have a natural-logarithmic relation with lev/OCbb. -- The contributions of OCbio and OCbb to TC have a natural-logarithmic relationship with the levoglucosan/OCbb ratios
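
    The partitioning logic can be made concrete in a few lines: 14C splits total carbon into fossil and non-fossil parts, levoglucosan fixes OCbb, and OCbio is the non-fossil remainder, so the assumed levoglucosan/OCbb ratio drives the trade-off noted above. All numbers, including the contemporary-carbon reference fm_ref, are invented for illustration:

```python
# Minimal apportionment sketch following the abstract's logic.
fm_sample = 0.45    # measured fraction modern (14C) of total carbon
fm_ref = 1.06       # assumed fraction modern of purely contemporary carbon
tc = 10.0           # total carbon, ug/m3
lev = 0.12          # levoglucosan, ug/m3

tc_fossil = tc * (1.0 - fm_sample / fm_ref)
tc_nonfossil = tc - tc_fossil

for r in (0.08, 0.14, 0.20):        # assumed levoglucosan/OCbb ratios
    oc_bb = lev / r                 # biomass-burning carbon from levoglucosan
    oc_bio = tc_nonfossil - oc_bb   # the non-fossil remainder is biogenic
    print(f"lev/OCbb={r:.2f}: OCbb={oc_bb:.2f}, OCbio={oc_bio:.2f} ug/m3")
```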

  9. The incidence of the different sources of noise on the uncertainty in radiochromic film dosimetry using single channel and multichannel methods

    Science.gov (United States)

    González-López, Antonio; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen

    2017-11-01

    The influence of the various sources of noise on the uncertainty in radiochromic film (RCF) dosimetry using single channel and multichannel methods is investigated in this work. These sources of noise are extracted from pixel value (PV) readings and dose maps. Pieces of an RCF were each irradiated to different uniform doses, ranging from 0 to 1092 cGy. The pieces were then read at two resolutions (72 and 150 ppi) with two flatbed scanners, an Epson 10000XL and an Epson V800, representing two states of technology. Noise was extracted as described in ISO 15739 (2013), separating its distinct constituents: random noise and fixed pattern (FP) noise. Regarding the PV maps, FP noise is the main source of noise for both models of digitizer. Also, the standard deviation of the random noise in the 10000XL model is almost twice that of the V800 model. In the dose maps, the FP noise is smaller in the multichannel method than in the single channel ones. However, random noise is higher in this method throughout the dose range. In the multichannel method, FP noise is reduced as a consequence of this method's ability to eliminate channel-independent perturbations. However, the random noise increases, because the dose is calculated as a linear combination of the doses obtained by the single channel methods. The values of the coefficients of this linear combination are obtained in the present study, and the root of the sum of their squares is shown to range between 0.9 and 1.9 over the dose range studied. These results indicate that random noise plays a fundamental role in the uncertainty of RCF dosimetry: low levels of random noise are required in the digitizer to fully exploit the advantages of the multichannel dosimetry method. This is particularly important for measuring high doses at high spatial resolutions.
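
    The ISO 15739-style separation used here rests on a simple identity: averaging N repeated scans suppresses the random component by a factor of sqrt(N) while the fixed pattern survives. A synthetic sketch of that decomposition (noise levels and image sizes are arbitrary, not measured values):

```python
import numpy as np

rng = np.random.default_rng(3)
fp = rng.normal(0.0, 4.0, (256, 256))               # fixed-pattern noise map
scans = fp + rng.normal(0.0, 2.5, (10, 256, 256))   # + random noise, 10 scans

sigma_rand = scans.std(axis=0, ddof=1).mean()       # varies scan to scan
mean_scan = scans.mean(axis=0)                      # FP survives averaging
# Remove the residual random part (reduced by 1/sqrt(10)) in quadrature:
sigma_fp = np.sqrt(max(mean_scan.var(ddof=1) - sigma_rand**2 / 10, 0.0))
print(f"random ~ {sigma_rand:.2f}, fixed-pattern ~ {sigma_fp:.2f}")  # ~2.5, ~4.0
```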

  10. Remote sensing of spring phenology in northeastern forests: A comparison of methods, field metrics and sources of uncertainty

    Science.gov (United States)

    Katharine White; Jennifer Pontius; Paul Schaberg

    2014-01-01

    Current remote sensing studies of phenology have been limited to coarse spatial or temporal resolution and often lack a direct link to field measurements. To address this gap, we compared remote sensing methodologies using Landsat Thematic Mapper (TM) imagery to extensive field measurements in a mixed northern hardwood forest. Five vegetation indices, five mathematical...

  11. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing the responses of different catchments, for understanding catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures, which can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty

  12. Quality assurance of nuclear analytical techniques based on Bayesian characteristic limits

    International Nuclear Information System (INIS)

    Michel, R.

    2000-01-01

    Based on Bayesian statistics, characteristic limits such as decision threshold, detection limit and confidence limits can be calculated taking into account all sources of experimental uncertainties. This approach separates the complete evaluation of a measurement according to the ISO Guide to the Expression of Uncertainty in Measurement from the determination of the characteristic limits. Using the principle of maximum entropy the characteristic limits are determined from the complete standard uncertainty of the measurand. (author)
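
    In this framing the decision threshold and detection limit fall out of the uncertainty function ũ(y) alone: y* = k(1−α)·ũ(0), and the detection limit solves the fixed point y# = y* + k(1−β)·ũ(y#). A generic sketch of that iteration (the quadratic counting-style uncertainty model is an assumption for illustration, not from the paper):

```python
from scipy.stats import norm

def characteristic_limits(u_of_y, alpha=0.05, beta=0.05, tol=1e-9):
    """Decision threshold y* and detection limit y#, given a function
    u_of_y returning the standard uncertainty of the measurand as a
    function of its assumed true value."""
    k_a, k_b = norm.ppf(1 - alpha), norm.ppf(1 - beta)
    y_star = k_a * u_of_y(0.0)           # decision threshold
    y_hash = 2.0 * y_star                # starting guess for the fixed point
    while True:
        y_new = y_star + k_b * u_of_y(y_hash)
        if abs(y_new - y_hash) < tol:
            return y_star, y_new
        y_hash = y_new

# Toy counting model: the variance grows linearly with the (Poisson) signal.
u0 = 0.8   # standard uncertainty of the net result at zero activity
limits = characteristic_limits(lambda y: (u0**2 + 0.05 * y) ** 0.5)
print("y* = %.2f, y# = %.2f" % limits)
```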

  13. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  14. Hotspots of gross emissions from the land use sector: patterns, uncertainties, and leading emission sources for the period 2000-2005 in the tropics

    Science.gov (United States)

    Roman-Cuesta, Rosa Maria; Rufino, Mariana C.; Herold, Martin; Butterbach-Bahl, Klaus; Rosenstock, Todd S.; Herrero, Mario; Ogle, Stephen; Li, Changsheng; Poulter, Benjamin; Verchot, Louis; Martius, Christopher; Stuiver, John; de Bruin, Sytze

    2016-07-01

    According to the latest report of the Intergovernmental Panel on Climate Change (IPCC), emissions must be cut by 41–72% below 2010 levels by 2050 for a likely chance of containing the global mean temperature increase to 2 °C. The AFOLU sector (Agriculture, Forestry and Other Land Use) contributes roughly a quarter (~10–12 Pg CO2e yr−1) of the net anthropogenic GHG emissions, mainly from deforestation, fire, wood harvesting, and agricultural emissions including croplands, paddy rice, and livestock. In spite of the importance of this sector, it is unclear where the regions with hotspots of AFOLU emissions are and how uncertain these emissions are. Here we present a novel, spatially comparable dataset containing annual mean estimates of gross AFOLU emissions (CO2, CH4, N2O), associated uncertainties, and leading emission sources, in a spatially disaggregated manner (0.5°) for the tropics for the period 2000-2005. Our data highlight the following: (i) the existence of AFOLU emissions hotspots on all continents, with particular importance of evergreen rainforest deforestation in Central and South America, fire in dry forests in Africa, and both peatland emissions and agriculture in Asia; (ii) a predominant contribution of forests and CO2 to the total AFOLU emissions (69%) and to their uncertainties (98%); (iii) higher gross fluxes from forests, which coincide with higher uncertainties, making agricultural hotspots appealing for effective mitigation action; and (iv) a lower contribution of non-CO2 agricultural emissions to the total gross emissions (ca. 25%), with livestock (15.5%) and rice (7%) leading the emissions. Gross AFOLU tropical emissions of 8.0 (5.5–12.2) Pg CO2e yr−1 were in the range of other databases (8.4 and 8.0 Pg CO2e yr−1 in FAOSTAT and the Emissions Database for Global Atmospheric Research (EDGAR), respectively), but we offer a spatially detailed benchmark for monitoring progress in reducing emissions from the land sector in the tropics. The location of

  15. GBIS (Geodetic Bayesian Inversion Software): Rapid Inversion of InSAR and GNSS Data to Estimate Surface Deformation Source Parameters and Uncertainties

    Science.gov (United States)

    Bagnardi, M.; Hooper, A. J.

    2017-12-01

    Inversions of geodetic observational data, such as Interferometric Synthetic Aperture Radar (InSAR) and Global Navigation Satellite System (GNSS) measurements, are often performed to obtain information about the source of surface displacements. Inverse problem theory has been applied to study magmatic processes, the earthquake cycle, and other phenomena that cause deformation of the Earth's interior and of its surface. Together with increasing improvements in data resolution, both spatial and temporal, new satellite missions (e.g., European Commission's Sentinel-1 satellites) are providing the unprecedented opportunity to access space-geodetic data within hours from their acquisition. To truly take advantage of these opportunities we must become able to interpret geodetic data in a rapid and robust manner. Here we present the open-source Geodetic Bayesian Inversion Software (GBIS; available for download at http://comet.nerc.ac.uk/gbis). GBIS is written in Matlab and offers a series of user-friendly and interactive pre- and post-processing tools. For example, an interactive function has been developed to estimate the characteristics of noise in InSAR data by calculating the experimental semi-variogram. The inversion software uses a Markov-chain Monte Carlo algorithm, incorporating the Metropolis-Hastings algorithm with adaptive step size, to efficiently sample the posterior probability distribution of the different source parameters. The probabilistic Bayesian approach allows the user to retrieve estimates of the optimal (best-fitting) deformation source parameters together with the associated uncertainties produced by errors in the data (and by scaling, errors in the model). The current version of GBIS (V1.0) includes fast analytical forward models for magmatic sources of different geometry (e.g., point source, finite spherical source, prolate spheroid source, penny-shaped sill-like source, and dipping-dike with uniform opening) and for dipping faults with uniform
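
    A stripped-down version of the sampler class GBIS describes (random-walk Metropolis-Hastings with an adaptive step size); this is a generic Python sketch on a toy posterior, not GBIS's Matlab implementation:

```python
import numpy as np

def adaptive_mh(log_post, x0, n_iter=20_000, target=0.234, step0=0.1):
    """Random-walk Metropolis-Hastings with a Robbins-Monro step-size
    adaptation toward a target acceptance rate."""
    rng = np.random.default_rng(7)
    x = np.asarray(x0, dtype=float)
    lp, step, chain = log_post(x), step0, []
    for i in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        accept = np.log(rng.random()) < lp_prop - lp
        if accept:
            x, lp = prop, lp_prop
        step *= np.exp((float(accept) - target) / np.sqrt(i + 1.0))
        chain.append(x.copy())
    return np.asarray(chain)

# Toy "posterior": independent Gaussians standing in for source parameters.
chain = adaptive_mh(lambda p: -0.5 * np.sum(p**2 / np.array([1.0, 4.0])),
                    x0=[0.0, 0.0])
print(chain.mean(axis=0).round(2), chain.std(axis=0).round(2))  # ~[0 0], ~[1 2]
```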

  16. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
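
    One way to see the "purely probabilistic" treatment in action is a nested (double-loop) Monte Carlo: epistemic parameters are sampled in an outer loop and aleatory variability in an inner loop, yielding a spread of failure probabilities rather than a single value. Everything below is an invented toy, not the paper's chloride-corrosion model:

```python
import numpy as np

rng = np.random.default_rng(5)
n_epi, n_ale = 200, 5000

pf = np.empty(n_epi)
for i in range(n_epi):
    mu = rng.normal(2.0, 0.3)                      # epistemic: uncertain mean
    load = rng.lognormal(np.log(mu), 0.25, n_ale)  # aleatory: chloride "load"
    capacity = rng.normal(3.2, 0.4, n_ale)         # aleatory: resistance
    pf[i] = np.mean(load > capacity)               # conditional failure prob.

print(f"median pf {np.median(pf):.3f}, 90% epistemic band "
      f"[{np.quantile(pf, 0.05):.3f}, {np.quantile(pf, 0.95):.3f}]")
```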

  17. Fire Danger of Interaction Processes of Local Sources with a Limited Energy Capacity and Condensed Substances

    OpenAIRE

    Glushkov, Dmitry Olegovich; Strizhak, Pavel Alexandrovich; Vershinina, Kseniya Yurievna

    2015-01-01

    Numerical investigation of the flammable interaction processes of local energy sources with liquid condensed substances has been carried out. The basic integrated characteristic of the process, the ignition delay time, has been determined for different energy source parameters. Recommendations have been formulated to ensure fire safety of technological processes characterized by the possible formation of local heat sources (cutting, welding, friction, metal grinding, etc.) in the vicinity of areas of storage, transportation, transfer and processing of flammable liquids (gasoline, kerosene, diesel fuel).

  18. Fire Danger of Interaction Processes of Local Sources with a Limited Energy Capacity and Condensed Substances

    Directory of Open Access Journals (Sweden)

    Glushkov Dmitrii O.

    2015-01-01

    Numerical investigation of the flammable interaction processes of local energy sources with liquid condensed substances has been carried out. The basic integrated characteristic of the process, the ignition delay time, has been determined for different energy source parameters. Recommendations have been formulated to ensure fire safety of technological processes characterized by the possible formation of local heat sources (cutting, welding, friction, metal grinding, etc.) in the vicinity of areas of storage, transportation, transfer and processing of flammable liquids (gasoline, kerosene, diesel fuel).

  19. Uncertainty in biological monitoring: a framework for data collection and analysis to account for multiple sources of sampling bias

    Science.gov (United States)

    Ruiz-Gutierrez, Viviana; Hooten, Melvin B.; Campbell Grant, Evan H.

    2016-01-01

    Biological monitoring programmes are increasingly relying upon large volumes of citizen-science data to improve the scope and spatial coverage of information, challenging the scientific community to develop design and model-based approaches to improve inference. Recent statistical models in ecology have been developed to accommodate false-negative errors, although current work points to false-positive errors as equally important sources of bias. This is of particular concern for the success of any monitoring programme given that rates as small as 3% could lead to the overestimation of the occurrence of rare events by as much as 50%, and even small false-positive rates can severely bias estimates of occurrence dynamics. We present an integrated, computationally efficient Bayesian hierarchical model to correct for false-positive and false-negative errors in detection/non-detection data. Our model combines independent, auxiliary data sources with field observations to improve the estimation of false-positive rates, when a subset of field observations cannot be validated a posteriori or assumed as perfect. We evaluated the performance of the model across a range of occurrence rates, false-positive and false-negative errors, and quantity of auxiliary data. The model performed well under all simulated scenarios, and we were able to identify critical auxiliary data characteristics which resulted in improved inference. We applied our false-positive model to a large-scale, citizen-science monitoring programme for anurans in the north-eastern United States, using auxiliary data from an experiment designed to estimate false-positive error rates. Not correcting for false-positive rates resulted in biased estimates of occupancy in 4 of the 10 anuran species we analysed, leading to an overestimation of the average number of occupied survey routes by as much as 70%. The framework we present for data collection and analysis is able to efficiently provide reliable inference for
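
    The claim that even small false-positive rates badly inflate rare-event occurrence is easy to reproduce with a naive estimator that ignores misidentification. The rates below are illustrative assumptions, not the anuran programme's values:

```python
import numpy as np

rng = np.random.default_rng(11)
n_sites, n_visits = 2000, 3
psi, p_det, p_fp = 0.08, 0.5, 0.03   # occupancy, detection, false-positive

occupied = rng.random(n_sites) < psi
p_obs = np.where(occupied, p_det, p_fp)          # per-visit detection prob.
detections = rng.random((n_visits, n_sites)) < p_obs
naive_psi = detections.any(axis=0).mean()        # ignores false positives

print(f"true psi = {psi:.2f}, naive estimate = {naive_psi:.2f} "
      f"({100 * (naive_psi / psi - 1):+.0f}% bias)")
```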

  20. Size of the virtual source behind a convex spherical surface emitting a space charge limited ion current

    International Nuclear Information System (INIS)

    Chavet, I.

    1987-01-01

    A plasma source fitted with a circular orifice and emitting a space charge limited ion current can be made to operate with a convex spherical plasma boundary (meniscus) by appropriately adjusting its extraction parameters. In this case, the diameter of the virtual source behind the meniscus is much smaller than the orifice diameter. The effective value of this virtual source diameter depends significantly on various practical factors that are more or less controllable. Its lower ideal limit, however, depends only on the ratio δ of the interelectrode distance to the meniscus curvature radius and on the ratio ω of the initial to final ion energy. This ideal limit is given for the ranges 0.1 ≤ δ ≤ 10 and 10−7 ≤ ω ≤ 10−3. Preliminary experimental results are reported. (orig.)

  1. European inter-comparison of Monte Carlo codes users for the uncertainty calculation of the kerma in air beside a caesium-137 source; Intercomparaison europeenne d'utilisateurs de codes monte carlo pour le calcul d'incertitudes sur le kerma dans l'air aupres d'une source de cesium-137

    Energy Technology Data Exchange (ETDEWEB)

    De Carlan, L.; Bordy, J.M.; Gouriou, J. [CEA Saclay, LIST, Laboratoire National Henri Becquerel, Laboratoire de Metrologie de la Dose 91 - Gif-sur-Yvette (France)

    2010-07-01

    Within the frame of the CONRAD European project (Coordination Network for Radiation Dosimetry), and more precisely within a work group devoted to uncertainty assessment in computational dosimetry and aiming at comparing different approaches, the authors report the simulation of an irradiator containing a caesium-137 source to calculate the kerma in air as well as its uncertainty due to different parameters. They present the problem geometry, recall the studied issues (kerma uncertainty, influence of the source capsule, influence of the collimator, influence of the air volume surrounding the source), indicate the codes which have been used (MCNP, Fluka, Penelope, etc.), and discuss the results obtained for the first issue.

  2. Limitations of Phased Array Beamforming in Open Rotor Noise Source Imaging

    Science.gov (United States)

    Horvath, Csaba; Envia, Edmane; Podboy, Gary G.

    2013-01-01

    Phased array beamforming results of the F31/A31 historical baseline counter-rotating open rotor blade set were investigated for measurement data taken on the NASA Counter-Rotating Open Rotor Propulsion Rig in the 9- by 15-Foot Low-Speed Wind Tunnel of NASA Glenn Research Center as well as data produced using the LINPROP open rotor tone noise code. The planar microphone array was positioned broadside and parallel to the axis of the open rotor, roughly 2.3 rotor diameters away. The results provide insight as to why the apparent noise sources of the blade passing frequency tones and interaction tones appear at their nominal Mach radii instead of at the actual noise sources, even if those locations are not on the blades. Contour maps corresponding to the sound fields produced by the radiating sound waves, taken from the simulations, are used to illustrate how the interaction patterns of circumferential spinning modes of rotating coherent noise sources interact with the phased array, often giving misleading results, as the apparent sources do not always show where the actual noise sources are located. This suggests that a more sophisticated source model would be required to accurately locate the sources of each tone. The results of this study also have implications with regard to the shielding of open rotor sources by airframe empennages.

  3. REQUIREMENTS FOR THE LIMITATION OF POPULATION EXPOSURE FROM NATURAL IONIZING IRRADIATION SOURCES IN INDUSTRIAL CONDITIONS

    Directory of Open Access Journals (Sweden)

    I. P. Stamat

    2010-01-01

    The paper presents conceptually new requirements for the limitation of population exposure from natural ionizing irradiation sources in industrial conditions, introduced into the Basic Sanitary Rules of Radiation Safety (OSPORB-99/2010). It is shown that the introduction of these requirements is aimed, first of all, at resolving a variety of previously existing serious contradictions in the organization of radiation safety control and supervision of the impact of natural ionizing irradiation sources in industry.

  4. An approach to consider behavioral plasticity as a source of uncertainty when forecasting species' response to climate change.

    Science.gov (United States)

    Muñoz, Antonio-Román; Márquez, Ana Luz; Real, Raimundo

    2015-06-01

    The rapid ecological shifts that are occurring due to climate change present major challenges for managers and policymakers and, therefore, are one of the main concerns for environmental modelers and evolutionary biologists. Species distribution models (SDMs) are appropriate tools for assessing the relationship between species distributions and environmental conditions, and so are customarily used to forecast the biogeographical response of species to climate change. A serious limitation of species distribution models when forecasting the effects of climate change is that they normally assume that species behavior and climatic tolerances will remain constant through time. In this study, we propose a new methodology, based on fuzzy logic, for incorporating the potential capacity of species to adapt to new conditions into species distribution models. Our results demonstrate that it is possible to include different behavioral responses of species when predicting the effects of climate change on species distributions. The favorability models offered in this study show two extremes: one considering that the species will not modify its present behavior, and another assuming that the species will take full advantage of the possibilities offered by an increase in environmental favorability. This methodology may provide a more realistic approach to the assessment of the consequences of global change for species distribution and conservation. Overlooking the potential of species' phenotypic plasticity may under- or overestimate the predicted response of species to changes in environmental drivers and its effects on species distribution. Using this approach, we could reinforce the science behind conservation planning in the current situation of rapid climate change.

  5. Estimation of Source Parameters of Historical Major Earthquakes from 1900 to 1970 around Asia and Analysis of Their Uncertainties

    Science.gov (United States)

    Han, J.; Zhou, S.

    2017-12-01

    Asia, located at the junction of the Eurasian, Pacific, and Indo-Australian plates, is the continent with the highest seismicity. An earthquake catalogue based on modern seismic network recordings has been established in Asia since around 1970; the catalogue before 1970 is much less accurate because of the scarcity of stations. With a modern earthquake catalogue covering less than 50 years, research in seismology has been quite limited. With the appearance of improved Earth velocity structure models, modified locating methods and high-accuracy Optical Character Recognition techniques, travel time data of earthquakes from 1900 to 1970 can be included in research, and more accurate locations can be determined for historical earthquakes. Hence, the parameters of these historical earthquakes can be obtained more precisely, and research methods such as the ETAS model can be applied on a much longer time scale. This work focuses on the following three aspects: (1) Relocating more than 300 historical major earthquakes (M≥7.0) in Asia based on the Shide Circulars, International Seismological Summary and EHB Bulletin instrumental records between 1900 and 1970. (2) Calculating the focal mechanisms of more than 50 events from first-motion records of P waves in the ISS. (3) Inferring the focal mechanisms of historical major earthquakes based on geological data, the tectonic stress field and the relocation results.

  6. Verification of uncertainty budgets

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    …, and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing … the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between…

  7. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  8. Pollen source and resource limitation to fruit production in the rare species Eremosparton songoricum (Fabaceae)

    Science.gov (United States)

    Eremosparton songoricum (Litv.) Vass. is a rare, central Asian desert species which shows lower fruit set and seed set (<16%) than most hermaphroditic species. We hypothesized that fruit production was limited by pollen and resources. To evaluate potential fruit abortion due to pollen limitation, su...

  9. Quantification of allyl hexanoate in pineapple beverages and yogurts as a case study to characterise a source of uncertainty in dietary exposure assessment to flavouring substances.

    Science.gov (United States)

    Raffo, A; D'Aloise, A; Magrì, A D; Leclercq, C

    2012-01-01

    One source of uncertainty in the estimation of dietary exposure to flavouring substances is the uncertainty in the occurrence and concentration levels of these substances naturally present or added to foodstuffs. The aim of this study was to assess the variability of concentration levels of allyl hexanoate, considered as a case study, in two main food categories to which it is often added: pineapple juice-based beverages and yogurts containing pineapple. Thirty-four beverages and 29 yogurts, with pineapple fruit or juice and added flavourings declared as ingredients on the package, were purchased from the local market (in Rome) and analysed. Analytical methods based on the stir bar sorptive extraction (SBSE) technique for the isolation of the target analyte, and on GC-MS analysis for final determination, were developed for the two food categories. In beverages, allyl hexanoate concentrations ranged from less than 0.01 to 16.71 mg l−1, whereas in yogurts they ranged from 0.02 to 89.41 mg kg−1. Average concentrations in beverages and yogurts with pineapple as the main fruit ingredient (1.91 mg l−1 for beverages, 9.61 mg kg−1 for yogurts) were in fair agreement with average use level data reported from industry surveys for the relevant food categories (4.5 and 6.0 mg kg−1, respectively). Within the group of yogurts a single product was found to contain a level of allyl hexanoate more than 10-fold higher than the average reported use level. The screening techniques developed by the European Food Safety Authority (EFSA) using use level data provided by industry gave estimates of exposure that were of the same order of magnitude as the estimates obtained for regular consumers who would be loyal to the pineapple yogurt and beverage products containing the highest observed concentration of the substance of interest. In this specific case the uncertainty in the results obtained with the use of standard screening techniques for exposure assessment based on industry

  10. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    Minville, M.; Brissette, F.; Leconte, R.

    2008-01-01

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is however great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty groups together the uncertainty linked to the different approaches and models needed to transform climate data so that they can be used by hydrological models (such as downscaling methods) and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and outline the importance of taking these sources into account for any impact and adaptation studies. Recommendations are outlined for such studies. (author)

  11. Limited Impact of Setup and Range Uncertainties, Breathing Motion, and Interplay Effects in Robustly Optimized Intensity Modulated Proton Therapy for Stage III Non-small Cell Lung Cancer

    NARCIS (Netherlands)

    Inoue, Tatsuya; Widder, Joachim; van Dijk, Lisanne V; Takegawa, Hideki; Koizumi, Masahiko; Takashina, Masaaki; Usui, Keisuke; Kurokawa, Chie; Sugimoto, Satoru; Saito, Anneyuko I; Sasai, Keisuke; Van't Veld, Aart A; Langendijk, Johannes A; Korevaar, Erik W

    2016-01-01

    Purpose: To investigate the impact of setup and range uncertainties, breathing motion, and interplay effects using scanning pencil beams in robustly optimized intensity modulated proton therapy (IMPT) for stage III non-small cell lung cancer (NSCLC). Methods and Materials: Three-field IMPT plans

  12. 40 CFR Table 12 to Subpart Xxxx of... - Continuous Compliance With the Emission Limits for Tire Cord Production Affected Sources

    Science.gov (United States)

    2010-07-01

    Table 12 to Subpart XXXX of 40 CFR Part 63 (Protection of Environment; National Emission Standards for Hazardous Air Pollutants: Rubber Tire Manufacturing): Continuous Compliance With the Emission Limits for Tire Cord Production Affected Sources.

  13. 40 CFR Table 10 to Subpart Xxxx of... - Continuous Compliance With the Emission Limits for Tire Production Affected Sources

    Science.gov (United States)

    2010-07-01

    Table 10 to Subpart XXXX of 40 CFR Part 63 (Protection of Environment; National Emission Standards for Hazardous Air Pollutants: Rubber Tire Manufacturing): Continuous Compliance With the Emission Limits for Tire Production Affected Sources.

  14. 40 CFR Table 6 to Subpart Xxxx of... - Initial Compliance With the Emission Limits for Tire Production Affected Sources

    Science.gov (United States)

    2010-07-01

    Table 6 to Subpart XXXX of 40 CFR Part 63 (Protection of Environment; National Emission Standards for Hazardous Air Pollutants: Rubber Tire Manufacturing): Initial Compliance With the Emission Limits for Tire Production Affected Sources.

  15. 40 CFR Table 7 to Subpart Xxxx of... - Initial Compliance With the Emission Limits for Tire Cord Production Affected Sources

    Science.gov (United States)

    2010-07-01

    Table 7 to Subpart XXXX of 40 CFR Part 63 (Protection of Environment; National Emission Standards for Hazardous Air Pollutants: Rubber Tire Manufacturing): Initial Compliance With the Emission Limits for Tire Cord Production Affected Sources.

  16. Assessment of nuclear power sources in Czechoslovakia with respect to radiation protection limits

    International Nuclear Information System (INIS)

    Melichar, Z.

    1985-01-01

    The principles are presented which underlie the determination of limits of planned population exposure during normal operation of nuclear installations and of reference levels of exceptional population exposure during nuclear power plant accidents. The introduction of authorized limits and levels in Czechoslovakia, the USSR, CMEA countries and Sweden is discussed. An estimate is made of the radiation burden of the population during the development of the Czechoslovak nuclear power programme. (E.S.)

  17. 77 FR 29167 - Effluent Limitations Guidelines and New Source Performance Standards for the Airport Deicing...

    Science.gov (United States)

    2012-05-16

    ... of drinking water sources (both surface and groundwater), creation of noxious odors and discolored... individual water bodies as the guidelines are developed; see Statement of Senator Muskie (October 4, 1972... biological process is contained in a sealed reactor, odors are eliminated. Based on EPA sampling results, the...

  18. Influence of current limitation on voltage stability with voltage sourced converter HVDC

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Jóhannsson, Hjörtur; Hansen, Anca Daniela

    2013-01-01

    A first study of voltage stability with relevant amount of Voltage Sourced Converter based High Voltage Direct Current (VSC-HVDC) transmission is presented, with particular focus on the converters’ behaviour when reaching their rated current. The detrimental effect of entering the current...

  19. Secrecy versus openness : Internet security and the limits of open source and peer production

    NARCIS (Netherlands)

    Schmidt, A.

    2014-01-01

    Open source and peer production have been praised as organisational models that could change the world for the better. It is commonly asserted that almost any societal activity could benefit from distributed, bottom-up collaboration — by making societal interaction more open, more social, and more

  20. Analysis of the rate setting system on Russia population exposure limitation by natural irradiation sources

    International Nuclear Information System (INIS)

    Stamat, I.P.

    2009-01-01

    This work analyses the legal framework for ensuring the radiation safety of the Russian population exposed to natural sources of radiation, a system built up over the past 30 years. Ways of improving it and harmonizing it with the recommendations of authoritative international organizations are examined.

  1. Monte Carlo modelling of impurity ion transport for a limiter source/sink

    International Nuclear Information System (INIS)

    Stangeby, P.C.; Farrell, C.; Hoskins, S.; Wood, L.

    1988-01-01

    In relating the impurity influx Φ_I(0) (atoms per second) into a plasma from the edge to the central impurity ion density n_I(0) (ions·m⁻³), it is necessary to know the value of τ_I^SOL, the average dwell time of impurity ions in the scrape-off layer. It is usually assumed that τ_I^SOL = L_c/c_s, the hydrogenic dwell time, where L_c is the limiter connection length and c_s is the hydrogenic ion acoustic speed. Monte Carlo ion transport results are reported here which show that, for a wall (uniform) influx, τ_I^SOL is longer than L_c/c_s, while for a limiter influx it is shorter. Thus for a limiter influx n_I(0) is predicted to be smaller than the reference value. Impurities released from the limiter form ever larger 'clouds' of successively higher ionization stages. These are reproduced by the Monte Carlo code, as are the cloud shapes for a localized impurity injection far from the limiter. (author). 23 refs, 18 figs, 6 tabs
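
    The reference estimate quoted above is easy to evaluate numerically. The following is a minimal sketch, not the paper's Monte Carlo code; the edge temperatures, ion mass and connection length are illustrative placeholders.

      import math

      E_CHARGE = 1.602e-19   # elementary charge (C)
      M_PROTON = 1.673e-27   # proton mass (kg)

      def ion_acoustic_speed(te_ev, ti_ev, mass_amu):
          # Hydrogenic ion acoustic speed c_s = sqrt((Te + Ti) e / m_i), in m/s
          return math.sqrt((te_ev + ti_ev) * E_CHARGE / (mass_amu * M_PROTON))

      # Assumed, representative scrape-off-layer parameters (placeholders)
      c_s = ion_acoustic_speed(te_ev=25.0, ti_ev=25.0, mass_amu=2.0)  # deuterium
      L_c = 20.0                       # limiter connection length (m)
      tau_ref = L_c / c_s              # reference dwell time L_c / c_s

      print(f"c_s = {c_s:.3e} m/s, reference tau_SOL = {tau_ref:.3e} s")
      # Per the abstract, the Monte Carlo dwell time is longer than this
      # reference for a wall (uniform) influx and shorter for a limiter influx.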

  2. An information-theoretic basis for uncertainty analysis: application to the QUASAR severe accident study

    International Nuclear Information System (INIS)

    Unwin, S.D.; Cazzoli, E.G.; Davis, R.E.; Khatib-Rahbar, M.; Lee, M.; Nourbakhsh, H.; Park, C.K.; Schmidt, E.

    1989-01-01

    The probabilistic characterization of uncertainty can be problematic in circumstances where there is a paucity of supporting data and limited experience on which to base engineering judgement. Information theory provides a framework in which to address this issue through reliance upon entropy-related principles of uncertainty maximization. We describe an application of such principles in the United States Nuclear Regulatory Commission-sponsored program QUASAR (Quantification and Uncertainty Analysis of Source Terms for Severe Accidents in Light Water Reactors). (author)

  3. Limitation of fusion power plant installation on future power grids under the effect of renewable and nuclear power sources

    Energy Technology Data Exchange (ETDEWEB)

    Takeda, Shutaro, E-mail: takeda.shutarou.55r@st.kyoto-u.ac.jp [Graduate School of Advanced Integrated Studies in Human Survivability, Kyoto University, Kyoto, Kyoto (Japan); Sakurai, Shigeki [Graduate School of Advanced Integrated Studies in Human Survivability, Kyoto University, Kyoto, Kyoto (Japan); Yamamoto, Yasushi [Faculty of Engineering Science, Kansai University, Suita, Osaka (Japan); Kasada, Ryuta; Konishi, Satoshi [Institute of Advanced Energy, Kyoto University, Uji, Kyoto (Japan)

    2016-11-01

    Highlights:
    • Future power grids would be unstable due to renewable and nuclear power sources.
    • Output interruptions of fusion plant would cause disturbances to future grids.
    • Simulation results suggested they would create limitations in fusion installation.
    • A novel diagram was presented to illustrate this suggested limitation.
    Abstract: Future power grids would be unstable because of the larger share of renewable and nuclear power sources. This instability might bring some additional difficulties to fusion plant installation. Therefore, the authors carried out a quantitative feasibility study from the aspect of grid stability through simulation. Results showed that the more renewable and nuclear sources are linked to a grid, the greater disturbance the grid experiences upon a sudden output interruption of a fusion power plant, e.g. plasma disruption. The frequency deviations surpassed 0.2 Hz on some grids, suggesting potential limitations of fusion plant installation on future grids. To clearly show the suggested limitations of fusion plant installations, a novel diagram was presented.

  4. IR Image upconversion using band-limited ASE illumination fiber sources.

    Science.gov (United States)

    Maestre, H; Torregrosa, A J; Capmany, J

    2016-04-18

    We study the field-of-view (FOV) of an upconversion imaging system that employs an Amplified Spontaneous Emission (ASE) fiber source to illuminate a transmission target. As an intermediate case between narrowband laser and thermal illumination, an ASE fiber source allows for higher spectral intensity than thermal illumination and still keeps a broad wavelength spectrum to take advantage of an increased non-collinear phase-matching angle acceptance that enlarges the FOV of the upconversion system when compared to using narrowband laser illumination. A model is presented to predict the angular acceptance of the upconverter in terms of focusing and ASE spectral width and allocation. The model is experimentally checked in case of 1550-630 nm upconversion.

  5. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  6. Electro-mechanical impact system excited by a source of limited power

    Czech Academy of Sciences Publication Activity Database

    Půst, Ladislav

    2008-01-01

    Roč. 15, č. 6 (2008), s. 1-10 ISSN 1802-1484 R&D Projects: GA ČR GA101/06/0063 Institutional research plan: CEZ:AV0Z20760514 Keywords : mechanical oscillations * impacts * limited power of exciter * electro-mechanical interaction Subject RIV: BI - Acoustics

  7. A study on the uncertainty based on Meteorological fields on Source-receptor Relationships for Total Nitrate in the Northeast Asia

    Science.gov (United States)

    Sunwoo, Y.; Park, J.; Kim, S.; Ma, Y.; Chang, I.

    2010-12-01

    Northeast Asia hosts more than one third of the world's population, and its emissions of pollutants tend to increase rapidly because of economic growth and rising consumption of energy-intensive goods. For air pollutants in particular, the characteristics of emissions and transport have become a national issue, in terms not only of local environmental quality but also of long-range transboundary transport. Meteorologically, the region lies in the westerlies, which means that air pollutants emitted in China can be delivered to South Korea; considering meteorological factors is therefore important for understanding air pollution phenomena. In this study, we used MM5 (Fifth-Generation Mesoscale Model) and WRF (Weather Research and Forecasting Model) to produce the meteorological fields, analyzed the physics options of each model and the differences arising from the characteristics of WRF and MM5, and examined the uncertainty of source-receptor relationships for total nitrate in Northeast Asia due to the choice of meteorological fields. The two sets of meteorological fields were produced with the same domain, the same initial and boundary conditions, and the most closely matching physics options. S-R relationships in terms of amount and fractional number for total nitrate (sum of N from HNO3, nitrate and PAN) were calculated by EMEP method 3.

  8. MARS--a project of the diffraction-limited fourth generation X-ray source based on supermicrotron

    CERN Document Server

    Kulipanov, G N; Vinokurov, N A

    2001-01-01

    The new approach for the fourth generation X-ray source - the Multiturn Accelerator-Recuperator Source (MARS) - was proposed recently. The installation consists of a radiofrequency (RF) multiturn accelerator (similar to the race-track microtron) and long undulator(s). After passing through the undulator(s) the electron beam is decelerated in the same RF accelerating structure. Such energy recovery dramatically reduces the radiation hazard and decreases the required RF power. In this paper we present a more detailed explanation of this scheme, and further specify the parameter limitations and requirements for the accelerator.

  9. Planejamento agregado da produção ótimo com limite mínimo de estoque influenciado pelas incertezas de demanda Optimal aggregate production planning with minimum inventory limit affected by demand uncertainties

    Directory of Open Access Journals (Sweden)

    Oscar S. Silva Filho

    1995-04-01

    probability distribution function assumed as Gaussian. Thus the problem studied here is a stochastic planning problem with a probabilistic constraint on the inventory level variable. It is shown that it is possible to obtain, by means of appropriate transformations, a deterministic equivalent formulation, for which an open-loop solution to the stochastic problem can be generated using any applicable mathematical programming algorithm. It is also shown that the uncertainty concerning demand fluctuation appears explicitly in the deterministic formulation through a constraint function that represents the minimum inventory level limit, or safety stock. This function is essentially concave, increases over time, and depends on the variance of the inventory level and on the probability level supplied by the user for the inventory constraint. To illustrate the theoretical developments, a simple example of a single-product system is proposed and solved by dynamic programming. The open-loop solution (i.e. the approximate solution from an equivalent problem) is compared with the true solution obtained directly from the stochastic problem via stochastic dynamic programming.
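
    The probabilistic inventory constraint described above has a standard deterministic equivalent under the Gaussian assumption: requiring P(I_t >= I_min) >= alpha is the same as requiring the mean inventory to exceed I_min by z_alpha standard deviations. A minimal sketch with illustrative numbers (not the authors' model) follows.

      from statistics import NormalDist

      def min_inventory_limit(i_min, sigma_t, alpha):
          # Deterministic equivalent of P(I_t >= i_min) >= alpha for a
          # Gaussian inventory level: E[I_t] >= i_min + z_alpha * sigma_t.
          z_alpha = NormalDist().inv_cdf(alpha)
          return i_min + z_alpha * sigma_t

      # Illustrative only: inventory variance grows over the horizon, so the
      # effective safety-stock limit is an increasing function of time,
      # as stated in the abstract.
      for t, sigma in enumerate([10.0, 14.1, 17.3, 20.0], start=1):
          print(t, round(min_inventory_limit(0.0, sigma, alpha=0.95), 1))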

  10. Estimation of reactive power sources dynamic limits for Volt / VAr control

    International Nuclear Information System (INIS)

    Orozco Alvarado, Juan Jose

    2013-01-01

    A generic model of capability curves is obtained from the theoretical capability curves of distributed generators and reactive compensation elements. The generic model is structured as a simplified point method: eight strategic points are taken from two detailed curves of a generator and, through a series of interpolations, the limits on the generator's reactive power delivery/consumption are estimated. The theory of electric generation elements and reactive power compensation is reviewed. 'Reactive power / active power' capability curves are obtained, for different voltage values, for wind generators with full converters and doubly fed wind generators, photovoltaic generators with inverters, and synchronous generators. 'Reactive power / line voltage' capability curves are obtained for static var compensators (SVC). The generic limits of generators and SVCs are estimated from the capability curves.

  11. Statistical analysis of the limitation of half integer resonances on the available momentum acceptance of the High Energy Photon Source

    Energy Technology Data Exchange (ETDEWEB)

    Jiao, Yi, E-mail: jiaoyi@ihep.ac.cn; Duan, Zhe

    2017-01-01

    In a diffraction-limited storage ring, half integer resonances can have strong effects on the beam dynamics, associated with the large detuning terms from the strong focusing and strong sextupoles as required for an ultralow emittance. In this study, the limitation of half integer resonances on the available momentum acceptance (MA) was statistically analyzed based on one design of the High Energy Photon Source (HEPS). It was found that the probability of MA reduction due to crossing of half integer resonances is closely correlated with the level of beta beats at the nominal tunes, but independent of the error sources. The analysis indicated that for the presented HEPS lattice design, the rms amplitude of beta beats should be kept below 1.5% horizontally and 2.5% vertically to reach a small MA reduction probability of about 1%.

  12. Biomarker fingerprinting : application and limitations for source identification and correlation of oils and petroleum products

    International Nuclear Information System (INIS)

    Wang, Z.; Fingas, M.F.; Yang, C.; Hollebone, B.

    2004-01-01

    Biological markers or biomarkers are complex molecules originating from formerly living organisms. They are among the most important hydrocarbon groups in petroleum because every crude oil exhibits an essentially unique biomarker or fingerprint due to the wide variety of geological conditions under which oil is formed. When found in crude oils, rocks and sediments, biomarkers have the same structures as their parent organic molecules. Therefore, chemical analysis of source-characteristic and environmentally persistent biomarkers can provide valuable information in determining the source of spilled oil. Biomarkers can also be used to differentiate oils and to monitor the degradation process and the weathering state of oils under a range of conditions. The use of biomarker techniques to study oil spills has increased significantly in recent years. This paper provided case studies to demonstrate: (1) biomarker distribution in weathered oil and in petroleum products with similar chromatographic profiles, (2) sesquiterpenes and diamondoid biomarkers in oils and light petroleum products, (3) unique biomarker compounds in oils, (4) diagnostic ratios of biomarkers, and (5) biodegradation of biomarkers. It was noted that the trend to use biomarkers to study oil spills will continue. Continuous advances in analytical methods will further improve the application of oil hydrocarbon fingerprinting for environmental studies. 36 refs., 5 tabs., 12 figs

  13. Performance and limitations of positron emission tomography (PET) scanners for imaging very low activity sources.

    Science.gov (United States)

    Freedenberg, Melissa I; Badawi, Ramsey D; Tarantal, Alice F; Cherry, Simon R

    2014-02-01

    Emerging applications for positron emission tomography (PET) may require the ability to image very low activity source distributions in the body. The performance of clinical PET scanners in the regime where activity in the field of view is very low has not been well characterized. In measurements at very low activities (using a line source in the NEMA scatter phantom), the BGO-based scanner significantly outperformed the LSO-based scanner. This was largely due to the effect of background counts emanating from naturally occurring but radioactive ¹⁷⁶Lu within the LSO detector material, which dominates the observed counting rate at the lowest activities. Increasing the lower energy threshold from 350 keV to 425 keV in an attempt to reduce this background did not significantly improve the measured NECR performance. The measured singles rate due to ¹⁷⁶Lu emissions within the scanner energy window was also found to be dependent on temperature, and to be affected by the operation of the CT component, making approaches to correct or compensate for the background more challenging. We conclude that for PET studies in a very low activity range, BGO-based scanners are likely to have better performance because of the lack of significant background. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  14. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Sharma, Anitha Kumari; Ledin, Anna

    2015-01-01

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source-control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model, in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model
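
    The 'conceptual accumulation/washoff model' mentioned in this record is a standard stormwater-quality formulation; the abstract does not give its equations, so the following generic sketch (saturating build-up in dry weather, first-order washoff during rain) uses invented parameter values.

      import math

      def buildup_washoff(rain_mm, k_build=0.3, m_max=10.0, k_wash=0.2):
          # Generic sketch, not the study's calibrated model.
          # Dry steps: stored mass (kg/ha) builds towards m_max.
          # Wet steps: washoff load is first-order in stored mass and rain depth.
          mass, loads = 0.0, []
          for r in rain_mm:
              if r <= 0.0:
                  mass += k_build * (m_max - mass)   # dry-weather accumulation
                  loads.append(0.0)
              else:
                  load = mass * (1.0 - math.exp(-k_wash * r))
                  mass -= load                        # washoff during runoff
                  loads.append(load)
          return loads

      print(buildup_washoff([0, 0, 0, 5, 0, 0, 12]))  # rain depth per step (mm)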

  15. Sensitivities and detection limits of X-ray fluorescence analysis with a 10 mCi ²⁴¹Am source

    International Nuclear Information System (INIS)

    Wundt, K.; Janghorbani, M.; Starke, K.

    1976-01-01

    Seven trace elements ranging from chromium to barium were preconcentrated on Amberlite IR-120 cation exchange paper and determined in an energy dispersive X-ray fluorescence system using a 10 mCi ²⁴¹Am source. Sensitivities were experimentally determined and checked against theoretically calculated values. The detection limits are compared with elemental levels present in typical surface waters and those allowed in drinking water. Appropriate conclusions as to the feasibility of such a system for environmental monitoring are drawn. (orig.)
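
    The abstract does not state which detection-limit convention was used; a common choice for counting measurements of this kind is Currie's formulation, reproduced here for reference with generic symbols (background counts N_B, sensitivity m in net counts per unit analyte mass):

      \[
        L_C \approx 2.33\sqrt{N_B}, \qquad
        L_D \approx 2.71 + 4.65\sqrt{N_B}, \qquad
        c_{\min} = \frac{L_D}{m},
      \]

    so a lower background and a higher sensitivity both push the minimum detectable concentration c_min down, which is why the comparison with surface-water and drinking-water levels hinges on both quantities.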

  16. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.
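
    The abstract does not spell out how the dominant uncertainty subspace is found; a common building block for subspace methods of this kind is a rank-revealing decomposition of a sensitivity matrix, sketched below with random placeholder data (an illustration of the general idea, not the authors' algorithm).

      import numpy as np

      rng = np.random.default_rng(0)

      # Placeholder sensitivity matrix: rows = reactor attributes (k_eff,
      # reaction rates, ...), columns = nuclear data parameters.
      S = rng.normal(size=(50, 2000))

      # Right singular vectors span the directions in parameter space that
      # the attributes are actually sensitive to (the dominant subspace).
      U, s, Vt = np.linalg.svd(S, full_matrices=False)
      r = int((s > 1e-3 * s[0]).sum())   # keep modes above a tolerance
      basis = Vt[:r].T                   # (n_params, r) subspace basis

      # Any subsequent optimization search is restricted to r coordinates
      # instead of the full 2000-dimensional parameter space.
      print(f"full dimension: {S.shape[1]}, subspace dimension: {r}")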

  17. Estimation of source term in radiation emergencies from field measurements: its potential and limitations

    International Nuclear Information System (INIS)

    Hukkoo, R.K.; Bapat, V.N.

    1991-01-01

    During the 'early phase' of a radiation emergency, the data on the nature and quantity of releases needed to assess the radiological impact may not be readily available, delaying the initiation of the steps necessary to contain the event and mitigate its effects. An iterative method based on field measurements carried out at two concentric rings around the point of release is proposed to estimate the atmospheric release at ground level and at stack height. The program logic has been evaluated for internal consistency, and its utility and limitations are discussed. (author). 8 figs., 4 tabs., 4 refs
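
    The abstract gives only the outline of the two-ring iterative scheme. One way such a scheme can be posed is as a two-unknown linear system for the ground-level and stack-height release rates, with dispersion factors re-evaluated on each iteration; the sketch below is a hypothetical illustration of that idea, not the authors' program, and all numbers are invented.

      import numpy as np

      # Hypothetical dispersion factors F[i][j]: contribution of release j
      # (0 = ground level, 1 = stack height) to the ring-i average reading,
      # per unit release rate; in practice these come from a plume model.
      F = np.array([[4.0e-6, 1.0e-6],    # inner ring
                    [9.0e-7, 6.0e-7]])   # outer ring

      measured = np.array([2.6e-3, 7.2e-4])  # ring-averaged readings (invented)

      # With two rings and two unknowns the system solves directly; a full
      # iteration would update F for revised plume parameters (stability
      # class, wind) and repeat until the release estimates converge.
      q_ground, q_stack = np.linalg.solve(F, measured)
      print(f"ground release ~ {q_ground:.3e}, stack release ~ {q_stack:.3e}")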

  18. A LIMIT ON THE NUMBER OF ISOLATED NEUTRON STARS DETECTED IN THE ROSAT ALL-SKY-SURVEY BRIGHT SOURCE CATALOG

    International Nuclear Information System (INIS)

    Turner, Monica L.; Rutledge, Robert E.; Letcavage, Ryan; Shevchuk, Andrew S. H.; Fox, Derek B.

    2010-01-01

    Using new and archival observations made with the Swift satellite and other facilities, we examine 147 X-ray sources selected from the ROSAT All-Sky-Survey Bright Source Catalog (RASS/BSC) to produce a new limit on the number of isolated neutron stars (INSs) in the RASS/BSC, the most constraining such limit to date. Independent of X-ray spectrum and variability, the number of INSs is ≤48 (90% confidence). Restricting attention to soft (kT_eff < 200 eV), non-variable X-ray sources, as in a previous study, yields an all-sky limit of ≤31 INSs. In the course of our analysis, we identify five new high-quality INS candidates for targeted follow-up observations. A future all-sky X-ray survey with eROSITA, or another mission with similar capabilities, can be expected to increase the detected population of X-ray-discovered INSs from the 8-50 in the BSC, to (for a disk population) 240-1500, which will enable a more detailed study of neutron star population models.

  19. Advancing Uncertainty: Untangling and Discerning Related Concepts

    Directory of Open Access Journals (Sweden)

    Janice Penrod

    2002-12-01

    Methods of advancing concepts within the qualitative paradigm have been developed and articulated. In this section, I describe methodological perspectives of a project designed to advance the concept of uncertainty using multiple qualitative methods. Through a series of earlier studies, the concept of uncertainty arose repeatedly in varied contexts, working its way into prominence, and warranting further investigation. Processes of advanced concept analysis were used to initiate the formal investigation into the meaning of the concept. Through concept analysis, the concept was deconstructed to identify conceptual components and gaps in understanding. Using this skeletal framework of the concept identified through concept analysis, subsequent studies were carried out to add ‘flesh’ to the concept. First, a concept refinement using the literature as data was completed. Findings revealed that the current state of the concept of uncertainty failed to incorporate what was known of the lived experience. Therefore, using interview techniques as the primary data source, a phenomenological study of uncertainty among caregivers was conducted. Incorporating the findings of the phenomenology, the skeletal framework of the concept was further fleshed out using techniques of concept correction to produce a more mature conceptualization of uncertainty. In this section, I describe the flow of this qualitative project investigating the concept of uncertainty, with special emphasis on a particular threat to validity (called conceptual tunnel vision) that was identified and addressed during the phases of concept correction. Though in this article I employ a study of uncertainty for illustration, limited substantive findings regarding uncertainty are presented to retain a clear focus on the methodological issues.

  20. Distance limit for a class of model gamma-ray burst sources

    Science.gov (United States)

    Schmidt, W. K. H.

    1978-01-01

    It is pointed out that MeV photons have actually been observed in bursts. These observations imply that the nonrelativistic sources cannot be further away than a few kpc from the sun and, therefore, must be galactic. The 27 April 1972 event observed by Apollo 16 shows at higher energies a power law spectrum with a possible line feature around 4 MeV. The optical depth of a homogeneous, isotropic radiation field is estimated with the aid of formulae used by Nikishov (1962) and Jauch and Rohrlich (1955). On the basis of an investigation of the various factors involved, it is tentatively suggested that the gamma-ray bursts which have been detected are galactic, but are in the majority of the cases not connected with unique irreversible star transformation. It appears also unlikely that the gamma-ray bursts are connected with galactic novae.

  1. Developing open source, self-contained disease surveillance software applications for use in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Campbell Timothy C

    2012-09-01

    Background: Emerging public health threats often originate in resource-limited countries. In recognition of this fact, the World Health Organization issued revised International Health Regulations in 2005, which call for significantly increased reporting and response capabilities for all signatory nations. Electronic biosurveillance systems can improve the timeliness of public health data collection, aid in the early detection of and response to disease outbreaks, and enhance situational awareness. Methods: As components of its Suite for Automated Global bioSurveillance (SAGES) program, The Johns Hopkins University Applied Physics Laboratory developed two open-source, electronic biosurveillance systems for use in resource-limited settings. OpenESSENCE provides web-based data entry, analysis, and reporting. ESSENCE Desktop Edition provides similar capabilities for settings without internet access. Both systems may be configured to collect data using locally available cell phone technologies. Results: ESSENCE Desktop Edition has been deployed for two years in the Republic of the Philippines. Local health clinics have rapidly adopted the new technology to provide daily reporting, thus eliminating the two-to-three week data lag of the previous paper-based system. Conclusions: OpenESSENCE and ESSENCE Desktop Edition are two open-source software products with the capability of significantly improving disease surveillance in a wide range of resource-limited settings. These products, and other emerging surveillance technologies, can assist resource-limited countries in complying with the revised International Health Regulations.

  2. Photosynthate supply and utilization in alfalfa: a developmental shift from a source to a sink limitation of photosynthesis

    International Nuclear Information System (INIS)

    Baysdorfer, C.; Bassham, J.A.

    1985-01-01

    Long-term carbon dioxide enrichment, ¹⁴CO₂ feeding, and partial defoliation were employed as probes to investigate source/sink limitations of photosynthesis during the development of symbiotically grown alfalfa. In the mature crop, long-term CO₂ enrichment does not affect the rates of net photosynthesis, relative growth, ¹⁴C export to nonphotosynthetic organs, or the rates of ¹⁴C label incorporation into leaf sucrose, starch, or malate. The rate of glycolate labeling is, however, substantially reduced under these conditions. When the mature crop was partially defoliated, a considerable increase in net photosynthesis occurred in the remaining leaves. In the seedling crop, long-term CO₂ enrichment increased dry matter accumulation, primarily as a result of increases in leaf starch content. Although the higher rates of starch synthesis are not maintained, the growth enhancement of the enriched plants persisted throughout the experimental period. These results imply a source limitation of seedling photosynthesis and a sink limitation of photosynthesis in more mature plants. Consequently, both the supply and the utilization of photosynthate may limit seasonal photosynthesis in alfalfa.

  3. Estimation of the uncertainties considered in NPP PSA level 2

    International Nuclear Information System (INIS)

    Kalchev, B.; Hristova, R.

    2005-01-01

    The main approaches to uncertainty analysis are presented. The sources of uncertainty that should be considered in PSA level 2 for WWER reactors are defined, such as: uncertainties propagated from level 1 PSA; uncertainties in input parameters; uncertainties related to the modelling of physical phenomena during the accident progression; and uncertainties related to the estimation of source terms. The methods for estimation of these uncertainties are also discussed in this paper.

  4. Optimum extracted H- and D- current densities from gas-pressure-limited high-power hydrogen/deuterium tandem ion sources

    International Nuclear Information System (INIS)

    Hiskes, J.R.

    1993-01-01

    The tandem hydrogen/deuterium ion source is modelled for the purpose of identifying the maximum current densities that can be extracted subject to the gas-pressure constraints proposed for contemporary beam-line systems. Optimum useful extracted current densities are found to be in the range of approximately 7 to 10 mA cm⁻². The sensitivity of these current densities is examined subject to uncertainties in the underlying atomic/molecular rate processes. A principal uncertainty remains the quantification of the molecular vibrational distribution following H₃⁺ wall collisions.

  5. The deflection angle of a gravitational source with a global monopole in the strong field limit

    International Nuclear Information System (INIS)

    Cheng Hongbo; Man Jingyun

    2011-01-01

    We investigate the gravitational lensing effect in the strong field background around a Schwarzschild black hole with extremely small mass and a solid deficit angle subject to a global monopole, by means of the strong field limit approach. We obtain the angular position and magnification of the relativistic images and show that they depend on the global monopole parameter η. With increasing η, the minimum impact parameter u_m and the angular separation s increase while the relative magnification r decreases; s grows sharply once η becomes large enough, and the deflection angle also becomes larger as η grows. The effect of the solid deficit angle is thus the dependence of the angular position, angular separation, relative magnification and deflection angle on the parameter η, which may offer a way to characterize some possible distinct signatures of a Schwarzschild black hole with a solid deficit angle associated with a global monopole.

  6. Principles for limiting exposure of the public to natural sources of radiation

    International Nuclear Information System (INIS)

    1984-01-01

    In a preliminary note a discussion is presented of the factors by which the values of Annual Limits on Intakes (ALI) and Derived Air Concentrations (DAC) recommended in ICRP Publication 30 for workers would differ from those that would be appropriate for members of the public. In Publication 39, the principles adopted distinguish between procedures for existing exposure situations, which can only be influenced by remedial action, and examples of future exposure situations which can be subject to administrative control (e.g. new house construction, reduction of ventilation in existing houses, production of building materials from new production facilities, water supplies from new facilities, burning natural gas from new wells, using fertiliser from new mills and factories). (U.K.)

  7. Sensitivity and uncertainty analyses applied to one-dimensional radionuclide transport in a layered fractured rock: Evaluation of the Limit State approach, Iterative Performance Assessment, Phase 2

    International Nuclear Information System (INIS)

    Wu, Y.T.; Gureghian, A.B.; Sagar, B.; Codell, R.B.

    1992-12-01

    The Limit State approach is based on partitioning the parameter space into two parts: one in which the performance measure is smaller than a chosen value (called the limit state), and the other in which it is larger. Through a Taylor expansion at a suitable point, the partitioning surface (called the limit state surface) is approximated as either a linear or quadratic function. The success and efficiency of the limit state method depend upon choosing an optimum point for the Taylor expansion. The point in the parameter space that has the highest probability of producing the value chosen as the limit state is optimal for expansion. When the parameter space is transformed into a standard Gaussian space, the optimal expansion point, known as the Most Probable Point (MPP), has the property that its location on the limit state surface is closest to the origin. Additionally, the projections onto the parameter axes of the vector from the origin to the MPP are the sensitivity coefficients. Once the MPP is determined and the limit state surface approximated, formulas (see Equations 4-7 and 4-8) are available for determining the probability of the performance measure being less than the limit state. By choosing a succession of limit states, the entire cumulative distribution of the performance measure can be determined. Methods for determining the MPP and also for improving the estimate of the probability are discussed in this report.
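
    The construction described above corresponds to the standard first-order reliability method (FORM). The sketch below illustrates the MPP search for a toy limit-state function in an already-standard Gaussian space; the function, dimension and limit value are invented for illustration.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def g(u):
          # Toy limit-state function: g(u) = 0 defines the limit-state surface.
          return 3.0 - u[0] - 0.5 * u[1] ** 2

      # MPP = the point on the limit-state surface closest to the origin.
      res = minimize(lambda u: u @ u, x0=np.array([1.0, 1.0]),
                     constraints={"type": "eq", "fun": g})
      mpp = res.x
      beta = np.linalg.norm(mpp)    # reliability index
      p = norm.cdf(-beta)           # first-order probability estimate

      # Components of the unit vector towards the MPP are the sensitivity
      # coefficients mentioned in the abstract.
      print(f"MPP = {mpp}, beta = {beta:.3f}, P = {p:.4f}, alpha = {mpp / beta}")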

  8. Decommissioning funding: ethics, implementation, uncertainties

    International Nuclear Information System (INIS)

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  9. GYRO-ORBIT SIZE, BRIGHTNESS TEMPERATURE LIMIT, AND IMPLAUSIBILITY OF COHERENT EMISSION BY BUNCHING IN SYNCHROTRON RADIO SOURCES

    International Nuclear Information System (INIS)

    Singal, Ashok K.

    2012-01-01

    We show that an upper limit on the maximum brightness temperature for a self-absorbed incoherent synchrotron radio source is obtained from the size of its gyro orbits, which in turn must lie well within the confines of the total source extent. These temperature limits are obtained without recourse to inverse Compton effects or the condition of equipartition of energy between magnetic fields and relativistic particles. For radio variables, the intra-day variability implies brightness temperatures ∼10¹⁹ K in the comoving rest frame of the source. This, if interpreted purely as incoherent synchrotron emission, would imply gyroradii >10²⁸ cm, the size of the universe, while from causality arguments the inferred maximum size of the source in such a case is ∼10¹⁵ cm. Such high brightness temperatures are sometimes modeled in the literature as some coherent emission process where bunches of non-thermal particles are somehow formed that radiate in phase. We show that, unlike in the case of the curvature radiation models proposed for pulsars, in the synchrotron radiation mechanism the oppositely charged particles would contribute together to the coherent phenomenon without the need to form separate bunches of the opposite charges. At the same time we show that bunches would disperse over dimensions larger than a wavelength in a time shorter than the gyro orbital period (≲ 0.1 s). Therefore, coherent emission by bunches cannot be a plausible explanation of the high brightness temperatures inferred in extragalactic radio sources showing variability over a few hours or longer.
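
    The causality bound invoked above is simple to reproduce. Assuming a variability timescale of order one day (an assumption; the abstract quotes only the result),

      \[
        R \lesssim c\,\Delta t
          \approx (3\times10^{10}\,\mathrm{cm\,s^{-1}})\times(8.6\times10^{4}\,\mathrm{s})
          \approx 2.6\times10^{15}\,\mathrm{cm},
      \]

    consistent with the ∼10¹⁵ cm maximum source size quoted above, and many orders of magnitude below the >10²⁸ cm gyroradii implied by a purely incoherent synchrotron interpretation of the inferred brightness temperatures.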

  10. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  11. Model structures amplify uncertainty in predicted soil carbon responses to climate change.

    Science.gov (United States)

    Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien

    2018-06-04

    Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbial explicit model project much greater uncertainties to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projection. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.

  12. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  13. Identifying and Analyzing Uncertainty Structures in the TRMM Microwave Imager Precipitation Product over Tropical Ocean Basins

    Science.gov (United States)

    Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.

    2016-01-01

    Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50% of the daily uncertainty variability, with only limited dependence on the regions of interest.

  14. Physiology of Aspergillus niger in Oxygen-Limited Continuous Cultures: Influence of Aeration, Carbon Source Concentration and Dilution Rate

    DEFF Research Database (Denmark)

    Diano, Audrey; Peeters, J.; Dynesen, Jens Østergaard

    2009-01-01

    In industrial production of enzymes using the filamentous fungus Aspergillus niger, supply of sufficient oxygen is often a limitation, resulting in the formation of by-products such as polyols. In order to identify the mechanisms behind formation of the different by-products, we studied the effect of low oxygen availability, at different carbon source concentrations and at different specific growth rates, on the metabolism of A. niger, using continuous cultures. The results show that there is an increase in the production of tricarboxylic acid (TCA) cycle intermediates at low oxygen concentrations...

  15. INTERCOMPARISON ON THE MEASUREMENT OF THE QUANTITY PERSONAL DOSE EQUIVALENT HP(10) IN PHOTON FIELDS. LINEARITY DEPENDENCE, LOWER LIMIT OF DETECTION AND UNCERTAINTY IN MEASUREMENT OF DOSIMETRY SYSTEMS OF INDIVIDUAL MONITORING SERVICES IN GABON AND GHANA.

    Science.gov (United States)

    Ondo Meye, P; Schandorf, C; Amoako, J K; Manteaw, P O; Amoatey, E A; Adjei, D N

    2017-12-01

    An inter-comparison study was conducted to assess the capability of dosimetry systems of individual monitoring services (IMSs) in Gabon and Ghana to measure the personal dose equivalent Hp(10) in photon fields. The performance indicators assessed were the lower limit of detection, linearity and uncertainty in measurement. Monthly and quarterly recording levels were proposed, with corresponding values of 0.08 and 0.025 mSv, and 0.05 and 0.15 mSv, for the TLD and OSL systems, respectively. The linearity dependence of the dosimetry systems was assessed following the requirements given in Standard IEC 62387 of the International Electrotechnical Commission (IEC). The results obtained for the two systems were satisfactory. The procedure followed for the uncertainty assessment is the one given in the IEC technical report TR 62461. The maximum relative overall uncertainties, in absolute value, expressed in terms of Hp(10), for the TL dosimetry system Harshaw 6600, are 44.35% for true doses below 0.40 mSv and 36.33% for true doses ≥0.40 mSv. For the OSL dosimetry system microStar, the maximum relative overall uncertainties, in absolute value, are 52.17% for true doses below 0.40 mSv and 37.43% for true doses ≥0.40 mSv. These results are in good agreement with the requirements for accuracy of the International Commission on Radiological Protection. When expressing the uncertainties in terms of response, comparison with the IAEA requirements for overall accuracy showed that the uncertainty results were also acceptable. The values of Hp(10) directly measured by the two dosimetry systems showed a significant underestimation for the Harshaw 6600 system and a slight overestimation for the microStar system. After correction for linearity of the measured doses, the two dosimetry systems gave better and comparable results. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Eliciting and Combining Decision Criteria Using a Limited Palette of Utility Functions and Uncertainty Distributions: Illustrated by Application to Pest Risk Analysis.

    Science.gov (United States)

    Holt, Johnson; Leach, Adrian W; Schrader, Gritta; Petter, Françoise; MacLeod, Alan; van der Gaag, Dirk Jan; Baker, Richard H A; Mumford, John D

    2014-01-01

    Utility functions in the form of tables or matrices have often been used to combine discretely rated decision-making criteria. Matrix elements are usually specified individually, so no one rule or principle can be easily stated for the utility function as a whole. A series of five matrices are presented that aggregate criteria two at a time using simple rules that express a varying degree of constraint of the lower rating over the higher. A further nine possible matrices were obtained by using a different rule either side of the main axis of the matrix to describe situations where the criteria have a differential influence on the outcome. Uncertainties in the criteria are represented by three alternative frequency distributions from which the assessors select the most appropriate. The output of the utility function is a distribution of rating frequencies that is dependent on the distributions of the input criteria. In pest risk analysis (PRA), seven of these utility functions were required to mimic the logic by which assessors for the European and Mediterranean Plant Protection Organization arrive at an overall rating of pest risk. The framework enables the development of PRAs that are consistent and easy to understand, criticize, compare, and change. When tested in workshops, PRA practitioners thought that the approach accorded with both the logic and the level of resolution that they used in the risk assessments. © 2013 Society for Risk Analysis.
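
    As a concrete illustration of the kind of two-at-a-time aggregation matrix the abstract describes, the sketch below combines two criteria rated 1-5 with a rule in which the lower rating constrains the higher to a varying degree; the rule and ratings are invented, not taken from the EPPO scheme.

      def combine(a, b, constraint=0.5):
          # constraint = 1.0 reproduces a pure minimum rule (full constraint
          # of the lower rating over the higher); 0.0 a pure maximum rule.
          lo, hi = min(a, b), max(a, b)
          return round(constraint * lo + (1.0 - constraint) * hi)

      # Tabulate the 5x5 utility matrix implied by one choice of rule; using
      # different `constraint` values either side of the diagonal would model
      # criteria with a differential influence on the outcome.
      for a in range(1, 6):
          print([combine(a, b) for b in range(1, 6)])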

  17. Spectroscopic imaging of limiter heat and particle fluxes and the resulting impurity sources during Wendelstein 7-X startup plasmas.

    Science.gov (United States)

    Stephey, L; Wurden, G A; Schmitz, O; Frerichs, H; Effenberg, F; Biedermann, C; Harris, J; König, R; Kornejew, P; Krychowiak, M; Unterberg, E A

    2016-11-01

    A combined IR and visible camera system [G. A. Wurden et al., "A high resolution IR/visible imaging system for the W7-X limiter," Rev. Sci. Instrum. (these proceedings)] and a filterscope system [R. J. Colchin et al., Rev. Sci. Instrum. 74, 2068 (2003)] were implemented together to obtain spectroscopic data of limiter and first wall recycling and impurity sources during Wendelstein 7-X startup plasmas. Both systems together provided excellent temporal and spatial spectroscopic resolution of limiter 3. Narrowband interference filters in front of the camera yielded C-III and Hα photon flux, and the filterscope system provided Hα, Hβ, He-I, He-II, C-II, and visible bremsstrahlung data. The filterscopes made additional measurements of several points on the W7-X vacuum vessel to yield wall recycling fluxes. The resulting photon flux from both the visible camera and filterscopes can then be compared to an EMC3-EIRENE synthetic diagnostic [H. Frerichs et al., "Synthetic plasma edge diagnostics for EMC3-EIRENE, highlighted for Wendelstein 7-X," Rev. Sci. Instrum. (these proceedings)] to infer both a limiter particle flux and wall particle flux, both of which will ultimately be used to infer the complete particle balance and particle confinement time τ_P.
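
    Inferring a particle flux from a measured line-photon flux, as described above, is commonly done with S/XB ('ionizations per photon') coefficients; the abstract does not state the exact conversion used, so the generic relation is sketched here with invented numbers.

      import math

      def particle_flux(photon_flux, s_xb):
          # Gamma [atoms m^-2 s^-1] ~ 4*pi * (S/XB) * photon flux, for a
          # line-integrated photon flux in photons m^-2 s^-1 sr^-1.
          return 4.0 * math.pi * s_xb * photon_flux

      # Hypothetical H-alpha brightness and S/XB value for edge conditions.
      gamma = particle_flux(photon_flux=5.0e19, s_xb=15.0)
      print(f"recycling particle flux ~ {gamma:.3e} atoms m^-2 s^-1")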

  18. Spectroscopic imaging of limiter heat and particle fluxes and the resulting impurity sources during Wendelstein 7-X startup plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Stephey, L., E-mail: stephey@wisc.edu; Schmitz, O.; Frerichs, H.; Effenberg, F. [University of Wisconsin–Madison, Madison, Wisconsin 53706 (United States); Wurden, G. A. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); Biedermann, C.; König, R.; Kornejew, P.; Krychowiak, M. [Max-Planck-Institut für Plasma Physik, Wendelsteinstrasse 1, 17491 Greifswald (Germany); Harris, J.; Unterberg, E. A. [Oak Ridge National Laboratory, Oak Ridge, Tennessee 37831 (United States)

    2016-11-15

    A combined IR and visible camera system [G. A. Wurden et al., “A high resolution IR/visible imaging system for the W7-X limiter,” Rev. Sci. Instrum. (these proceedings)] and a filterscope system [R. J. Colchin et al., Rev. Sci. Instrum. 74, 2068 (2003)] were implemented together to obtain spectroscopic data of limiter and first wall recycling and impurity sources during Wendelstein 7-X startup plasmas. Both systems together provided excellent temporal and spatial spectroscopic resolution of limiter 3. Narrowband interference filters in front of the camera yielded C-III and Hα photon flux, and the filterscope system provided Hα, Hβ, He-I, He-II, C-II, and visible bremsstrahlung data. The filterscopes made additional measurements of several points on the W7-X vacuum vessel to yield wall recycling fluxes. The resulting photon flux from both the visible camera and filterscopes can then be compared to an EMC3-EIRENE synthetic diagnostic [H. Frerichs et al., “Synthetic plasma edge diagnostics for EMC3-EIRENE, highlighted for Wendelstein 7-X,” Rev. Sci. Instrum. (these proceedings)] to infer both a limiter particle flux and wall particle flux, both of which will ultimately be used to infer the complete particle balance and particle confinement time τ_P.

  19. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models....... This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  20. The Open Source Stochastic Building Simulation Tool SLBM and Its Capabilities to Capture Uncertainty of Policymaking in the U.S. Building Sector

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Marnay, Chris; Azevedo, Ines Lima; Komiyama, Ryoichi; Lai, Judy

    2009-05-14

    The increasing concern about climate change, as well as the expected direct environmental and economic impacts of global warming, will put considerable constraints on the US building sector, which consumes roughly 48 percent of total primary energy, making it the biggest single source of CO2 emissions. It is obvious that the battle against climate change can only be won by considering innovative building approaches and consumer behaviors and bringing new, effective low-carbon technologies to the building/consumer market. However, the limited time available to mitigate climate change is unforgiving of misdirected research and/or policy. This is the reason why Lawrence Berkeley National Lab is working on an open-source, long-range Stochastic Lite Building Module (SLBM) to estimate the impact of different policies and consumer behavior on the market penetration of low-carbon building technologies. SLBM is designed to be a fast-running, user-friendly model that analysts can readily run and modify in its entirety through a visual interface. The tool is fundamentally an engineering-economic model with technology adoption decisions based on the cost and energy performance characteristics of competing technologies. It also incorporates consumer preferences and passive building systems, as well as interactions between technologies (such as internal heat gains). Furthermore, everything is based on service demand, e.g. a certain temperature or luminous intensity, instead of energy intensities. The core objectives of this paper are to demonstrate the practical approach used, to start a discussion process between relevant stakeholders, and to build collaborations.
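
    A rough sketch of the stochastic engineering-economic core described above, assuming hypothetical technology names, costs, and distributions: competing technologies are compared on annualized cost under Monte Carlo draws of the uncertain parameters. The actual SLBM additionally models service demands, consumer preferences, and technology interactions.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical competing space-heating technologies with uncertain
      # capital and annual operating costs, given as (mean, std) in dollars.
      techs = {
          "gas_furnace": {"capex": (4000, 400), "opex": (900, 150)},
          "heat_pump":   {"capex": (9000, 900), "opex": (450, 120)},
      }
      LIFETIME, RATE, N = 15, 0.05, 10_000
      crf = RATE / (1 - (1 + RATE) ** -LIFETIME)  # capital recovery factor

      wins = dict.fromkeys(techs, 0)
      for _ in range(N):
          cost = {name: rng.normal(*p["capex"]) * crf + rng.normal(*p["opex"])
                  for name, p in techs.items()}
          wins[min(cost, key=cost.get)] += 1      # cheapest technology is adopted

      for name, n in wins.items():
          print(f"{name}: adopted in {n / N:.1%} of draws")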

  1. Sources of uncertainties in OSL dating of archaeological mortars: The case study of the Roman amphitheatre “Palais-Gallien” in Bordeaux

    International Nuclear Information System (INIS)

    Urbanova, P.; Hourcade, D.; Ney, C.; Guibert, P.

    2015-01-01

    Archaeological mortars are more convenient and much more representative of the chronology of buildings than brick or wood elements, which can be re-used from older buildings. Before dating unknown samples of mortars, further investigation of OSL from mortars is required and the most efficient methodology needs to be established. In this study we compared the ages obtained by OSL dating of quartz extracted from mortars of the Roman amphitheatre Palais-Gallien in Bordeaux with independent age information. Resetting of the OSL signal occurred during the preparation of the mortar, when grains of sand (quartz) were extracted and mixed with lime and water. The mortar was subsequently hidden from light by embedding within the structure, which is the event to be dated. Various factors contribute to uncertainties in the age determination. The distribution of measured equivalent doses shows large scatter, and optical bleaching of certain grains can be partial due to the short duration of the exposure to light. We worked with the single-grain technique in order to find and select the grains that were sufficiently exposed to daylight. To determine the average equivalent dose, we tried three different approaches: an arithmetic mean, the central age model, and the 3-parameter minimum age model; the latter turned out to be the only relevant way to evaluate the experimental data. The proportion of grains included in the calculation of the average equivalent dose represents 2.7–4.7% of all analysed grains. The results obtained for three out of four samples approach the expected age; however, the minimum doses and the corresponding ages are significantly over-estimated for two samples. The studied material is very coarse, which causes heterogeneity of irradiation at the single-grain scale and contributes to the dispersion of equivalent doses. Different analytical methods (scanning electron microscopy with energy
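
    A toy illustration, on synthetic single-grain data, of why the choice of averaging matters when only a few percent of grains were fully bleached. The actual 3-parameter minimum age model is a maximum-likelihood fit, not the crude percentile rule sketched here; all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic single-grain equivalent doses (Gy): a small, well-bleached
      # population at the true burial dose plus many partially bleached
      # grains that retain an inherited residual dose.
      true_dose = 2.0
      well_bleached = rng.normal(true_dose, 0.2, size=30)
      partially_bleached = true_dose + rng.exponential(3.0, size=570)
      doses = np.concatenate([well_bleached, partially_bleached])

      print(f"arithmetic mean: {doses.mean():.2f} Gy (strong overestimate)")
      # Crude stand-in for a minimum-age estimate: average the lowest few
      # percent of grains, i.e. those most plausibly reset by daylight.
      lowest = np.sort(doses)[: int(0.05 * doses.size)]
      print(f"lowest-5% mean : {lowest.mean():.2f} Gy (close to {true_dose} Gy)")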

  2. Evaluation Procedures of Random Uncertainties in Theoretical Calculations of Cross Sections and Rate Coefficients

    International Nuclear Information System (INIS)

    Kokoouline, V.; Richardson, W.

    2014-01-01

    Uncertainties in theoretical calculations may include:
    • Systematic uncertainty: due to the applicability limits of the chosen model.
    • Random uncertainty: within a model, uncertainties of the model parameters result in uncertainties of the final results (such as cross sections).
    • If uncertainties of experimental and theoretical data are known, then for the purpose of data evaluation (to produce recommended data) one should combine the two data sets to produce best-guess data with the smallest possible uncertainty.
    In many situations it is possible to assess the accuracy of theoretical calculations, because theoretical models usually rely on parameters that are uncertain but not completely random, i.e. the uncertainties of the parameters of the models are approximately known. If there are one or several such parameters with corresponding uncertainties, even if some or all parameters are correlated, the above approach gives a conceptually simple way to calculate the uncertainties of the final cross sections (uncertainty propagation). Numerically, the statistical approach to uncertainty propagation can be computationally expensive. However, in situations where uncertainties are considered to be as important as the actual cross sections (for data validation or benchmark calculations, for example), such a numerical effort is justified. Having data from different sources (say, from theory and experiment), a systematic statistical approach allows one to compare the data and produce "unbiased" evaluated data with improved uncertainties, provided the uncertainties of the initial data from the different sources are available. Without uncertainties, data evaluation/validation becomes impossible. This is the reason why theoreticians should assess the accuracy of their calculations in one way or another. A statistical and systematic approach, similar to that described above, is preferable.
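
    A minimal sketch of the Monte Carlo uncertainty propagation described above, for a hypothetical two-parameter cross-section model with correlated parameter uncertainties (the model form and all numbers are illustrative assumptions):

      import numpy as np

      rng = np.random.default_rng(2)

      # Hypothetical model sigma(E) = a / sqrt(E) + b with uncertain,
      # correlated parameters a (magnitude) and b (offset).
      mean = np.array([10.0, 0.5])
      sd, corr = np.array([0.5, 0.05]), 0.3
      cov = np.array([[sd[0] ** 2, corr * sd[0] * sd[1]],
                      [corr * sd[0] * sd[1], sd[1] ** 2]])
      samples = rng.multivariate_normal(mean, cov, size=20_000)

      E = np.linspace(0.1, 10.0, 50)
      a, b = samples[:, 0:1], samples[:, 1:2]
      sigma = a / np.sqrt(E) + b                  # shape (n_samples, n_E)

      i = np.searchsorted(E, 1.0)
      print(f"sigma(E = {E[i]:.2f}) = {sigma[:, i].mean():.3f} "
            f"+/- {sigma[:, i].std():.3f} (propagated 1-sigma)")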

  3. Intrinsic position uncertainty impairs overt search performance.

    Science.gov (United States)

    Semizer, Yelda; Michel, Melchi M

    2017-08-01

    Uncertainty regarding the position of the search target is a fundamental component of visual search. However, due to perceptual limitations of the human visual system, this uncertainty can arise from intrinsic, as well as extrinsic, sources. The current study sought to characterize the role of intrinsic position uncertainty (IPU) in overt visual search and to determine whether it significantly limits human search performance. After completing a preliminary detection experiment to characterize sensitivity as a function of visual field position, observers completed a search task that required localizing a Gabor target within a field of synthetic luminance noise. The search experiment included two clutter conditions designed to modulate the effect of IPU across search displays of varying set size. In the Cluttered condition, the display was tiled uniformly with feature clutter to maximize the effects of IPU. In the Uncluttered condition, the clutter at irrelevant locations was removed to attenuate the effects of IPU. Finally, we derived an IPU-constrained ideal searcher model, limited by the IPU measured in human observers. Ideal searchers were simulated based on the detection sensitivity and fixation sequences measured for individual human observers. The IPU-constrained ideal searcher predicted performance trends similar to those exhibited by the human observers. In the Uncluttered condition, performance decreased steeply as a function of increasing set size. However, in the Cluttered condition, the effect of IPU dominated and performance was approximately constant as a function of set size. Our findings suggest that IPU substantially limits overt search performance, especially in crowded displays.

  4. Uncertainties in planned dose due to the limited voxel size of the planning CT when treating lung tumors with proton therapy

    International Nuclear Information System (INIS)

    Espana, Samuel; Paganetti, Harald

    2011-01-01

    Dose calculation for lung tumors can be challenging due to the low density and the fine structure of the geometry. The latter is not fully considered in the CT image resolution used in treatment planning, causing the prediction of a more homogeneous tissue distribution. In proton therapy, this could result in predicting an unrealistically sharp distal dose falloff, i.e. an underestimation of the distal dose falloff degradation. The goal of this work was the quantification of such effects. Two computational phantoms resembling a two-dimensional heterogeneous random lung geometry and a swine lung were considered applying a variety of voxel sizes for dose calculation. Monte Carlo simulations were used to compare the dose distributions predicted with the voxel size typically used for the treatment planning procedure with those expected to be delivered using the finest resolution. The results show, for example, distal falloff position differences of up to 4 mm between planned and expected dose at the 90% level for the heterogeneous random lung (assuming treatment plan on a 2 × 2 × 2.5 mm³ grid). For the swine lung, differences of up to 38 mm were seen when airways are present in the beam path when the treatment plan was done on a 0.8 × 0.8 × 2.4 mm³ grid. The two-dimensional heterogeneous random lung phantom apparently does not describe the impact of the geometry adequately because of the lack of heterogeneities in the axial direction. The differences observed in the swine lung between planned and expected dose are presumably due to the poor axial resolution of the CT images used in clinical routine. In conclusion, when assigning margins for treatment planning for lung cancer, proton range uncertainties due to the heterogeneous lung geometry and CT image resolution need to be considered.

  5. Uncertainty Management and Sensitivity Analysis

    DEFF Research Database (Denmark)

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there and LCA is no exception to that. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows one to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes...... suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main...... objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  6. 40 CFR Table 1 to Subpart Oooo of... - Emission Limits for New or Reconstructed and Existing Affected Sources in the Printing, Coating...

    Science.gov (United States)

    2010-07-01

    ... Reconstructed and Existing Affected Sources in the Printing, Coating and Dyeing of Fabrics and Other Textiles... SOURCE CATEGORIES National Emission Standards for Hazardous Air Pollutants: Printing, Coating, and Dyeing...—Emission Limits for New or Reconstructed and Existing Affected Sources in the Printing, Coating and Dyeing...

  7. Multi-source analysis reveals latitudinal and altitudinal shifts in range of Ixodes ricinus at its northern distribution limit

    Directory of Open Access Journals (Sweden)

    Kristoffersen Anja B

    2011-05-01

    Background: There is increasing evidence for a latitudinal and altitudinal shift in the distribution range of Ixodes ricinus. The reported incidence of tick-borne disease in humans is on the rise in many European countries and has raised political concern and attracted media attention. It is disputed which factors are responsible for these trends, though many ascribe shifts in distribution range to climate changes. Any possible climate effect would be most easily noticeable close to the tick's geographical distribution limits. In Norway, the northern limit of this species in Europe, no documentation of changes in range has been published. The objectives of this study were to describe the distribution of I. ricinus in Norway and to evaluate if any range shifts have occurred relative to historical descriptions. Methods: Multiple data sources - such as tick-sighting reports from veterinarians, hunters, and the general public - and surveillance of human and animal tick-borne diseases were compared to describe the present distribution of I. ricinus in Norway. Correlation between data sources and visual comparison of maps revealed spatial consistency. In order to identify the main spatial pattern of tick abundance, a principal component analysis (PCA) was used to obtain a weighted mean of four data sources. The weighted mean explained 67% of the variation of the data sources covering Norway's 430 municipalities and was used to depict the present distribution of I. ricinus. To evaluate if any geographical range shift has occurred in recent decades, the present distribution was compared to historical data from 1943 and 1983. Results: Tick-borne disease and/or observations of I. ricinus were reported in municipalities up to an altitude of 583 metres above sea level (MASL), and the tick is now present in coastal municipalities north to approximately 69°N. Conclusion: I. ricinus is currently found further north and at higher altitudes than described in
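
    A minimal sketch of the weighted-mean construction via PCA, with synthetic stand-ins for the four municipal data sources; the noise levels, weights, and explained variance below are illustrative, not the study's.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical standardized tick-abundance indicators for 430
      # municipalities from four data sources (rows = municipalities).
      n = 430
      latent = rng.normal(size=n)                  # "true" abundance signal
      X = np.column_stack([latent + rng.normal(scale=s, size=n)
                           for s in (0.5, 0.7, 0.9, 1.1)])
      X = (X - X.mean(0)) / X.std(0)               # standardize each source

      # First principal component via SVD; its scores act as a weighted mean.
      U, S, Vt = np.linalg.svd(X, full_matrices=False)
      w = Vt[0] * np.sign(Vt[0].sum())             # fix the arbitrary SVD sign
      weighted_mean = X @ w                        # one score per municipality
      explained = S[0] ** 2 / (S ** 2).sum()
      print("source weights (PC1):", np.round(w, 2))
      print(f"PC1 explains {explained:.0%} of the variance across sources")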

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that lend themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  9. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric models (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAMS and OVIRS during the Detailed Survey mission phase. The models developed during this phase will be used to photometrically correct the OCAMS and OVIRS data. Here we present the analysis of the errors in the photometric corrections. Based on our testing data sets, we find:
    1. The model uncertainties are only correct when calculated with the covariance matrix, because the parameters are highly correlated.
    2. There is no evidence that any single parameter dominates in any of the models.
    3. Both the model error and the data error contribute comparably to the final correction error.
    4. Tests of the uncertainty module on simulated and real data sets show that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases.
    5. The Lommel-Seeliger (L-S) model is more reliable than the others, perhaps because the simulated data are based on the L-S model; the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov performs unphysically on the SOPIE 1 data.
    6. L-S is the better default choice; this conclusion is based mainly on our tests on the SOPIE data and IPDIF.
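
    Point 1 can be made concrete with first-order error propagation: when fitted parameters are strongly correlated, using only the diagonal of the covariance matrix gives the wrong model uncertainty. A sketch for a hypothetical Minnaert fit r = A·μ0^k·μ^(k−1), with all values illustrative:

      import numpy as np

      # Fitted parameters (illustrative) and their covariance, with a
      # strong negative correlation between A and k.
      A, k = 0.04, 0.7
      cov = np.array([[(0.002) ** 2, -0.9 * 0.002 * 0.05],
                      [-0.9 * 0.002 * 0.05, (0.05) ** 2]])

      mu0, mu = np.cos(np.radians(30)), np.cos(np.radians(20))
      r = A * mu0 ** k * mu ** (k - 1)
      # Jacobian of r with respect to (A, k).
      J = np.array([mu0 ** k * mu ** (k - 1),
                    r * (np.log(mu0) + np.log(mu))])

      var_full = J @ cov @ J           # correct: uses the full covariance
      var_diag = J ** 2 @ np.diag(cov) # wrong: ignores the correlation
      print(f"sigma_r with covariance: {np.sqrt(var_full):.2e}")
      print(f"sigma_r diagonal only  : {np.sqrt(var_diag):.2e}")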

  10. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
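
    A minimal Latin Hypercube Sampling sketch of the kind evaluated above: stratified samples on the unit hypercube are propagated through a toy model to estimate the output distribution (the model and dimensions are assumed for illustration).

      import numpy as np

      rng = np.random.default_rng(4)

      def latin_hypercube(n_samples, n_dims, rng):
          """Stratified samples on [0, 1)^d: one point per stratum per dim."""
          strata = np.tile(np.arange(n_samples), (n_dims, 1))
          u = (rng.permuted(strata, axis=1).T
               + rng.random((n_samples, n_dims))) / n_samples
          return u

      # Propagate 3 uniform input parameters through a toy model y = f(x).
      X = latin_hypercube(200, 3, rng)
      y = X[:, 0] ** 2 + 3.0 * X[:, 1] - np.sin(X[:, 2])
      print(f"output mean = {y.mean():.3f}, std = {y.std():.3f}")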

  11. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. There are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the resources of the Federal and State Health Agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  12. e-Assessment in a Limited-Resources Dental School Using an Open-Source Learning Management System.

    Science.gov (United States)

    El Tantawi, Maha M A; Abdelsalam, Maha M; Mourady, Ahmed M; Elrifae, Ismail M B

    2015-05-01

    e-Assessment provides solutions to some problems encountered in dental students' evaluation. The aim of this study was to evaluate the experience of a limited-resources dental school with e-assessment provided through an open-source learning management system (LMS). Data about users' access and types of e-assessment activities at the Faculty of Dentistry, Alexandria University, Egypt, were obtained from the web-based LMS Moodle. A questionnaire developed to assess students' perceptions of the e-assessment was also sent to students registered in two courses (undergraduate and postgraduate) with the same instructor. The results showed that most e-courses at the school had one form of e-assessment (82%) and, of these, 16.7% had summative assessment activities. There were significant differences among departments in the number of e-courses with e-assessment. One-quarter of e-courses with e-assessment used Moodle quizzes. Of 285 students registered in the two courses that included the questionnaire, 170 responded (response rate=59.6%). The responding students positively perceived the impact of e-assessment on learning and its reliability and security, whereas technical issues and related stresses were negatively perceived. This study suggests that e-assessment can be used at minimal cost in dental schools with limited resources and large class sizes, with minimal demands on faculty members' and teaching staff's time. For these schools, an open-source LMS such as Moodle provides formative e-assessment not available otherwise and accommodates various question formats and varying levels of instructors' technical skills. These students seemed to have a positive impression of the e-assessment, although technical problems and related stresses are issues that need to be addressed.

  13. A solubility-limited-source-term model for the geological disposal of cemented intermediate-level waste

    International Nuclear Information System (INIS)

    Robinson, P.C.; Hodgkinson, D.P.; Tasker, P.W.; Lever, D.A.; Windsor, M.E.; Grime, P.W.; Herbert, A.W.

    1988-01-01

    This paper presents and illustrates the use of a source-term model for an intermediate-level radioactive-waste repository. The model deals with the behaviour of long-lived nuclides after the initial containment period. The major processes occurring in the near-field are included, namely sorption, elemental solubility limits, chain decay and transport due to groundwater flow. The model is applied to a realistic example of ILW disposal. From this it is clear that some nuclides are present in sufficient quantities to reach their solubility limit even when the assumed sorption coefficients are large. For these nuclides the precise sorption coefficient is unimportant. It is also clear that some daughter products, in particular Pb-210, become significant. The toxicity of the repository porewater is calculated and it is shown that, although this toxicity is high compared to levels acceptable in drinking water, it is much lower than the toxicity of the waste itself. However, the near-field chemical environment is only one of a number of containment barriers. In addition, it has been shown that the rate at which radionuclides enter the rock surrounding the repository is very low. (author)
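
    A minimal sketch of the solubility-limited release idea, with entirely hypothetical inventory, solubility, and flow values: while the element sits at its solubility limit, the release rate is independent of the remaining inventory.

      import numpy as np

      # Toy solubility-limited source term: porewater concentration is
      # capped at the elemental solubility, and release is advective.
      init = inventory = 1.0e3   # mol of nuclide initially in the waste
      solubility = 1.0e-6        # mol/m3, elemental solubility limit
      flow = 5.0                 # m3/yr, groundwater flux through the near field
      half_life = 2.1e4          # yr
      lam = np.log(2.0) / half_life

      dt, t, released = 1.0, 0.0, 0.0
      while inventory > 0.0 and t < 1.0e5:
          out = min(solubility * flow * dt, inventory)  # solubility-capped release
          inventory -= out + lam * inventory * dt       # release plus decay
          released += out
          t += dt
      print(f"fraction released to the host rock after {t:.0f} yr: "
            f"{released / init:.1e}")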

  14. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  15. [Sources of information on suicide attempts in the Nord - Pas-de-Calais (France). Contributions and limitations].

    Science.gov (United States)

    Plancke, L; Ducrocq, F; Clément, G; Chaud, P; Haeghebaert, S; Amariei, A; Chan-Chee, C; Goldstein, P; Vaiva, G

    2014-12-01

    There are very few permanent indicators of mental health in France; suicidal behavior is often only understood on the basis of deaths by suicide. The epidemiological interest and methodological limits of four medico-administrative databases from which data on suicide attempts can be extracted have been the subject of a study in the Nord - Pas-de-Calais Region of France: telephone calls for emergency medical assistance after a suicide attempt (2009 to 2011), admissions to emergency services with a diagnosis of suicide attempt (2012), medical-surgical hospital admissions as a result of suicide attempt (2009 to 2011), and psychiatric admissions with a diagnosis of suicide attempt (2011). Usable data were provided by one of the two emergency medical assistance units, five of the thirty emergency departments, and all medical-surgical and psychiatric units; in data from the latter two sources, a unique anonymous identifier gave individual statistics, while the first two covered only suicide attempts. In 2011, the number of suicide attempt calls per 100,000 inhabitants was 304, whereas the number of hospitalisations with this diagnosis was 275; rates are highest in men between 20 and 49 years of age, and in women below 20 years of age and between 40 and 49. The sources are very homogeneous with regard to the average age at which the suicide attempt took place (between 37.8 and 38.5 years, depending on the source) and to sex (55.0% to 57.6% women). In 2011, the number of patients with a diagnosis of suicide attempt treated in psychiatry was 2.6 times lower than the number hospitalised for suicide attempt in medical-surgical units (3563 vs 9327). Permanent gathering of data, and the large volume of data recorded, should encourage the use of these databases in the definition and assessment of mental health policy: an increased contribution from emergency call centers and emergency services, and the coding of the suicidal nature of intoxications by a few clearly under-declaring units

  16. Method for estimating effects of unknown correlations in spectral irradiance data on uncertainties of spectrally integrated colorimetric quantities

    Science.gov (United States)

    Kärhä, Petri; Vaskuri, Anna; Mäntynen, Henrik; Mikkonen, Nikke; Ikonen, Erkki

    2017-08-01

    Spectral irradiance data are often used to calculate colorimetric properties, such as color coordinates and color temperatures of light sources by integration. The spectral data may contain unknown correlations that should be accounted for in the uncertainty estimation. We propose a new method for estimating uncertainties in such cases. The method goes through all possible scenarios of deviations using Monte Carlo analysis. Varying spectral error functions are produced by combining spectral base functions, and the distorted spectra are used to calculate the colorimetric quantities. Standard deviations of the colorimetric quantities at different scenarios give uncertainties assuming no correlations, uncertainties assuming full correlation, and uncertainties for an unfavorable case of unknown correlations, which turn out to be a significant source of uncertainty. With 1% standard uncertainty in spectral irradiance, the expanded uncertainty of the correlated color temperature of a source corresponding to the CIE Standard Illuminant A may reach as high as 37.2 K in unfavorable conditions, when calculations assuming full correlation give zero uncertainty, and calculations assuming no correlations yield the expanded uncertainties of 5.6 K and 12.1 K, with wavelength steps of 1 nm and 5 nm used in spectral integrations, respectively. We also show that there is an absolute limit of 60.2 K in the error of the correlated color temperature for Standard Illuminant A when assuming 1% standard uncertainty in the spectral irradiance. A comparison of our uncorrelated uncertainties with those obtained using analytical methods by other research groups shows good agreement. We re-estimated the uncertainties for the colorimetric properties of our 1 kW photometric standard lamps using the new method. The revised uncertainty of color temperature is a factor of 2.5 higher than the uncertainty assuming no correlations.
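
    The two limiting correlation cases can be illustrated by Monte Carlo distortion of a spectrum before integration; the paper's method additionally spans intermediate cases with structured error functions built from spectral base functions. In the sketch below a toy Gaussian-weighted band integral stands in for the colorimetric integrals, and all numbers are illustrative.

      import numpy as np

      rng = np.random.default_rng(5)

      wl = np.arange(380.0, 781.0, 5.0)            # wavelength grid, nm
      # Planck spectrum at 2856 K as a stand-in for CIE Illuminant A.
      planck = 1.0 / (wl ** 5 * (np.exp(1.4388e7 / (wl * 2856.0)) - 1.0))
      spectrum = planck / planck.max()

      # Toy spectrally integrated quantity.
      weight = np.exp(-0.5 * ((wl - 555.0) / 50.0) ** 2)
      def integrate(s):
          return np.trapz(s * weight, wl)

      u = 0.01                                     # 1% standard uncertainty
      ref = integrate(spectrum)
      cases = {
          "no correlation  ": lambda: rng.normal(0.0, u, wl.size),
          "full correlation": lambda: np.full(wl.size, rng.normal(0.0, u)),
      }
      for label, err in cases.items():
          vals = [integrate(spectrum * (1.0 + err())) for _ in range(5000)]
          print(f"{label}: relative standard uncertainty = {np.std(vals) / ref:.4f}")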

  17. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  18. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  19. Improvement of uncertainty relations for mixed states

    International Nuclear Information System (INIS)

    Park, Yong Moon

    2005-01-01

    We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs the commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schroedinger uncertainty relation improves the Heisenberg uncertainty relation by adding the correlation in terms of the anti-commutator. However, both relations are insensitive to whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixedness of the state. For the momentum and position operators as conjugate observables, and for the thermal state of the quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold.
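
    For reference, the two relations discussed above can be written in standard notation, with ΔA the standard deviation of an observable A in the given state (a LaTeX sketch):

      % Heisenberg-Robertson relation: bound from the commutator alone
      \Delta A \,\Delta B \ge \tfrac{1}{2}\bigl|\langle [A,B] \rangle\bigr|

      % Schroedinger relation: strengthened by the anti-commutator
      % (covariance) term
      (\Delta A)^2 (\Delta B)^2 \ge \tfrac{1}{4}\bigl|\langle [A,B] \rangle\bigr|^2
        + \Bigl(\tfrac{1}{2}\langle \{A,B\} \rangle - \langle A \rangle \langle B \rangle\Bigr)^2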

  20. Assessment of Thermal Maturity Trends in Devonian–Mississippian Source Rocks Using Raman Spectroscopy: Limitations of Peak-Fitting Method

    Energy Technology Data Exchange (ETDEWEB)

    Lupoi, Jason S., E-mail: jlupoi@rjlg.com; Fritz, Luke P. [RJ Lee Group, Inc., Monroeville, PA (United States); Parris, Thomas M. [Kentucky Geological Survey, University of Kentucky, Lexington, KY (United States); Hackley, Paul C. [U.S. Geological Survey, Reston, VA (United States)]; Solotky, Logan [RJ Lee Group, Inc., Monroeville, PA (United States); Eble, Cortland F. [Kentucky Geological Survey, University of Kentucky, Lexington, KY (United States); Schlaegle, Steve [RJ Lee Group, Inc., Monroeville, PA (United States)

    2017-09-27

    The thermal maturity of shale is often measured by vitrinite reflectance (VRo). VRo measurements for the Devonian–Mississippian black shale source rocks evaluated herein predicted thermal immaturity in areas where associated reservoir rocks are oil-producing. This limitation of the VRo method led to the current evaluation of Raman spectroscopy as a suitable alternative for developing correlations between thermal maturity and Raman spectra. In this study, Raman spectra of Devonian–Mississippian black shale source rocks were regressed against measured VRo or sample-depth, and attempts were made to develop quantitative correlations of thermal maturity. Using sample-depth as a proxy for thermal maturity is not without limitations, as thermal maturity as a function of depth depends on the thermal gradient (which can vary through time), subsidence rate, uplift or lack of uplift, and faulting. Correlations between Raman data and vitrinite reflectance or sample-depth were quantified by peak-fitting the spectra. Various peak-fitting procedures were evaluated to determine the effects of the number of peaks and maximum peak widths on correlations between spectral metrics and thermal maturity. Correlations between the D-frequency, the G-band full width at half maximum (FWHM), and the band separation between the G- and D-peaks and thermal maturity provided some degree of linearity throughout most peak-fitting assessments; however, these correlations and those calculated from the G-frequency, D/G FWHM ratio, and D/G peak area ratio also revealed a strong dependence on peak-fitting processes. This dependency on spectral analysis techniques raises questions about the validity of peak-fitting, particularly given the amount of subjective analyst involvement necessary to reconstruct spectra. This research shows how user interpretation and extrapolation affected the comparability of different samples, the accuracy of generated trends, and therefore, the potential of the Raman spectral method to become an
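
    A minimal two-peak decomposition of a synthetic first-order carbon spectrum shows how the metrics named above (band separation, G-band FWHM) fall out of a fit. Published procedures fit more peaks with constrained widths, which is precisely where the analyst subjectivity criticized above enters; all band positions and amplitudes here are illustrative.

      import numpy as np
      from scipy.optimize import curve_fit

      def two_gaussians(x, a1, c1, w1, a2, c2, w2):
          return (a1 * np.exp(-0.5 * ((x - c1) / w1) ** 2)
                  + a2 * np.exp(-0.5 * ((x - c2) / w2) ** 2))

      # Synthetic first-order Raman spectrum of carbonaceous material:
      # D band near 1350 cm^-1, G band near 1600 cm^-1, plus noise.
      rng = np.random.default_rng(6)
      x = np.linspace(1000, 1800, 400)
      y = (two_gaussians(x, 1.0, 1350, 60, 0.8, 1600, 40)
           + rng.normal(0, 0.02, x.size))

      p0 = [1, 1340, 50, 1, 1590, 50]              # initial guesses
      popt, pcov = curve_fit(two_gaussians, x, y, p0=p0)
      a1, c1, w1, a2, c2, w2 = popt
      fwhm = 2 * np.sqrt(2 * np.log(2))            # Gaussian sigma -> FWHM
      print(f"band separation (G - D): {c2 - c1:.1f} cm^-1")
      print(f"G-band FWHM            : {fwhm * w2:.1f} cm^-1")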

  3. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence.

    Directory of Open Access Journals (Sweden)

    Andrea L Araujo Navas

    2016-12-01

    Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections, we identified: (1) the main uncertainty sources, their definition and quantification, and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. None of the studies adequately considered all sources of uncertainty. We highlighted the need for uncertainty in the morbidity indicator and predictor variables to be incorporated into the modelling framework. Study design and spatial support require further attention, and uncertainty associated with Earth observation data should be quantified. Finally, more attention should be given to mapping and interpreting

  4. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations, such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
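
    One standard technique covered by such guides is probabilistic sensitivity analysis: sample all uncertain parameters jointly and report the probability that a strategy is cost-effective. A self-contained toy example with hypothetical distributions and a hypothetical willingness-to-pay threshold:

      import numpy as np

      rng = np.random.default_rng(7)
      N = 10_000

      # Two-strategy decision model: sample parameters, compute incremental
      # cost and QALYs per patient (all distributions illustrative).
      eff_new = rng.beta(80, 20, N)        # response probability, new therapy
      eff_old = rng.beta(60, 40, N)        # response probability, comparator
      cost_new = rng.gamma(100, 50, N)     # therapy cost (arbitrary units)
      cost_old = rng.gamma(100, 30, N)
      qaly_gain = rng.normal(0.5, 0.1, N)  # QALYs gained per responder

      d_cost = cost_new - cost_old
      d_qaly = (eff_new - eff_old) * qaly_gain

      wtp = 20_000                         # willingness-to-pay per QALY
      nmb = wtp * d_qaly - d_cost          # incremental net monetary benefit
      print(f"P(new strategy cost-effective) = {(nmb > 0).mean():.2f}")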

  5. What limits working memory capacity? Evidence for modality-specific sources to the simultaneous storage of visual and auditory arrays.

    Science.gov (United States)

    Fougnie, Daryl; Marois, René

    2011-11-01

    There is considerable debate on whether working memory (WM) storage is mediated by distinct subsystems for auditory and visual stimuli (Baddeley, 1986) or whether it is constrained by a single, central capacity-limited system (Cowan, 2006). Recent studies have addressed this issue by measuring the dual-task cost during the concurrent storage of auditory and visual arrays (e.g., Cocchini, Logie, Della Sala, MacPherson, & Baddeley, 2002; Fougnie & Marois, 2006; Saults & Cowan, 2007). However, studies have yielded widely different dual-task costs, which have been taken to support both modality-specific and central capacity-limit accounts of WM storage. Here, we demonstrate that the controversies regarding such costs mostly stem from how these costs are measured. Measures that compare combined dual-task capacity with the higher single-task capacity support a single, central WM store when there is a large disparity between the single-task capacities (Experiment 1) but not when the single-task capacities are well equated (Experiment 2). In contrast, measures of the dual-task cost that normalize for differences in single-task capacity reveal evidence for modality-specific stores, regardless of single-task performance. Moreover, these normalized measures indicate that dual-task cost is much smaller if the tasks do not involve maintaining bound feature representations in WM (Experiment 3). Taken together, these experiments not only resolve a discrepancy in the field and clarify how to assess the dual-task cost but also indicate that WM capacity can be constrained both by modality-specific and modality-independent sources of information processing.

  6. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  7. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    Science.gov (United States)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2), using the physically based land surface model Soil Water - Atmosphere - Plants (SWAP), developed at the Institute of Water Problems of the Russian Academy of Sciences, and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First, SWAP was calibrated and validated against monthly values of measured river runoff using forcing data from the WATCH data set, and all GCM projections were bias-corrected to the WATCH data. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate their uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of the projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
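
    The relative contributions can be compared by computing the spread across GCMs for each RCP and vice versa. The numbers below are invented so that the GCM spread comes out roughly twice the RCP spread, mirroring the finding reported above; they are not the study's data.

      import numpy as np

      # Hypothetical % changes in mean annual runoff for one basin by 2099,
      # simulated for 5 GCMs (rows) x 4 RCP scenarios (columns).
      proj = np.array([[ -5.,  -8., -10., -14.],
                       [  2.,   0.,  -3.,  -6.],
                       [-12., -15., -18., -24.],
                       [  6.,   4.,   1.,  -2.],
                       [ -1.,  -4.,  -7., -11.]])

      gcm_spread = proj.std(axis=0).mean()   # spread across GCMs, per RCP
      rcp_spread = proj.std(axis=1).mean()   # spread across RCPs, per GCM
      print(f"mean spread due to GCM choice: {gcm_spread:.1f} %")
      print(f"mean spread due to RCP choice: {rcp_spread:.1f} %")
      print(f"ratio GCM/RCP: {gcm_spread / rcp_spread:.1f}")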

  8. Computing the Risk of Postprandial Hypo- and Hyperglycemia in Type 1 Diabetes Mellitus Considering Intrapatient Variability and Other Sources of Uncertainty

    Science.gov (United States)

    García-Jaramillo, Maira; Calm, Remei; Bondia, Jorge; Tarín, Cristina; Vehí, Josep

    2009-01-01

    Objective: The objective of this article was to develop a methodology to quantify the risk of suffering different grades of hypo- and hyperglycemia episodes in the postprandial state. Methods: Interval predictions of patient postprandial glucose were performed during a 5-hour period after a meal for a set of 3315 scenarios. Uncertainty in the patient's insulin sensitivities and the carbohydrate (CHO) content of the planned meal was considered. A normalized area under the curve of the worst-case predicted glucose excursion for severe and mild hypo- and hyperglycemia glucose ranges was obtained and weighted according to their importance. As a result, a comprehensive risk measure was obtained. A reference model of preprandial glucose values representing the behavior in different ranges was chosen by a χ² test. The relationship between the computed risk index and the probability of occurrence of events was analyzed for these reference models through 19,500 Monte Carlo simulations. Results: The obtained reference models for each preprandial glucose range were 100, 160, and 220 mg/dl. A relationship between the risk index ranges and the probability of occurrence of mild and severe postprandial hyper- and hypoglycemia can be derived. Conclusions: When intrapatient variability and uncertainty in the CHO content of the meal are considered, a safer prediction of possible hyper- and hypoglycemia episodes induced by the tested insulin therapy can be calculated. PMID:20144339
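
    A toy version of the weighted, normalized-AUC risk index described above; the thresholds, weights, and glucose trajectory are illustrative assumptions, not the paper's values.

      import numpy as np

      t = np.linspace(0.0, 5.0, 301)               # hours after the meal
      # Hypothetical worst-case postprandial glucose prediction (mg/dl).
      glucose = 150 + 140 * np.exp(-0.5 * ((t - 1.2) / 0.8) ** 2) - 16 * t

      def hyper_area(g, low, high):
          """Normalized AUC of the excursion inside the band [low, high]."""
          excess = np.clip(g, low, high) - low
          return np.trapz(excess, t) / ((high - low) * (t[-1] - t[0]))

      def hypo_area(g, low, high):
          depth = high - np.clip(g, low, high)
          return np.trapz(depth, t) / ((high - low) * (t[-1] - t[0]))

      risk = (1.0 * hyper_area(glucose, 250, 400)    # severe hyperglycemia
              + 0.3 * hyper_area(glucose, 180, 250)  # mild hyperglycemia
              + 0.5 * hypo_area(glucose, 54, 70)     # mild hypoglycemia
              + 1.0 * hypo_area(glucose, 0, 54))     # severe hypoglycemia
      print(f"weighted postprandial risk index: {risk:.3f}")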

  9. Stereo-particle image velocimetry uncertainty quantification

    International Nuclear Information System (INIS)

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
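
    A first-order propagation sketch for the out-of-plane component in an idealized two-camera geometry; this is a simplification of the paper's full framework, and the reconstruction formula and all values here are illustrative assumptions.

      import numpy as np

      # Idealized stereo reconstruction of the out-of-plane displacement:
      # w = (u1 - u2) / (tan(a1) - tan(a2)), where u1, u2 are displacements
      # seen by the two cameras and a1, a2 their viewing angles.
      u1, u2 = 5.2, 4.1                      # displacements (physical units)
      a1, a2 = np.radians(35.0), np.radians(-35.0)
      s_u = 0.1                              # planar displacement uncertainty
      s_a = np.radians(0.2)                  # angle (calibration) uncertainty

      den = np.tan(a1) - np.tan(a2)
      w = (u1 - u2) / den
      dw_du = 1.0 / den                      # same magnitude for u1 and u2
      dw_da1 = -(u1 - u2) / den**2 / np.cos(a1) ** 2
      dw_da2 = +(u1 - u2) / den**2 / np.cos(a2) ** 2
      s_w = np.sqrt(2 * (dw_du * s_u) ** 2
                    + (dw_da1 * s_a) ** 2 + (dw_da2 * s_a) ** 2)
      print(f"w = {w:.3f} +/- {s_w:.3f}")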

  10. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    Science.gov (United States)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources for different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km²) in Belgium. We use a surface-subsurface integrated model implemented using the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observed data. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. The results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  11. Orientation and uncertainties

    International Nuclear Information System (INIS)

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources were generally considered not very trustworthy. This was largely attributable to the interpretation of the events being tied to attitudes on the atomic energy issue. The greatest credibility was given to television broadcasting. The authors summarize their discourse as follows: there is good reason to interpret the widespread uncertainty after Chernobyl as proof that large parts of the population are prepared and willing to assume a critical stance towards information and prefer to draw their information from various sources representing different positions. (orig.) [de]

  12. UNCERTAINTIES IN GALACTIC CHEMICAL EVOLUTION MODELS

    International Nuclear Information System (INIS)

    Côté, Benoit; Ritter, Christian; Herwig, Falk; O’Shea, Brian W.; Pignatari, Marco; Jones, Samuel; Fryer, Chris L.

    2016-01-01

    We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter; these distributions were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions, along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. The uncertainty in our chemical evolution model
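
    The Monte Carlo procedure can be pictured with a toy stand-in for the one-zone model; the parameter distributions and the mock chemistry function below are invented placeholders, not the constraints compiled in the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        n_runs = 500

        # Illustrative parameter PDFs (placeholder values): IMF high-mass
        # slope and the number of SNe Ia per solar mass formed.
        imf_slope = rng.normal(-2.35, 0.2, n_runs)
        n_ia_per_msun = 10 ** rng.normal(np.log10(2e-3), 0.3, n_runs)

        def toy_fe_h(slope, n_ia, age_gyr):
            """Stand-in for a one-zone model: returns a mock [Fe/H] at a
            given age; a real run would integrate yields over the SFH."""
            return (-1.0 + 0.5 * np.log10(n_ia / 2e-3)
                    + 0.1 * (slope + 2.35) + 0.4 * np.log10(age_gyr))

        ages = np.linspace(1, 13, 25)
        tracks = np.array([[toy_fe_h(s, n, t) for t in ages]
                           for s, n in zip(imf_slope, n_ia_per_msun)])

        # Most probable solution with 68% and 95% confidence bands
        lo95, lo68, med, hi68, hi95 = np.percentile(
            tracks, [2.5, 16, 50, 84, 97.5], axis=0)
        print(f"[Fe/H] at 13 Gyr: {med[-1]:.2f} "
              f"(68%: {lo68[-1]:.2f}..{hi68[-1]:.2f})")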

  13. Some remarks on modeling uncertainties

    International Nuclear Information System (INIS)

    Ronen, Y.

    1983-01-01

    Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.) [de

  14. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source; Limite de deteccion de un giroscopio de fibra optica usando una fuente de radiacion superluminiscente

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval R, G.E. [Laboratorio de Optica Aplicada, Centro de Ciencias Aplicadas y Desarrollo Tecnologico, Universidad Nacional Autonoma de Mexico, Apartado Postal 70-186, 04510 Mexico D.F. (Mexico); Nikolaev, V.A. [Departamento de Optica y Radiofisica Cuantica, Universidad Estatal de Telecomunicaciones de San Petersburgo, M.A. Bonch-Bruyevich, Kanal Moika 61, Saint Petersburg 191186, (Russian Federation)

    2003-07-01

    The main objective of this work is to establish how the characteristics of a fiber optics gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare earth elements (superluminescent fiber source, SFS), and to justify the choice of the SFS pumping rate that yields the limiting sensitivity characteristics of the FOG. When this type of emission source is used in a FOG, it is recommended to pump in the direction that coincides with the superluminescent signal. The main results are the proposal of, and the arguments for, choosing an SFS as the emission source to be used in phase-type FOGs. This choice improves the sensitivity characteristics of the FOG in comparison with the semiconductor luminescent sources that are extensively used at present. The use of an SFS-type emission source makes it possible to approach the sensitivity limit (detection limit) determined by shot noise. (Author)

  15. 40 CFR Table 3 to Subpart Wwww of... - Organic HAP Emissions Limits for Existing Open Molding Sources, New Open Molding Sources Emitting...

    Science.gov (United States)

    2010-07-01

    [Table fragment: the extracted text preserves only scattered cells of Table 3 to Subpart WWWW of 40 CFR Part 63 (National Emission Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production), which lists organic HAP emission limits per ton by operation type (e.g., open molding, pultrusion) and material used.]

  16. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding
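
    How a distribution of setup errors maps onto a relative uncertainty in a dose-distribution parameter can be illustrated with a toy one-dimensional calculation; the dose falloff, wall window and error magnitude below are invented and are not the trial data:

        import numpy as np

        rng = np.random.default_rng(1)

        # Fixed rectal-wall window (mm) and a hypothetical planned dose with
        # a steep lateral falloff, so part of the wall sits on the gradient.
        wall = np.linspace(-20.0, 40.0, 601)

        def dose(x, shift=0.0):
            # 64 Gy plateau falling off around x = 10 mm; shape is invented
            return 64.0 / (1.0 + np.exp((x - 10.0 - shift) / 4.0))

        def frac_above(shift, level=57.0):
            return np.mean(dose(wall, shift) >= level)

        planned = frac_above(0.0)
        shifts = rng.normal(0.0, 4.0, 5000)   # setup error SD, illustrative
        sampled = np.array([frac_above(s) for s in shifts])
        print(f"planned fraction of wall >= 57 Gy: {planned:.3f}")
        print(f"relative uncertainty from setup error: "
              f"{sampled.std() / planned:.1%}")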

  17. Role of uncertainty in the basalt waste isolation project

    International Nuclear Information System (INIS)

    Knepp, A.J.; Dahlem, D.H.

    1989-01-01

    The current national Civilian Radioactive Waste Management (CRWM) Program to select a mined geologic repository will likely require the extensive use of probabilistic techniques to quantify uncertainty in predictions of repository isolation performance. The performance of nonhomogeneous geologic, hydrologic, and chemical systems must be predicted over time frames of thousands of years and will therefore likely contain significant uncertainty. A qualitative assessment of our limited ability to interrogate the site in a nondestructive manner, coupled with the early stage of development of the pertinent geosciences, supports this statement. The success of the approach to incorporate what currently appears to be an appreciable element of uncertainty into the predictions of repository performance will play an important role in acquiring a license to operate and in establishing the level of safety associated with the concept of long-term geologic storage of nuclear waste. This paper presents a brief background on the Hanford Site and the repository program, references the sources that establish the legislative requirement to quantify uncertainties in performance predictions, and summarizes the present and future program at the Hanford Site in this area. The decision to quantify significant sources of uncertainty has had a major impact on the direction of the site characterization program at Hanford. The paper concludes with a number of observations on the impacts of this decision

  18. Risk uncertainty analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  19. Dealing with uncertainties in the safety of geological disposal of radioactive waste

    International Nuclear Information System (INIS)

    Devillers, Ch.

    2002-01-01

    Confidence in the safety assessment of a possible radioactive waste geological repository project will only be obtained if the development of the project is closely guided by transparent safety strategies that acknowledge uncertainties and strive to limit their effects. This paper highlights some sources of uncertainty, external or internal to the project, which are of particular importance for safety. It suggests safety strategies adapted to the uncertainties considered. The case of a possible repository project in the Callovo-Oxfordian clay layer of the French Bure site is examined from that point of view. The German project at Gorleben and the Swedish KBS-3 project are also briefly examined. (author)

  20. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
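
    The report's caution about output characteristics is easy to reproduce with a toy lognormal dose sample spanning orders of magnitude (the distribution parameters are invented); the arithmetic mean, geometric mean, median and 90th percentile then differ by large factors:

        import numpy as np

        rng = np.random.default_rng(42)
        # Toy Monte Carlo output: annual doses spanning orders of magnitude
        doses = rng.lognormal(mean=np.log(1e-7), sigma=2.0, size=100_000)  # Sv/a

        arith_mean = doses.mean()
        geo_mean = np.exp(np.log(doses).mean())
        median = np.median(doses)
        p90 = np.percentile(doses, 90)

        print(f"arithmetic mean: {arith_mean:.2e} Sv/a")  # pulled up by the tail
        print(f"geometric mean : {geo_mean:.2e} Sv/a")
        print(f"median         : {median:.2e} Sv/a")
        print(f"90th percentile: {p90:.2e} Sv/a")         # more robust comparator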

  1. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  2. Limitations On The Creation of Continuously Surfable Waves Generated By A Pressure Source Moving In A Circular Path

    NARCIS (Netherlands)

    Schmied, S.A.

    2014-01-01

    The aim of the research presented in this work was to investigate the novel idea to produce continuous breaking waves, whereby a pressure source was rotated within an annular wave pool. The concept was that the pressure source generates non-breaking waves that propagate inward to the inner ring of

  3. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  4. Entry and exit decisions under uncertainty

    DEFF Research Database (Denmark)

    Kongsted, Hans Christian

    1996-01-01

    This paper establishes the general deterministic limit that corresponds to Dixit's model of entry and exit decisions under uncertainty. The interlinked nature of decisions is shown to be essential also in the deterministic limit. A numerical example illustrates the result.

  5. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. A failure to reproduce the time

  6. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany); Peterson, R. [AECL, Chalk River, ON (Canada)] [and others

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. A failure to reproduce the

  7. Uncertainty assessment of source attribution of PM(2.5) and its water-soluble organic carbon content using different biomass burning tracers in positive matrix factorization analysis--a case study in Beijing, China.

    Science.gov (United States)

    Tao, Jun; Zhang, Leiming; Zhang, Renjian; Wu, Yunfei; Zhang, Zhisheng; Zhang, Xiaoling; Tang, Yixi; Cao, Junji; Zhang, Yuanhang

    2016-02-01

    Daily PM2.5 samples were collected at an urban site in Beijing during four one-month periods in 2009-2010, with each period in a different season. Samples were subjected to chemical analysis for various chemical components, including major water-soluble ions, organic carbon (OC) and water-soluble organic carbon (WSOC), elemental carbon (EC), trace elements, and the anhydrosugars levoglucosan (LG) and mannosan (MN). Three sets of source profiles of PM2.5 were first identified through positive matrix factorization (PMF) analysis using single or combined biomass tracers - non-sea salt potassium (nss-K(+)), LG, and a combination of nss-K(+) and LG. The six major source factors of PM2.5 included secondary inorganic aerosol, industrial pollution, soil dust, biomass burning, traffic emission, and coal burning, which were estimated to contribute 31±37%, 39±28%, 14±14%, 7±7%, 5±6%, and 4±8%, respectively, to PM2.5 mass if using the nss-K(+) source profiles, 22±19%, 29±17%, 20±20%, 13±13%, 12±10%, and 4±6%, respectively, if using the LG source profiles, and 21±17%, 31±18%, 19±19%, 11±12%, 14±11%, and 4±6%, respectively, if using the combined nss-K(+) and LG source profiles. The uncertainties in the estimation of biomass burning contributions to WSOC due to the different choices of biomass burning tracers were around 3% annually and up to 24% seasonally in terms of absolute percentage contributions, or within a factor of 1.7 annually and up to a factor of 3.3 seasonally in terms of the actual concentrations. The uncertainty from the major source (e.g. industrial pollution) was within a factor of 1.9 annually and up to a factor of 2.5 seasonally in the estimated WSOC concentrations. Copyright © 2015 Elsevier B.V. All rights reserved.
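
    For readers unfamiliar with the receptor-modelling step, the sketch below uses non-negative matrix factorization as a simplified stand-in for PMF (true PMF additionally weights the least-squares fit by per-sample uncertainties); the data matrix is random, the factor count follows the abstract, and everything else is invented:

        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(0)

        # Invented samples-by-species concentration matrix (rows: daily PM2.5
        # samples; columns: ions, OC/EC, trace elements, nss-K+, levoglucosan)
        X = rng.gamma(shape=2.0, scale=1.0, size=(120, 15))

        # Six factors, as in the abstract (secondary inorganic, industry,
        # dust, biomass burning, traffic, coal)
        model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
        W = model.fit_transform(X)   # factor contributions per sample
        H = model.components_        # factor profiles per species

        # Average percentage contribution of each factor to reconstructed mass
        mass = (W @ H).sum(axis=1)
        contrib = (W * H.sum(axis=1)) / mass[:, None]
        print((100 * contrib.mean(axis=0)).round(1))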

  8. Incorporating Forecast Uncertainty in Utility Control Center

    Energy Technology Data Exchange (ETDEWEB)

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for the transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL)

  9. Uncertainty and Decision Making: Examples of Some Possible New Frontiers

    Science.gov (United States)

    Silliman, S. E.; Rodak, C. M.; Bolster, D.; Saavedra, K.; Evans, W.

    2011-12-01

    The concept of decision making under uncertainty for groundwater systems represents an exciting area of research and application. In this presentation, three examples are briefly introduced which represent possible new applications of risk and decision making under uncertainty. In the most classic of the three examples, a probabilistic strategy is considered within the context of management / assessment of proposed changes in land-use in the vicinity of a public water-supply well. Focused on health-risk related to contamination at the well, the analysis includes uncertainties in source location / strength, groundwater flow / transport, human exposure, and human health risk. The second example involves application of Probabilistic Risk Assessment (PRA) to the evaluation of development projects in rural regions of developing countries. PRA combined with Fault Tree Analysis provides a structure for analysis of the impact of data uncertainties on the estimation of health risk resulting from failure of multiple components of new water-resource systems. The third is an extension of the concept of "risk compensation" to the analysis of potential long-term risk associated with new water resource projects. Of direct interest here is the appearance of new risk to the public, such as introduction of new disease pathways or new sources of contamination of the source waters. As a result of limitations on conceptual model and/or limitations on data, this type of risk is often difficult to identify / assess, and is therefore not commonly included in formal decision-making efforts: it may however seriously impact the long-term net benefit of a water resource project. The goal of presenting these three examples is to illustrate the breadth of possible application of uncertainty / risk analyses beyond the more classic applications to groundwater remediation and protection.

  10. The 'mini' and 'micro' energy sources and their limitations; Les ''minis'' et ''micros'' sources d'energie et leurs limites

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2004-03-01

    The 'Science and Defense 2003' colloquium took place in Paris on December 2 and 3. It was organized by the general delegation for armament (DGA) and the French atomic energy commission (CEA), with the participation of the national center for scientific research (CNRS) and the French ministry of research and new technologies. The main topic was the needs and the solutions implemented in the domain of energy sources for both civil and military portable applications. This article gives a general summary of the content of this colloquium, restricted to the domain of low-power energy sources: emergence of new miniaturized portable devices (phones, computers, cameras, autonomous medical systems, wireless communication systems, soldiers' equipment, drones, etc.), improvement of battery energy density and price, new needs linked with new functionalities, technological challenges (energy generation and recovery systems, miniaturized fuel cells, use of recoverable energy sources, implantation of power sources in the human body, etc.), new technological pathways (new electrode materials, micron- or sub-micron scale mechanical and electronic components, sensors, micro-motors, transfer of microelectronics technologies, hybrid systems...), and international competition. (J.S.)

  11. The rise in the positron fraction. Distance limits on positron point sources from cosmic ray arrival directions and diffuse gamma-rays

    Energy Technology Data Exchange (ETDEWEB)

    Gebauer, Iris; Bentele, Rosemarie [Karlsruhe Institute of Technology, Karlsruhe (Germany)

    2016-07-01

    The rise in the positron fraction, as observed by AMS and previously by PAMELA, cannot be explained by the standard paradigm of cosmic ray transport, in which positrons are produced by cosmic-ray-gas interactions in the interstellar medium. Possible explanations are pulsars, which produce energetic electron-positron pairs in their rotating magnetic fields, or the annihilation of dark matter. Here we assume that these positrons originate from a single nearby point source producing equal amounts of electrons and positrons. The propagation and energy losses of these electrons and positrons are calculated numerically using the DRAGON code, and the source properties are optimized to best describe the AMS data. Using the Fermi-LAT limits on a possible dipole anisotropy in electron and positron arrival directions, we put a limit on the minimum distance of such a point source. The energy losses that these energetic electrons and positrons suffer on their way through the galaxy create gamma-ray photons through bremsstrahlung and inverse Compton scattering. Using the measurement of diffuse gamma rays from Fermi-LAT, we put a limit on the maximum distance of such a point source. We find that a single electron-positron point source powerful enough to explain the locally observed positron fraction must reside between 225 pc and 3.7 kpc from the Sun, and we compare this range to known pulsars.

  12. 40 CFR Table 9 to Subpart Xxxx of... - Minimum Data for Continuous Compliance With the Emission Limits for Tire Production Affected Sources

    Science.gov (United States)

    2010-07-01

    [Table fragment: the extracted text preserves only the header of Table 9 to Subpart XXXX of 40 CFR Part 63 (National Emission Standards for Hazardous Air Pollutants: Rubber Tire Manufacturing), which specifies the minimum data required for continuous compliance with the emission limits for tire production affected sources.]

  13. Determination of the Meteor Limiting Magnitude

    Science.gov (United States)

    Kingery, A.; Blaauw, R.; Cooke, W. J.

    2016-01-01

    The limiting meteor magnitude of a meteor camera system depends on the camera hardware and software, sky conditions, and the location of the meteor radiant. Some of these factors are constants for a given meteor camera system, but many change between meteor showers or sporadic sources and on both long and short timescales. Since the limiting meteor magnitude is ultimately used to calculate the limiting meteor mass for a given data set, it is important to understand these factors and to monitor how they change throughout the night, as a 0.5 magnitude uncertainty in limiting magnitude translates to an uncertainty in limiting mass of a factor of two.
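
    The quoted factor of two can be checked with one line of arithmetic, assuming the photometric mass scales with the limiting magnitude as log10 m = C - b*M_lim; the slope b ≈ 0.6 is simply the value implied by the abstract's statement and in practice depends on the assumed luminous-efficiency model:

        \log_{10} m = C - b\,M_{\lim}, \qquad
        \frac{m(M_{\lim})}{m(M_{\lim}+0.5)} = 10^{\,0.5\,b} \approx 10^{0.3} \approx 2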

  14. On the uncertainty principle. V

    International Nuclear Information System (INIS)

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author successively analyzes measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, limits which depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  15. Seasonal changes in nutrient limitation and nitrate sources in the green macroalga Ulva lactuca at sites with and without green tides in a northeastern Pacific embayment.

    Science.gov (United States)

    Van Alstyne, Kathryn L

    2016-02-15

    In Penn Cove, ulvoid green algal mats occur annually. To examine seasonal variation in their causes, nitrogen and carbon were measured in Ulva lactuca in May, July, and September and stable nitrogen and oxygen isotope ratios were quantified in U. lactuca, Penn Cove seawater, upwelled water from Saratoga Passage, water near the Skagit River outflow, and effluents from wastewater treatment facilities. Ulvoid growth was nitrogen limited and the sources of nitrogen used by the algae changed during the growing season. Algal nitrogen concentrations were 0.85-4.55% and were highest in September and at sites where algae were abundant. Upwelled waters were the primary nitrogen source for the algae, but anthropogenic sources also contributed to algal growth towards the end of the growing season. This study suggests that small nitrogen inputs can result in crossing a "tipping point", causing the release of nutrient limitation and localized increases in algal growth. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has taken place. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. This means that

  17. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions), to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has taken place. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that the Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. This means that the

  18. 40 CFR 414.91 - Toxic pollutant effluent limitations and standards for direct discharge point sources that use...

    Science.gov (United States)

    2010-07-01

    [Table fragment: the extracted text preserves only scattered cells of the BAT and NSPS toxic pollutant effluent limitations table of 40 CFR 414.91, listing paired numeric limits for organic pollutants such as 2,4-dinitrophenol, 2,4- and 2,6-dinitrotoluene, ethylbenzene and fluoranthene.]

  19. 40 CFR 414.101 - Toxic pollutant effluent limitations and standards for direct discharge point sources that do not...

    Science.gov (United States)

    2010-07-01

    [Table fragment: the extracted text preserves only scattered cells of the BAT and NSPS toxic pollutant effluent limitations table of 40 CFR 414.101, listing paired numeric limits for organic pollutants such as a phthalate, 4,6-dinitro-o-cresol, 2,4-dinitrophenol, ethylbenzene and fluoranthene.]

  20. Source evaluation report phase 2 investigation: Limited field investigation. Final report: United States Air Force Environmental Restoration Program, Eielson Air Force Base, Alaska

    Energy Technology Data Exchange (ETDEWEB)

    1994-10-01

    This report describes the limited field investigation work done to address issues and answer unresolved questions regarding a collection of potential contaminant sources at Eielson Air Force Base (AFB), near Fairbanks, Alaska. These sources were listed in the Eielson AFB Federal Facility Agreement supporting the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) cleanup of the base. The limited field investigation began in 1993 to resolve all remaining technical issues and provide the data and analysis required to evaluate the environmental hazard associated with these sites. The objective of the limited field investigation was to allow the remedial project managers to sort each site into one of three categories: requiring remedial investigation/feasibility study, requiring interim removal action, or requiring no further remedial action.

  1. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  2. Needs of the CSAU uncertainty method

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address those needs. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  3. Investigation of uncertainty components in Coulomb blockade thermometry

    International Nuclear Information System (INIS)

    Hahtela, O. M.; Heinonen, M.; Manninen, A.; Meschke, M.; Savin, A.; Pekola, J. P.; Gunnarsson, D.; Prunnila, M.; Penttilä, J. S.; Roschier, L.

    2013-01-01

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single electron charging effects in normal metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1 % in the optimum temperature range. Identifying the influence of different uncertainty sources is a good starting point for improving the measurement accuracy to a level that would allow CBT to be more widely used in high-precision low temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.
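
    For context, the primary-thermometry relation underlying CBT ties the full width at half minimum of the conductance dip of an array of N tunnel junctions to temperature through fundamental constants only; this is the standard weak-charging-limit result from the general CBT literature, not a formula quoted from this paper:

        V_{1/2} \approx 5.439\,\frac{N k_{B} T}{e}
        \quad\Longrightarrow\quad
        T \approx \frac{e\,V_{1/2}}{5.439\,N k_{B}}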

  4. Investigation of uncertainty components in Coulomb blockade thermometry

    Energy Technology Data Exchange (ETDEWEB)

    Hahtela, O. M.; Heinonen, M.; Manninen, A. [MIKES Centre for Metrology and Accreditation, Tekniikantie 1, 02150 Espoo (Finland); Meschke, M.; Savin, A.; Pekola, J. P. [Low Temperature Laboratory, Aalto University, Tietotie 3, 02150 Espoo (Finland); Gunnarsson, D.; Prunnila, M. [VTT Technical Research Centre of Finland, Tietotie 3, 02150 Espoo (Finland); Penttilä, J. S.; Roschier, L. [Aivon Oy, Tietotie 3, 02150 Espoo (Finland)

    2013-09-11

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single electron charging effects in normal metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1 % in the optimum temperature range. Identifying the influence of different uncertainty sources is a good starting point for improving the measurement accuracy to a level that would allow CBT to be more widely used in high-precision low temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.

  5. Denitrification on internal carbon sources in RAS is limited by fibers in fecal waste of rainbow trout

    NARCIS (Netherlands)

    Meriac, A.; Eding, E.H.; Kamstra, A.; Busscher, J.P.; Schrama, J.W.; Verreth, J.A.J.

    2014-01-01

    Denitrification on internal carbon sources offers the advantage to control nitrate levels in recirculating aquaculture systems (RAS) by using the fecal carbon produced within the husbandry system. However, it is not clear to which extent fecal carbon can be utilized by the microbial community within

  6. Absorption cooling sources atmospheric emissions decrease by implementation of simple algorithm for limiting temperature of cooling water

    Science.gov (United States)

    Wojdyga, Krzysztof; Malicki, Marcin

    2017-11-01

    The constant striving to improve energy efficiency drives activities aimed at reducing energy consumption and hence the amount of pollutant emissions to the atmosphere. Cooling demand, both for air-conditioning and process cooling, plays an increasingly important role in the balance of the Polish electricity generation and distribution system in summer. In recent years, demand for electricity during the summer months has been steadily and significantly increasing, leading to deficits of energy availability during particularly hot periods. This causes growing importance of, and interest in, trigeneration power sources and heat recovery systems producing chilled water. The key component of such a system is a thermally driven chiller, mostly absorption, based on a lithium bromide and water mixture. Absorption cooling systems also exist in Poland as stand-alone systems, supplied with heat from various sources, either generated solely for them or recovered as waste or otherwise unused energy. The publication presents a simple algorithm designed to reduce the amount of driving heat supplied to absorption chillers producing chilled water for air conditioning by lowering the temperature of the cooling water, and evaluates its impact on decreasing emissions of harmful substances into the atmosphere. The scale of the environmental benefits has been rated for specific sources, which enabled an estimate of the effect of implementing the simple algorithm at sources existing nationally.
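
    A minimal control-loop sketch of the kind of algorithm the abstract describes: push the cooling water setpoint down towards what the ambient wet-bulb temperature allows, bounded below to protect the LiBr solution against crystallization, so that less driving heat is needed per unit of cooling; all setpoints, margins and the part-load model are invented for illustration:

        def cooling_water_setpoint(wet_bulb_c, approach_c=3.0,
                                   t_min_c=24.0, t_max_c=32.0):
            """Lowest cooling water temperature the cooling tower can deliver,
            clamped to the chiller's safe operating window (invented limits).
            The lower bound guards against LiBr crystallization."""
            achievable = wet_bulb_c + approach_c
            return min(max(achievable, t_min_c), t_max_c)

        def driving_heat_kw(load_kw, t_cw_c, cop_nominal=0.7, k=0.015):
            """Toy part-load model: the chiller's thermal COP improves as the
            cooling water temperature drops below the 32 C design point."""
            cop = cop_nominal * (1.0 + k * (32.0 - t_cw_c))
            return load_kw / cop

        load = 1000.0                      # kW of chilled-water demand
        for wb in (16.0, 20.0, 24.0):      # summer wet-bulb temperatures, C
            t_cw = cooling_water_setpoint(wb)
            q = driving_heat_kw(load, t_cw)
            print(f"wet bulb {wb:4.1f} C -> cooling water {t_cw:4.1f} C, "
                  f"driving heat {q:6.1f} kW")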

  7. Maximum entropy prior uncertainty and correlation of statistical economic data

    NARCIS (Netherlands)

    Dias, Rodriques J.F.

    2016-01-01

    Empirical estimates of source statistical economic data such as trade flows, greenhouse gas emissions or employment figures are always subject to uncertainty (stemming from measurement errors or confidentiality) but information concerning that uncertainty is often missing. This paper uses concepts

  8. WASH-1400: quantifying the uncertainties

    International Nuclear Information System (INIS)

    Erdmann, R.C.; Leverenz, F.L. Jr.; Lellouche, G.S.

    1981-01-01

    The purpose of this paper is to focus on the limitations of the WASH-1400 analysis in estimating the risk from light water reactors (LWRs). This assessment attempts to modify the quantification of the uncertainty in, and estimate of, risk as presented by the RSS (Reactor Safety Study). 8 refs

  9. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...

  10. Use of uncertainty data in neutron dosimetry

    International Nuclear Information System (INIS)

    Greenwood, L.R.

    1980-01-01

    Uncertainty and covariance data are required for neutron activation cross sections and nuclear decay data used to adjust neutron flux spectra measured at accelerators and reactors. Covariances must be evaluated in order to assess errors in derived damage parameters, such as nuclear displacements. The primary sources of error are discussed along with needed improvements in presently available uncertainty data
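
    The covariance requirement can be made concrete with the standard sandwich rule: if a derived damage parameter is (locally) a linear combination p = gᵀx of adjusted quantities with covariance matrix V, then var(p) = gᵀVg. A small numeric sketch with invented values:

        import numpy as np

        # Sensitivity of a derived damage parameter (e.g., displacements) to
        # three adjusted quantities, and their covariance matrix (invented).
        g = np.array([0.5, 1.2, 0.8])
        V = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.16]])

        var_p = g @ V @ g   # sandwich rule: var(p) = g^T V g
        print(f"sigma_p = {np.sqrt(var_p):.3f}")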

  11. A Statistical Framework for Microbial Source Attribution: Measuring Uncertainty in Host Transmission Events Inferred from Genetic Data (Part 2 of a 2 Part Report)

    Energy Technology Data Exchange (ETDEWEB)

    Allen, J; Velsko, S

    2009-11-16

    This report explores the question of whether meaningful conclusions can be drawn regarding the transmission relationship between two microbial samples on the basis of differences observed between the two samples' respective genomes. Unlike similar forensic applications using human DNA, the rapid rate of microbial genome evolution, combined with the dynamics of infectious disease, requires a shift in thinking about what it means for two samples to 'match' in support of a forensic hypothesis. Previous outbreaks of SARS-CoV, FMDV and HIV were examined to investigate the question of how microbial sequence data can be used to draw inferences that link two infected individuals by direct transmission. The results are counterintuitive with respect to human DNA forensic applications in that some genetic change, rather than exact matching, improves confidence in inferring direct transmission links; however, too much genetic change poses challenges, which can weaken confidence in inferred links. High rates of infection coupled with relatively weak selective pressure observed in the SARS-CoV and FMDV data lead to fairly low confidence for direct transmission links. Confidence values for forensic hypotheses increased when testing for the possibility that samples are separated by at most a few intermediate hosts. Moreover, the observed outbreak conditions support the potential to provide high confidence values for hypotheses that exclude direct transmission links. Transmission inferences are based on the total number of observed or inferred genetic changes separating two sequences rather than uniquely weighing the importance of any one genetic mismatch. Thus, inferences are surprisingly robust in the presence of sequencing errors, provided the error rates are randomly distributed across all samples in the reference outbreak database and the novel sequence samples in question. When the number of observed nucleotide mutations is limited due to characteristics of the

  12. Incorporating uncertainty analysis into life cycle estimates of greenhouse gas emissions from biomass production

    International Nuclear Information System (INIS)

    Johnson, David R.; Willis, Henry H.; Curtright, Aimee E.; Samaras, Constantine; Skone, Timothy

    2011-01-01

    Before further investments are made in utilizing biomass as a source of renewable energy, both policy makers and the energy industry need estimates of the net greenhouse gas (GHG) reductions expected from substituting biobased fuels for fossil fuels. Such GHG reductions depend greatly on how the biomass is cultivated, transported, processed, and converted into fuel or electricity. Any policy aiming to reduce GHGs with biomass-based energy must account for uncertainties in emissions at each stage of production, or else it risks yielding marginal reductions, if any, while potentially imposing great costs. This paper provides a framework for incorporating uncertainty analysis specifically into estimates of the life cycle GHG emissions from the production of biomass. We outline the sources of uncertainty, discuss the implications of uncertainty and variability on the limits of life cycle assessment (LCA) models, and provide a guide for practitioners to best practices in modeling these uncertainties. The suite of techniques described herein can be used to improve the understanding and the representation of the uncertainties associated with emissions estimates, thus enabling improved decision making with respect to the use of biomass for energy and fuel production. -- Highlights: → We describe key model, scenario and data uncertainties in LCAs of biobased fuels. → System boundaries and allocation choices should be consistent with study goals. → Scenarios should be designed around policy levers that can be controlled. → We describe a new way to analyze the importance of covariance between inputs.

  13. Limit to mass sensitivity of nanoresonators with random rough surfaces due to intrinsic sources and interactions with the surrounding gas

    NARCIS (Netherlands)

    Palasantzas, G.

    2008-01-01

    We investigate initially the influence of thermomechanical and momentum exchange noise on the limit to mass sensitivity Delta m of nanoresonators with random rough surfaces, which are characterized by the roughness amplitude w, the correlation length xi, and the roughness exponent 0 < H < 1

  14. What Limits Working Memory Capacity? Evidence for Modality-Specific Sources to the Simultaneous Storage of Visual and Auditory Arrays

    Science.gov (United States)

    Fougnie, Daryl; Marois, Rene

    2011-01-01

    There is considerable debate on whether working memory (WM) storage is mediated by distinct subsystems for auditory and visual stimuli (Baddeley, 1986) or whether it is constrained by a single, central capacity-limited system (Cowan, 2006). Recent studies have addressed this issue by measuring the dual-task cost during the concurrent storage of…

  15. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  16. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (=model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks. © 2015 Her Majesty the Queen in Right of Canada Global Change Biology © 2015 Published by John

  17. Decommissioning Funding: Ethics, Implementation, Uncertainties

    International Nuclear Information System (INIS)

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  18. The role of scientific uncertainty in compliance with the Kyoto Protocol to the Climate Change Convention

    International Nuclear Information System (INIS)

    Gupta, Joyeeta; Olsthoorn, Xander; Rotenberg, Edan

    2003-01-01

    Under the climate change treaties, developed countries are under a quantitative obligation to limit their emissions of greenhouse gases (GHG). This paper argues that although the climate change regime is setting up various measures and mechanisms, there will still be significant uncertainty about the actual emission reductions, and the effectiveness of the regime will depend largely on how countries actually implement their obligations in practice. These uncertainties arise from the calculation of emissions from each source, from the tallying up of these emissions, from adding or deducting changes due to land use change and forestry (LUCF), and finally from subtracting or adding emission reduction units (ERUs). Further, it points to the problem of uncertainty in the reductions as opposed to the uncertainty in the inventories themselves. The protocols have temporarily opted to deal with these problems through harmonisation of reporting methodologies and to seek transparency by calling on the parties involved to use specific guidelines and to report on their uncertainty. This paper concludes that this harmonisation of reporting methodologies does not account for regional differences and that, while transparency will indicate when countries are adopting strategies that have high uncertainty, it will not help to increase the effectiveness of the protocol. Uncertainty about compliance then becomes a critical issue. This paper proposes to reduce this uncertainty in compliance by setting a minimum requirement for the probability of compliance

  19. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    Science.gov (United States)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert who generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
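
    The nonintrusive, black-box character of such an approach can be illustrated by plain Monte Carlo sampling around a solver stub. In the hypothetical sketch below, flow_solver, the parameter ranges and the quantity of interest are all invented stand-ins, not the paper's actual models.

```python
# Minimal sketch of nonintrusive uncertainty propagation: the flow solver is
# treated as a black box and simply re-run over sampled inputs.
import numpy as np

rng = np.random.default_rng(0)

def flow_solver(mach, wall_temp):
    # Stand-in for the real CFD code; returns a scalar quantity of interest.
    return 42.0 + 3.0 * mach - 0.01 * wall_temp

machs = rng.normal(2.5, 0.05, 200)       # aleatoric: random inflow variation
walls = rng.uniform(280.0, 320.0, 200)   # epistemic: interval, lack of knowledge
qoi = np.array([flow_solver(m, w) for m, w in zip(machs, walls)])
print(f"QoI mean = {qoi.mean():.2f}, std = {qoi.std(ddof=1):.2f}")
```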

  20. Formation of universal and diffusion regions of non-linear spectra of relativistic electrons in spatially limited sources

    International Nuclear Information System (INIS)

    Kontorovich, V.M.; Kochanov, A.E.

    1980-01-01

    It is demonstrated that in the case of hard injection of relativistic electrons accompanied by the joint action of synchrotron (Compton) losses and energy-dependent spatial diffusion, a spectrum with 'breaks' is formed containing universal (with index γ = 2) and diffusion regions, both independent of the injection spectrum. The effect of the non-linearity of the electron spectrum on the averaged electromagnetic spectra is considered for various geometries of sources (sphere, disk, arm). It is shown that a universal region (with index α = 0.5) can occur in the radiation spectrum. (orig.)
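
    For orientation, the quoted indices are consistent with the textbook synchrotron relation between the electron spectral index and the radiation spectral index (a standard result, not taken from the paper itself):

```latex
\[
  N(E) \propto E^{-\gamma}
  \quad\Longrightarrow\quad
  F_\nu \propto \nu^{-\alpha},
  \qquad
  \alpha = \frac{\gamma - 1}{2},
\]
% so a universal electron index gamma = 2 yields the radiation index alpha = 0.5.
```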

  1. Model uncertainties in top-quark physics

    CERN Document Server

    Seidel, Markus

    2014-01-01

    The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.

  2. Uncertainty Communication. Issues and good practice

    International Nuclear Information System (INIS)

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights into uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand alone' document, several questions that are mentioned in the detailed Guidance have been repeated here. This document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  3. 40 CFR Table 1 to Subpart III of... - HAP ABA Formulation Limitations Matrix for New Sources [see § 63.1297(d)(2)

    Science.gov (United States)

    2010-07-01

    40 CFR Part 63 (Protection of Environment), Subpart III (Flexible Polyurethane Foam Production), Table 1: HAP ABA Formulation Limitations Matrix for New Sources [see § 63.1297(d)(2)]. Revised as of 2010-07-01.

  4. Seismic velocity uncertainties and their effect on geothermal predictions: A case study

    Science.gov (United States)

    Rabbel, Wolfgang; Köhn, Daniel; Bahadur Motra, Hem; Niederau, Jan; Thorwart, Martin; Wuttke, Frank; Descramble Working Group

    2017-04-01

    Geothermal exploration relies in large part on geophysical subsurface models derived from seismic reflection profiling. These models are the framework of hydro-geothermal modeling, which further requires estimating thermal and hydraulic parameters to be attributed to the seismic strata. All petrophysical and structural properties involved in this process can be determined only with limited accuracy and thus impose uncertainties on the resulting model predictions of temperature-depth profiles and hydraulic flow. In the present study we analyze sources and effects of uncertainties of the seismic velocity field, which translate directly into depth uncertainties of the hydraulically and thermally relevant horizons. Geological sources of these uncertainties are subsurface heterogeneity and seismic anisotropy; methodological sources are limitations in spread length and physical resolution. We demonstrate these effects using data of the EU Horizon 2020 project DESCRAMBLE, investigating a shallow super-critical geothermal reservoir in the Larderello area. The study is based on 2D and 3D seismic reflection data and laboratory measurements on representative rock samples under simulated in-situ conditions. The rock samples consistently show P-wave anisotropy values on the order of 10-20%. However, the uncertainty of layer depths induced by anisotropy is likely to be lower, depending on the accuracy with which the spatial orientation of bedding planes can be determined from the seismic reflection images.

  5. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that in the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons made of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations.

  6. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a method is presented that accounts for the uncertainty in each measured/observed datapoint, an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...

  7. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    The statistical uncertainty, due to lack of information, can e.g. be taken into account by describing the variables by predictive density functions (Veneziano [2]). In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space, then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  8. Ethnic disparities among food sources of energy and nutrients of public health concern and nutrients to limit in adults in the United States: NHANES 2003-2006.

    Science.gov (United States)

    O'Neil, Carol E; Nicklas, Theresa A; Keast, Debra R; Fulgoni, Victor L

    2014-01-01

    Identification of current food sources of energy and nutrients among US non-Hispanic whites (NHW), non-Hispanic blacks (NHB), and Mexican American (MA) adults is needed to help with public health efforts in implementing culturally sensitive and feasible dietary recommendations. The objective of this study was to determine the food sources of energy and nutrients to limit [saturated fatty acids (SFA), added sugars, and sodium] and nutrients of public health concern (dietary fiber, vitamin D, calcium, and potassium) by NHW, NHB, and MA adults. This was a cross-sectional analysis of a nationally representative sample of NHW (n=4,811), NHB (n=2,062), and MA (n=1,950) adults aged 19+ years. The 2003-2006 NHANES 24-h recall (Day 1) dietary intake data were analyzed. An updated USDA Dietary Source Nutrient Database was developed using current food composition databases. Food grouping included ingredients from disaggregated mixtures. Mean energy and nutrient intakes from food sources were sample-weighted. Percentages of total dietary intake contributed from food sources were ranked. Multiple differences in intake among ethnic groups were seen for energy and all nutrients examined. For example, energy intake was higher in MA as compared to NHB; SFA, added sugars, and sodium intakes were higher in NHW than NHB; dietary fiber was highest in MA and lowest in NHB; vitamin D was highest in NHW; calcium was lowest in NHB; and potassium was higher in NHW as compared to NHB. Food sources of these nutrients also varied. Identification of intake of nutrients to limit and of public health concern can help health professionals implement appropriate dietary recommendations and plan interventions that are ethnically appropriate.

  9. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of existing and most used uncertainty methods is presented, and the main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling Applicability Uncertainty Method (CSAU) by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters, and consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and the follow-up method 'Code with the Capability of Internal Assessment of Uncertainty' (CIAU) developed by the University of Pisa. Unlike the statistical approaches, the CIAU does compare experimental data with calculation results. It does not consider uncertain input parameters. Therefore, the CIAU is highly dependent on the experimental database. The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions.
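
    The sample-size logic behind Wilks' formula for one-sided tolerance limits is easy to reproduce. A minimal sketch of this standard result (the 95%/95% case yields the well-known 59 runs):

```python
# Smallest number of code runs N such that the maximum of N sampled outputs
# bounds the p-quantile with confidence beta, i.e. 1 - p**N >= beta.
def wilks_n(p=0.95, beta=0.95):
    n = 1
    while 1.0 - p**n < beta:
        n += 1
    return n

print(wilks_n())            # -> 59 runs for a one-sided 95%/95% statement
print(wilks_n(0.95, 0.99))  # -> 90 runs for 95%/99%
```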

  10. Energy Limits in Second Generation High-pitch Dual Source CT - Comparison in an Upper Abdominal Phantom

    Directory of Open Access Journals (Sweden)

    Martin Beeres

    2015-01-01

    Objectives: The aim of our study was to find out how much energy is applicable in second-generation dual source high-pitch computed tomography (CT) in imaging of the abdomen. Materials and Methods: We examined an upper abdominal phantom using a Somatom Definition Flash CT scanner (Siemens, Forchheim, Germany). The study protocol consisted of a scan series at 100 kV and 120 kV. In each scan series we started with a pitch of 3.2 and reduced it in steps of 0.2 until a pitch of 1.6 was reached. The current was adjusted to the maximum the scanner could achieve. Energy values, image noise, image quality, and radiation exposure were evaluated. Results: For a pitch of 3.2 the maximum applicable current-time product was 142 mAs at 120 kV and 114 mAs at 100 kV. For conventional abdominal imaging, levels of 200 to 260 mAs are generally used. To achieve similar levels, we had to decrease the pitch to 1.8 at 100 kV; at this pitch we could perform our imaging at 204 mAs. At a pitch of 2.2 and 120 kV we could apply 206 mAs. Conclusion: If a higher current-time product is needed, the pitch has to be reduced. In high-pitch dual source CT, the pitch should therefore be adjusted to the energy needed in the body region to be imaged, so as to answer the clinical question being raised.
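
    The reported trade-off is consistent with the effective tube current-time product scaling roughly as the inverse of the pitch at a fixed tube power limit. A quick check of this approximation (an assumption, not the study's stated model) against the abstract's 100 kV figures:

```python
# Effective mAs available at a given pitch, assuming the tube runs at its
# power limit so that effective mAs scales approximately as 1/pitch.
def effective_mas(mas_ref, pitch_ref, pitch):
    return mas_ref * pitch_ref / pitch

print(round(effective_mas(114, 3.2, 1.8)))  # ~203 mAs; the study measured 204
```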

  11. Extending the process limits of laser polymer welding with high-brilliance beam sources (recent status and prospects of POLYBRIGHT)

    Science.gov (United States)

    Olowinsky, A.; Boglea, A.

    2011-03-01

    Plastics play an important role in almost every facet of our lives and constitute a wide variety of products, from everyday products such as food and beverage packaging, through furniture and building materials, to high-tech products in the automotive, electronics, aerospace, white goods, medical and other sectors [1]. The objective of PolyBright, the European research project on laser polymer welding, is to provide high-speed and flexible laser manufacturing technology and to expand the limits of current plastic part assembly. New laser polymer joining processes for optimized thermal management in combination with adapted wavelengths will provide higher quality, high processing speeds up to 1 m/s and robust manufacturing processes at lower costs. Key innovations of the PolyBright project are fibre lasers with high powers up to 500 W, high-speed scanning and flexible beam manipulation systems for simultaneous and high-resolution welding, such as dynamic masks and multi-kHz scanning heads. With this initial step, PolyBright will break new paths in the processing of advanced plastic products, overcoming the quality and speed limitations of conventional plastic part assembly. Completely new concepts for high-speed processing, flexibility and quality need to be established in combination with high-brilliance lasers and related equipment. PolyBright will thus open new markets for laser systems, with a short-term potential of over several hundred laser installations per year and a much larger future share in the still growing plastics market. PolyBright will hence establish a comprehensive and sustainable development activity on new high-brilliance lasers that will strengthen the laser system industry.

  12. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.

    2016-01-01

    A conceptual site model (CSM) summarizes the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...
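
    The belief-updating idea at the core of the BBN approach can be illustrated with a toy Bayes update over competing CSMs. The priors and likelihoods below are invented for illustration and are not taken from the paper:

```python
# Toy illustration: competing conceptual site models are weighted by how well
# they explain observed data (Bayes' rule over discrete hypotheses).
priors = {"CSM-A": 1/3, "CSM-B": 1/3, "CSM-C": 1/3}
likelihood = {"CSM-A": 0.60, "CSM-B": 0.25, "CSM-C": 0.05}  # P(data | CSM)

evidence = sum(priors[m] * likelihood[m] for m in priors)
belief = {m: priors[m] * likelihood[m] / evidence for m in priors}
print(belief)  # belief in CSM-A rises to ~0.67
```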

  13. Delayed plastic relaxation limit in SiGe islands grown by Ge diffusion from a local source

    Energy Technology Data Exchange (ETDEWEB)

    Vanacore, G. M.; Zani, M.; Tagliaferri, A., E-mail: alberto.tagliaferri@polimi.it [CNISM-Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, I-20133 Milano (Italy); Nicotra, G. [IMM-CNR, Stradale Primosole 50, I-95121 Catania (Italy); Bollani, M. [CNR-IFN, LNESS, Via Anzani 42, I-22100 Como (Italy); Bonera, E.; Montalenti, F.; Picco, A.; Boioli, F. [Dipartimento di Scienza dei Materiali and L-NESS, Università Milano-Bicocca, via Cozzi 53, I-20125 Milano (Italy); Capellini, G. [Department of Sciences at the Università Roma Tre, Via Vasca Navale 79, 00146 Roma (Italy); Isella, G. [CNISM, LNESS, Dipartimento di Fisica, Politecnico di Milano (Polo di Como), Via Anzani 42, I-22100 Como (Italy); Osmond, J. [ICFO–The Institute of Photonic Sciences, Av. Carl Friedrich Gauss, 3, E-08860 Castelldefels (Barcelona) (Spain)

    2015-03-14

    The hetero-epitaxial strain relaxation in nano-scale systems plays a fundamental role in shaping their properties. Here, the elastic and plastic relaxation of self-assembled SiGe islands grown by surface-thermal-diffusion from a local Ge solid source on Si(100) are studied by atomic force and transmission electron microscopies, enabling the simultaneous investigation of the strain relaxation in different dynamical regimes. Islands grown by this technique remain dislocation-free and preserve a structural coherence with the substrate for a base width as large as 350 nm. The results indicate that a delay of the plastic relaxation is promoted by an enhanced Si-Ge intermixing, induced by the surface-thermal-diffusion, which takes place already in the SiGe overlayer before the formation of a critical nucleus. The local entropy of mixing dominates, leading the system toward a thermodynamic equilibrium, where non-dislocated, shallow islands with a low residual stress are energetically stable. These findings elucidate the role of the interface dynamics in modulating the lattice distortion at the nano-scale, and highlight the potential use of our growth strategy to create composition and strain-controlled nano-structures for new-generation devices.

  14. Forecast Accuracy Uncertainty and Momentum

    OpenAIRE

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...
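
    The aggregation rule described, weighting each forecast by its issuer's estimated accuracy, amounts to inverse-MSE (precision) weighting. A minimal sketch with invented numbers:

```python
# Combine forecasts with weights proportional to 1/MSE of each issuer's past
# errors; for independent errors this minimizes the combined estimate's MSE.
import numpy as np

forecasts = np.array([10.2, 9.6, 11.0])        # current cash flow forecasts
past_mse = np.array([0.5, 1.0, 2.0])           # issuers' historical MSEs
w = (1.0 / past_mse) / (1.0 / past_mse).sum()  # precision weights
print(w, float(w @ forecasts))                 # aggregate estimate ~10.1
```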

  15. LOFT differential pressure uncertainty analysis

    International Nuclear Information System (INIS)

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with complete descriptions of test programs and theoretical studies that have been conducted on the ΔP, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement for measurement of differential pressure.

  16. Uncertainty and Cognitive Control

    Directory of Open Access Journals (Sweden)

    Faisal eMushtaq

    2011-10-01

    A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  17. Exploring uncertainty in the Earth Sciences - the potential field perspective

    Science.gov (United States)

    Saltus, R. W.; Blakely, R. J.

    2013-12-01

    Interpretation of gravity and magnetic anomalies is mathematically non-unique because multiple theoretical solutions are possible. The mathematical label of 'non-uniqueness' can lead to the erroneous impression that no single interpretation is better in a geologic sense than any other. The purpose of this talk is to present a practical perspective on the theoretical non-uniqueness of potential field interpretation in geology. There are multiple ways to approach and constrain potential field studies to produce significant, robust, and definitive results. For example, a smooth, bell-shaped gravity profile, in theory, could be caused by an infinite set of physical density bodies, ranging from a deep, compact, circular source to a shallow, smoothly varying, inverted bell-shaped source. In practice, however, we can use independent geologic or geophysical information to limit the range of possible source densities and rule out many of the theoretical solutions. We can further reduce the theoretical uncertainty by careful attention to subtle anomaly details. For example, short-wavelength anomalies are a well-known and theoretically established characteristic of shallow geologic sources. The 'non-uniqueness' of potential field studies is closely related to the more general topic of scientific uncertainty in the Earth sciences and beyond. Nearly all results in the Earth sciences are subject to significant uncertainty because problems are generally addressed with incomplete and imprecise data. The increasing need to combine results from multiple disciplines into integrated solutions in order to address complex global issues requires special attention to the appreciation and communication of uncertainty in geologic interpretation.

  18. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    uncertainties in CL and EX estimates were found to be efficiently mitigated by reducing data uncertainty in the critical limit of the chemical criteria - the BC/Al ratio. The distributed CL and EX assessment on local level in Sweden was found to be efficiently improved by enhancing the resolution of the underlying vegetation map 68 refs, 15 figs, 3 tabs

  19. Determining the parameters of Weibull function to estimate the wind power potential in conditions of limited source meteorological data

    Science.gov (United States)

    Fetisova, Yu. A.; Ermolenko, B. V.; Ermolenko, G. V.; Kiseleva, S. V.

    2017-04-01

    We studied the information basis for the assessment of wind power potential on the territory of Russia. We describe a methodology for determining the parameters of the Weibull function, which reflects the probability density of wind speeds at a defined basic height above the surface of the earth, using the available data on the average speed at this height and its frequency distribution over speed gradations. The application of the least squares method for determining these parameters, unlike the use of graphical methods, allows a statistical assessment of the results of approximating empirical histograms by the Weibull formula. On the basis of a computer-aided analysis of the statistical data, it was shown that, at a fixed point where the wind speed varies with height, the range of variation of the parameters of the Weibull distribution curve is relatively small, the sensitivity of the function to parameter changes is quite low, and the influence of such changes on the shape of the speed distribution curves is negligible. Taking this into consideration, we proposed and mathematically verified a methodology for determining the parameters of the Weibull function at other heights, using the parameters computed for this function at a basic height, which is known or defined by the average wind speed or the roughness coefficient of the geological substrate. We give examples of practical application of the suggested methodology in the development of the Atlas of Renewable Energy Resources in Russia under conditions of scarce source meteorological data. The proposed methodology may, to some extent, solve the problem of the lack of information on the vertical profile of wind speed frequencies, given the wide assortment of wind turbines on the global market with different wind-wheel axis heights and various performance characteristics; as a result, this methodology can become a powerful tool for
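
    The described least-squares estimation of the Weibull parameters can be sketched via the usual linearization of the Weibull cumulative distribution. The histogram below is an invented example, not data from the atlas:

```python
# Least-squares estimation of Weibull parameters (k, c) from binned wind-speed
# frequencies, via the linearization ln(-ln(1 - F(v))) = k*ln(v) - k*ln(c).
import numpy as np

v_bins = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])    # upper bin edges, m/s
freq = np.array([0.18, 0.27, 0.25, 0.16, 0.09, 0.05])  # relative frequencies
F = np.cumsum(freq)                                    # empirical CDF at edges

# Fit by least squares; skip the last bin, where F = 1 and the transform
# diverges.
x = np.log(v_bins[:-1])
y = np.log(-np.log(1.0 - F[:-1]))
k, b = np.polyfit(x, y, 1)
c = np.exp(-b / k)
print(f"Weibull shape k = {k:.2f}, scale c = {c:.2f} m/s")
```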

  20. Methane and CO2 fluxes of moving point sources - Beyond or within the limits of eddy covariance measurements

    Science.gov (United States)

    Felber, Raphael; Neftel, Albrecht; Münger, Andreas; Ammann, Christof

    2014-05-01

    The eddy covariance (EC) technique has been extensively used for CO2 and energy exchange measurements over different ecosystems. For some years, it has also become widely used to investigate CH4 and N2O exchange over ecosystems, including grazing systems. EC measurements represent a spatially integrated flux over an upwind area (footprint). Whereas EC measurements work well for extended homogeneous areas, the animals in a grazing system are a challenge as they represent moving point sources that create inhomogeneous conditions in space and time. The main issues which have to be taken into account when applying EC flux measurements over a grazed system are: i) In the presence of animals the high-time-resolution concentration measurements show large spikes in the signal. These spikes may be filtered/reduced by standard quality control software in order to avoid wrong measurements. ii) Data on the position of the animals relative to the flux footprint are needed to quantify the contribution of the grazing animals to the measured flux. For one grazing season we investigated the ability of EC flux measurements to reliably quantify the contribution of the grazing animals to the CH4 and CO2 exchange over pasture systems. For this purpose, a field experiment with a herd of twenty dairy cows in a full-day rotational grazing system was carried out on the Swiss central plateau. Net CH4 and CO2 exchange of the pasture system was measured continuously by the eddy covariance technique (Sonic Anemometer HS-50, Gill Instruments Ltd; FGGA, Los Gatos Research Inc.). To quantify the contribution of the animals to the net flux, the position of the individual cows was recorded using GPS (5 s time resolution) on each animal. An existing footprint calculation tool (ART footprint tool) was adapted and CH4 emissions of the cows were calculated. CH4 emissions from cows could be used as a tracer to investigate the quality of the evaluation of the EC data, since the background exchange of

  1. Computer code determination of tolerable accel current and voltage limits during startup of an 80 kV MFTF sustaining neutral beam source

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Eckard, R.D.

    1979-01-01

    We have used a Lawrence Livermore Laboratory (LLL) version of the WOLF ion source extractor design computer code to determine tolerable accel current and voltage limits during startup of a prototype 80 kV Mirror Fusion Test Facility (MFTF) sustaining neutral beam source. Arc current limits are also estimated. The source extractor has gaps of 0.236, 0.721, and 0.155 cm. The effective ion mass is 2.77 AMU. The measured optimum accel current density is 0.266 A/cm². The gradient grid electrode runs at 5/6 Va (accel voltage). The suppressor electrode voltage is zero for Va < 3 kV and -3 kV for Va ≥ 3 kV. The accel current density for optimum beam divergence is obtained for 1 kV ≤ Va ≤ 80 kV, as are the beam divergence and emittance.

  2. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Science.gov (United States)

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

    The relatively low levels of uncertainty among orthopaedic surgeons and confidence bias seem inconsistent with the paucity of definitive evidence. If patients want to be informed of the areas of uncertainty and surgeon-to-surgeon variation relevant to their care, it seems possible that a low recognition of uncertainty and surgeon confidence bias might hinder adequately informing patients, informed decisions, and consent. Moreover, limited recognition of uncertainty is associated with modifiable factors such as confidence bias, trust in the orthopaedic evidence base, and statistical understanding. Perhaps improved statistical teaching in residency, journal clubs to improve the critique of evidence and awareness of bias, and acknowledgment of knowledge gaps at courses and conferences might create awareness about existing uncertainties. Level 1, prognostic study.

  3. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question of how to adopt an overall alternative attitude to uncertainty, which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, the ignoring of early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale, to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  4. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action.

  5. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic or/and thermo-hydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can also be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations). In case of a hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering the respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with regard to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel. (Authors)

  6. Uncertainty estimation of uranium determination in urine by fluorometry

    International Nuclear Information System (INIS)

    Shakhashiro, A.; Al-Khateeb, S.

    2003-11-01

    In this study an applicable mathematical model is proposed for the estimation of uncertainty in the determination of uranium in urine samples by fluorometry. The study is based on the EURACHEM guide for uncertainty estimation. The model was tested on a sample containing 0.02 μg/ml uranium, where the calculated uncertainty was 0.007 μg/ml. The sources of uncertainty were represented in a fish-bone (cause-and-effect) diagram, and the weight of each uncertainty component was shown in a histogram. Finally, it was found that the uncertainty estimated by the proposed model was 3 to 4 times larger than the usually reported standard deviation. (author)

  7. Treatment and reporting of uncertainties for environmental radiation measurements

    International Nuclear Information System (INIS)

    Colle, R.

    1980-01-01

    Recommendations for a practical and uniform method for treating and reporting uncertainties in environmental radiation measurements data are presented. The method requires that each reported measurement result include the value, a total propagated random uncertainty expressed as the standard deviation, and a combined overall uncertainty. The uncertainty assessment should be based on as nearly a complete assessment as possible and should include every conceivable or likely source of inaccuracy in the result. Guidelines are given for estimating random and systematic uncertainty components, and for propagating and combining them to form an overall uncertainty

  8. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
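
    The contrast between the two ingredients, perturbation-theory sensitivities versus stochastic sampling, can be sketched with a first-order ("sandwich") propagation next to sampling from the same covariance. All numbers below are toy values, not benchmark data:

```python
# (1) First-order propagation of a cross-section covariance C through
# sensitivities S (as obtained from perturbation theory), var = S^T C S.
# (2) Plain stochastic sampling through a linear surrogate of the simulator.
import numpy as np

S = np.array([0.8, -0.3])            # dk/dsigma sensitivities (step 1)
C = np.array([[4.0e-4, 1.0e-4],
              [1.0e-4, 9.0e-4]])     # cross-section covariance
var_keff = S @ C @ S                 # sandwich rule
print(f"sandwich: std(k) = {np.sqrt(var_keff):.4f}")

rng = np.random.default_rng(2)
sig = rng.multivariate_normal([1.0, 1.0], C, 10000)
k = 1.0 + S @ (sig - 1.0).T          # linear surrogate for the simulator
print(f"sampling: std(k) = {k.std(ddof=1):.4f}")
```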

  9. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Robertson, Amy N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-07-26

    This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system studied within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test, to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  10. Uncertainties in radioecological assessment models

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables

  11. Harvest Regulations and Implementation Uncertainty in Small Game Harvest Management

    Directory of Open Access Journals (Sweden)

    Pål F. Moa

    2017-09-01

    A main challenge in harvest management is to set policies that maximize the probability that management goals are met. While the management cycle includes multiple sources of uncertainty, only some of these have received considerable attention. Currently, there is a large gap in our knowledge about the implementation of harvest regulations, and about the extent to which indirect control methods such as harvest regulations are actually able to regulate harvest in accordance with intended management objectives. In this perspective article, we first summarize and discuss hunting regulations currently used in management of grouse species (Tetraonidae) in Europe and North America. Management models suggested for grouse are most often based on proportional harvest or threshold harvest principles. These models are all built on theoretical principles for sustainable harvesting, and provide in the end an estimate of a total allowable catch. However, implementation uncertainty is rarely examined in empirical or theoretical harvest studies, and few general findings have been reported. Nevertheless, circumstantial evidence suggests that many of the most popular regulations are acting in a depensatory manner, so that harvest bag sizes are more limited in years (or areas) where game density is high, contrary to general recommendations. A better understanding of the implementation uncertainty related to harvest regulations is crucial in order to establish sustainable management systems. We suggest that scenario tools like Management Strategy Evaluation (MSE) should be used more frequently to examine the robustness of currently applied harvest regulations to such implementation uncertainty until more empirical evidence is available.

  12. Radionuclide mass transfer rates from a pinhole in a waste container for an inventory-limited and a constant concentration source

    International Nuclear Information System (INIS)

    LeNeveu, D.M.

    1996-03-01

    Analytical solutions for transient and steady state diffusive mass transfer rates from a pinhole in a waste container are developed for constant concentration and inventory-limited source conditions. Mass transport in three media is considered: inside the pinhole (medium 2), outside the container (medium 3) and inside the container (medium 1). Simple equations are developed for radionuclide mass transfer rates from a pinhole. It is shown that only the medium with the largest mass transfer resistance need be considered to provide a conservative estimate of mass transfer rates. (author) 11 refs., 3 figs.
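
    In the steady-state limit, results of this kind typically take a series-resistance form; the following is a generic sketch with symbols defined here, not notation taken from the report:

```latex
\[
  J \;=\; \frac{C_0}{R_1 + R_2 + R_3},
  \qquad
  R_i \;=\; \frac{L_i}{D_i\,A_i},
\]
% J: steady-state mass transfer rate, C_0: source concentration, and L_i, D_i,
% A_i: diffusion length, diffusion coefficient and effective cross-sectional
% area of medium i. The largest R_i dominates the sum, which is why only the
% medium with the largest resistance need be considered for a conservative
% estimate.
```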

  13. DS02 uncertainty analysis

    International Nuclear Information System (INIS)

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  14. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  15. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  16. Uncertainty in artificial intelligence

    CERN Document Server

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy.Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.
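
    One of the numeric calculi discussed, the Dempster-Shafer theory, combines independent bodies of evidence by intersecting focal sets and renormalizing away conflict. A toy combination over a two-element frame (invented masses, for illustration only):

```python
# Dempster's rule of combination for two mass assignments over frame {a, b}.
from itertools import product

def combine(m1, m2):
    out, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2                 # intersect focal sets
        if inter:
            out[inter] = out.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2         # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in out.items()}

m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}   # "ab" = {a, b}, ignorance
m2 = {frozenset("b"): 0.5, frozenset("ab"): 0.5}
print(combine(m1, m2))  # a: ~0.43, b: ~0.29, {a,b}: ~0.29
```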

  17. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  18. An appraisal of uncertainties in the Western Australian wine industry supply chain

    OpenAIRE

    Islam, Nazrul; Quaddus, Mohammed

    2005-01-01

    Wine is one of the significant export items of Western Australia. In 2001/2002, the State's wine exports amounted to about A$42 million. Despite its economic importance, research on the supply chain aspects of the WA wine industry is rather limited. This paper presents the sources of uncertainties in the WA wine supply chain based on the results of an electronic focus group study with WA wine industry stakeholders. The group identified 74 items of uncertainty, which were then grouped into 26 unique ...

  19. Operational hydrological forecasting in Bavaria. Part I: Forecast uncertainty

    Science.gov (United States)

    Ehret, U.; Vogelbacher, A.; Moritz, K.; Laurent, S.; Meyer, I.; Haag, I.

    2009-04-01

    In Bavaria, operational flood forecasting has been established since the disastrous flood of 1999. Nowadays, forecasts based on rainfall information from about 700 raingauges and 600 rivergauges are calculated and issued for nearly 100 rivergauges. With the added experience of the 2002 and 2005 floods, awareness grew that the standard deterministic forecast, neglecting the uncertainty associated with each forecast, is misleading, creating a false feeling of unambiguousness. As a consequence, a system to identify, quantify and communicate the sources and magnitude of forecast uncertainty has been developed, which is presented in part I of this study. In this system, the use of ensemble meteorological forecasts plays a key role, which is presented in part II. In developing the system, several constraints stemming from the range of hydrological regimes and operational requirements had to be met: Firstly, operational time constraints obviate the variation of all components of the modeling chain as would be done in a full Monte Carlo simulation. Therefore, an approach was chosen where only the most relevant sources of uncertainty were dynamically considered while the others were jointly accounted for by static error distributions from offline analysis. Secondly, the dominant sources of uncertainty vary over the wide range of forecasted catchments: In alpine headwater catchments, typically of a few hundred square kilometers in size, rainfall forecast uncertainty is the key factor for forecast uncertainty, with a magnitude dynamically changing with the prevailing predictability of the atmosphere. In lowland catchments encompassing several thousands of square kilometers, forecast uncertainty in the desired range (usually up to two days) is mainly dependent on upstream gauge observation quality, routing and unpredictable human impact such as reservoir operation. The determination of forecast uncertainty comprised the following steps: a) From comparison of gauge

  20. Beam extraction dynamics at the space-charge-limit of the high brightness E-XFEL electron source at DESY-PITZ

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Ye; Gjonaj, Erion; Weiland, Thomas [TEMF, Technische Universitaet Darmstadt, Schlossgartenstrasse 8, 64289 Darmstadt (Germany)

    2015-07-01

    The physics of photoemission, one of the key issues for successful operation of linac-based free-electron lasers like the European X-ray Free Electron Laser (E-XFEL) and the Free-electron Laser in Hamburg (FLASH), plays an increasingly important role in the high-brightness DESY-PITZ electron source. We study the photoemission physics and discuss full three-dimensional numerical modeling of the electron bunch emission. The beam extraction dynamics at the photocathode has been investigated with the 3D fully electromagnetic (EM) Particle-in-Cell (PIC) solver of CST Particle Studio, under the assumption of the photoemission source operating at or close to its space charge limit. PIC simulation results have shown good agreement with measurements of the total emitted bunch charge for distinct experimental parameters. Further comparisons showed that the conventional Poisson-solver-based tracking algorithm generally fails to correctly predict the beam dynamics at the space charge limit. It is furthermore found that fully EM PIC simulations are also consistent with a simple emission model based on the multidimensional Child-Langmuir law.
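
    For reference, the planar-diode Child-Langmuir current density that bounds space-charge-limited extraction is (textbook one-dimensional form, quoted for orientation; the geometry of the actual gun is more complex):

```latex
\[
  J_{\mathrm{CL}} \;=\; \frac{4\,\varepsilon_0}{9}\,
  \sqrt{\frac{2e}{m_e}}\;\frac{V^{3/2}}{d^{2}},
\]
% V: gap voltage, d: gap length; emission is space-charge limited when the
% extracted current density approaches J_CL.
```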

  1. Uncertainty estimation of ultrasonic thickness measurement

    International Nuclear Information System (INIS)

    Yassir Yassen, Abdul Razak Daud; Mohammad Pauzi Ismail; Abdul Aziz Jemain

    2009-01-01

    The most important factor that should be taken into consideration when selecting an ultrasonic thickness measurement technique is its reliability. Only when the uncertainty of a measurement result is known can it be judged whether the result is adequate for the intended purpose. The objective of this study is to model the ultrasonic thickness measurement function, to identify the most contributing input uncertainty components, and to estimate the uncertainty of the ultrasonic thickness measurement results. We assumed that five error sources significantly contribute to the final error: calibration velocity, transit time, zero offset, measurement repeatability and resolution. By applying the law of propagation of uncertainty to the model function, a combined uncertainty of the ultrasonic thickness measurement was obtained. In this study the modeling function of ultrasonic thickness measurement was derived. By using this model the estimation of the uncertainty of the final output result was found to be reliable. It was also found that the most contributing input uncertainty components are calibration velocity, transit time linearity and zero offset. (author)
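
    The propagation-of-uncertainty step for a pulse-echo model function can be sketched GUM-style with first-order sensitivity coefficients. The model t = v(τ − τ₀)/2 and all numbers below are illustrative assumptions, not the study's values:

```python
# GUM-style combination of uncertainty components for a pulse-echo thickness
# model t = v*(tau - tau0)/2, using first-order sensitivity coefficients.
import math

v, u_v = 5920.0, 30.0        # m/s: calibration velocity and its std. uncertainty
tau, u_tau = 6.8e-6, 5e-9    # s: measured transit time
tau0, u_tau0 = 0.1e-6, 5e-9  # s: zero offset

t = v * (tau - tau0) / 2.0
c_v, c_tau, c_tau0 = (tau - tau0) / 2.0, v / 2.0, -v / 2.0
u_t = math.sqrt((c_v * u_v)**2 + (c_tau * u_tau)**2 + (c_tau0 * u_tau0)**2)
print(f"t = {t*1e3:.3f} mm, u(t) = {u_t*1e3:.3f} mm")
```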

  2. Assessment of uncertainties in Neutron Multiplicity Counting

    International Nuclear Information System (INIS)

    Peerani, P.; Marin Ferrer, M.

    2008-01-01

    This paper describes a methodology for a complete and correct assessment of how the uncertainty of each individual component propagates to the final result. A general methodology accounting for all the main sources of error (both type A and type B) is outlined. To better illustrate the method, a practical example is given, applying it to the uncertainty estimation for a special case of multiplicity counter, the SNMC developed at JRC.

  3. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change approach applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  4. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  5. Large-uncertainty intelligent states for angular momentum and angle

    International Nuclear Information System (INIS)

    Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent, and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding of the uncertainties of angle and angular momentum for the large-uncertainty intelligent states, we compare exact solutions with analytical approximations in two limiting cases.
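
    For reference, the state-dependent angular uncertainty relation alluded to here is commonly written (in the form due to Barnett and Pegg; the normalisation convention is an assumption, since the abstract does not state it):

        \Delta L_z \, \Delta\phi \;\ge\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\pi)\,\bigr|

    where P(\pi) is the angular probability density at the boundary of the chosen 2\pi window. Because the right-hand side depends on the state through P(\pi), a state can satisfy the equality (be "intelligent") while the product \Delta L_z \Delta\phi itself remains arbitrarily large.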

  6. Decay heat uncertainty quantification of MYRRHA

    Directory of Open Access Journals (Sweden)

    Fiorito Luca

    2017-01-01

    MYRRHA is a lead-bismuth cooled MOX-fueled accelerator driven system (ADS) currently in the design phase at SCK·CEN in Belgium. The correct evaluation of the decay heat and of its uncertainty level is very important for the safety demonstration of the reactor. In the first part of this work we assessed the decay heat released by the MYRRHA core using the ALEPH-2 burnup code. The second part of the study focused on the nuclear data uncertainty and covariance propagation to the MYRRHA decay heat. Radioactive decay data, independent fission yield and cross section uncertainties/covariances were propagated using two nuclear data sampling codes, namely NUDUNA and SANDY. According to the results, 238U cross sections and fission yield data are the largest contributors to the MYRRHA decay heat uncertainty. The calculated uncertainty values are deemed acceptable from the safety point of view as they are well within the available regulatory limits.
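
    The sampling-based propagation described above can be sketched as follows (a minimal illustration, not the NUDUNA or SANDY implementations; the three-parameter set, the covariance values and the decay_heat stand-in are placeholders):

        # Draw correlated nuclear-data perturbation factors from a multivariate
        # normal defined by an evaluated covariance matrix, then propagate each
        # sample through the (here faked) burnup/decay-heat calculation.
        import numpy as np

        rng = np.random.default_rng(42)

        mean = np.array([1.0, 1.0, 1.0])        # nominal multipliers, e.g. a 238U cross
        cov = np.array([[4.0e-4, 1.0e-4, 0.0],  # section, a fission yield, a decay constant
                        [1.0e-4, 9.0e-4, 0.0],
                        [0.0,    0.0,    1.0e-4]])

        factors = rng.multivariate_normal(mean, cov, size=300)

        def decay_heat(f):
            """Placeholder for the ALEPH-2 burnup + decay-heat calculation."""
            return 1.0e6 * (0.6 * f[0] + 0.3 * f[1] + 0.1 * f[2])  # W, purely illustrative

        heats = np.array([decay_heat(f) for f in factors])
        rel = heats.std(ddof=1) / heats.mean() * 100.0
        print(f"decay heat = {heats.mean():.3e} W +/- {rel:.2f} %")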

  7. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. Here we numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the conflicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  8. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
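
    As a small illustration of the interval-statistics idea (the data and scope are invented, not the report's algorithms): for interval-valued observations [lo_i, hi_i], the set of possible sample means or medians is itself an interval, reached by putting every observation at one of its endpoints, whereas the exact upper bound on the variance is NP-hard to compute in general for heavily overlapping intervals:

        import numpy as np

        data = np.array([[1.2, 1.5],
                         [0.9, 1.1],
                         [1.4, 1.9],
                         [1.0, 1.6]])  # each row is one measurement as [lower, upper]

        lo, hi = data[:, 0], data[:, 1]
        print("mean lies in  ", (lo.mean(), hi.mean()))
        print("median lies in", (np.median(lo), np.median(hi)))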

  9. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may incur high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples, each with a much smaller number of Monte Carlo histories. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors of the order of 100. (authors)
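
    One plausible realisation of the two-series idea is sketched below (an illustration built on synthetic numbers, not necessarily the exact XSUSA estimator): for each epistemic sample the transport calculation is run twice with independent seeds and few histories; the aleatoric noise is then independent between the two series, so their covariance estimates the purely epistemic variance.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200                                        # number of epistemic samples
        keff_true = rng.normal(1.000, 0.004, size=n)   # epistemic spread (unknown in practice)
        sigma_mc = 0.003                               # per-run statistical (aleatoric) noise

        y1 = keff_true + rng.normal(0.0, sigma_mc, size=n)  # series 1: few histories
        y2 = keff_true + rng.normal(0.0, sigma_mc, size=n)  # series 2: independent seeds

        var_total = 0.5 * (y1.var(ddof=1) + y2.var(ddof=1))
        var_epistemic = np.cov(y1, y2, ddof=1)[0, 1]   # aleatoric part cancels here
        print(f"total sd     = {np.sqrt(var_total):.5f}")
        print(f"epistemic sd = {np.sqrt(max(var_epistemic, 0.0)):.5f}  (true 0.00400)")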

  10. Uncertainty in social dilemmas

    NARCIS (Netherlands)

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  11. Uncertainty and Climate Change

    OpenAIRE

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  12. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  13. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and industry met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  14. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  15. Sensitivity-Uncertainty Techniques for Nuclear Criticality Safety

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rising, Michael Evan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alwin, Jennifer Louise [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-07

    The sensitivity and uncertainty analysis course will introduce students to keff sensitivity data, cross-section uncertainty data, how keff sensitivity data and keff uncertainty data are generated and how they can be used. Discussion will include how sensitivity/uncertainty data can be used to select applicable critical experiments, to quantify a defensible margin to cover validation gaps and weaknesses, and in development of upper subcritical limits.

  16. REDD+ emissions estimation and reporting: dealing with uncertainty

    International Nuclear Information System (INIS)

    Pelletier, Johanne; Potvin, Catherine; Martin, Davy

    2013-01-01

    The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet quality standards required to serve as compliance grade offsets for developed countries’ emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis distinguishing between ‘modelling sources’ of uncertainty, which refers to model-specific parameters and assumptions, and ‘recurring sources’ of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The highest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensure consistency in REDD+ reporting which is essential for securing environmental integrity. Due to the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to datasets and methodology

  17. REDD+ emissions estimation and reporting: dealing with uncertainty

    Science.gov (United States)

    Pelletier, Johanne; Martin, Davy; Potvin, Catherine

    2013-09-01

    The United Nations Framework Convention on Climate Change (UNFCCC) defined the technical and financial modalities of policy approaches and incentives to reduce emissions from deforestation and forest degradation in developing countries (REDD+). Substantial technical challenges hinder precise and accurate estimation of forest-related emissions and removals, as well as the setting and assessment of reference levels. These challenges could limit country participation in REDD+, especially if REDD+ emission reductions were to meet quality standards required to serve as compliance grade offsets for developed countries’ emissions. Using Panama as a case study, we tested the matrix approach proposed by Bucki et al (2012 Environ. Res. Lett. 7 024005) to perform sensitivity and uncertainty analysis distinguishing between ‘modelling sources’ of uncertainty, which refers to model-specific parameters and assumptions, and ‘recurring sources’ of uncertainty, which refers to random and systematic errors in emission factors and activity data. The sensitivity analysis estimated differences in the resulting fluxes ranging from 4.2% to 262.2% of the reference emission level. The classification of fallows and the carbon stock increment or carbon accumulation of intact forest lands were the two key parameters showing the largest sensitivity. The highest error propagated using Monte Carlo simulations was caused by modelling sources of uncertainty, which calls for special attention to ensure consistency in REDD+ reporting which is essential for securing environmental integrity. Due to the role of these modelling sources of uncertainty, the adoption of strict rules for estimation and reporting would favour comparability of emission reductions between countries. We believe that a reduction of the bias in emission factors will arise, among other things, from a globally concerted effort to improve allometric equations for tropical forests. Public access to datasets and methodology

  18. Conditional uncertainty principle

    Science.gov (United States)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.

  19. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  20. Articulating uncertainty as part of scientific argumentation during model-based exoplanet detection tasks

    Science.gov (United States)

    Lee, Hee-Sun; Pallant, Amy; Pryputniewicz, Sarah

    2015-08-01

    Teaching scientific argumentation has emerged as an important goal for K-12 science education. In scientific argumentation, students are actively involved in coordinating evidence with theory based on their understanding of the scientific content and thinking critically about the strengths and weaknesses of the cited evidence in the context of the investigation. We developed a one-week-long online curriculum module called "Is there life in space?" where students conduct a series of four model-based tasks to learn how scientists detect extrasolar planets through the “wobble” and transit methods. The simulation model allows students to manipulate various parameters of an imaginary star and planet system such as planet size, orbit size, planet-orbiting-plane angle, and sensitivity of telescope equipment, and to adjust the display settings for graphs illustrating the relative velocity and light intensity of the star. Students can use model-based evidence to formulate an argument on whether particular signals in the graphs guarantee the presence of a planet. Students' argumentation is facilitated by the four-part prompts consisting of multiple-choice claim, open-ended explanation, Likert-scale uncertainty rating, and open-ended uncertainty rationale. We analyzed 1,013 scientific arguments formulated by 302 high school student groups taught by 7 teachers. We coded these arguments in terms of the accuracy of their claim, the sophistication of explanation connecting evidence to the established knowledge base, the uncertainty rating, and the scientific validity of uncertainty. We found that (1) only 18% of the students' uncertainty rationale involved critical reflection on limitations inherent in data and concepts, (2) 35% of students' uncertainty rationale reflected their assessment of personal ability and knowledge, rather than scientific sources of uncertainty related to the evidence, and (3) the nature of task such as the use of noisy data or the framing of

  1. Climate Certainties and Uncertainties

    International Nuclear Information System (INIS)

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  2. The Dependency of Probabilistic Tsunami Hazard Assessment on Magnitude Limits of Seismic Sources in the South China Sea and Adjoining Basins

    Science.gov (United States)

    Li, Hongwei; Yuan, Ye; Xu, Zhiguo; Wang, Zongchen; Wang, Juncheng; Wang, Peitao; Gao, Yi; Hou, Jingming; Shan, Di

    2017-06-01

    The South China Sea (SCS) and its adjacent small basins, including the Sulu Sea and the Celebes Sea, are commonly identified as a tsunami-prone region on the basis of historical records of seismicity and tsunamis. However, quantification of tsunami hazard in the SCS region has remained an intractable issue due to the highly complex tectonic setting and multiple seismic sources within and surrounding this area. Probabilistic Tsunami Hazard Assessment (PTHA) is performed in the present study to evaluate tsunami hazard in the SCS region based on a brief review of seismological and tsunami records. Five regional and local potential tsunami sources are tentatively identified, and earthquake catalogs are generated using Monte Carlo simulation following the tapered Gutenberg-Richter relationship for each zone. Considering the lack of consensus on the magnitude upper bound of each seismic source, as well as its critical role in PTHA, the major concern of the present study is to define the upper and lower limits of tsunami hazard in the SCS region comprehensively by adopting different corner magnitudes derived by multiple principles and approaches, including TGR regression of the historical catalog, fault-length scaling, tectonic and seismic moment balance, and repetition of the historical largest event. The results show that tsunami hazard in the SCS and adjoining basins is subject to large variations when adopting different corner magnitudes, with the upper bounds 2-6 times the lower. The probabilistic tsunami hazard maps for specified return periods reveal a much higher threat from the Cotabato Trench and Sulawesi Trench in the Celebes Sea, whereas tsunami hazard received by the coasts of the SCS and Sulu Sea is relatively moderate, yet non-negligible. By combining an empirical method with numerical study of historical tsunami events, the present PTHA results are tentatively validated. The correspondence lends confidence to our study. Considering the proximity of major sources to population-laden cities
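
    A synthetic-catalog generator of the kind described above can be sketched as follows (parameters are illustrative; the sketch uses the fact that the tapered Gutenberg-Richter survival function in seismic moment factorises into a Pareto part and an exponential taper, so a TGR variate can be drawn as the minimum of the two):

        import numpy as np

        def tgr_moments(n, beta, m_t, m_c, rng):
            # Pareto(beta) above the threshold moment m_t, by inverse transform
            pareto = m_t * rng.uniform(size=n) ** (-1.0 / beta)
            # Exponential corner taper with corner moment m_c
            taper = m_t + rng.exponential(m_c, size=n)
            return np.minimum(pareto, taper)

        def moment_to_mw(m0):
            return (2.0 / 3.0) * (np.log10(m0) - 9.1)  # Hanks-Kanamori, m0 in N*m

        rng = np.random.default_rng(1)
        beta, mw_t, mw_c = 0.65, 5.0, 8.5  # slope, threshold and corner magnitudes (illustrative)
        m_t, m_c = 10 ** (1.5 * mw_t + 9.1), 10 ** (1.5 * mw_c + 9.1)

        catalog = moment_to_mw(tgr_moments(10000, beta, m_t, m_c, rng))
        print(f"largest simulated magnitude: Mw {catalog.max():.2f}")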

  3. On the uncertainty of phenological responses to climate change, and implications for a terrestrial biosphere model

    Directory of Open Access Journals (Sweden)

    M. Migliavacca

    2012-06-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere.

    Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species, with 12 leaf bud-burst models that varied in complexity.

    Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements.

    We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high CO2 emissions vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% confidence interval: 2.4 days century⁻¹ for scenario B1 and 4.5 days century⁻¹ for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century⁻¹) in the simulated trends. The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century⁻¹ for A1fi, ±3.6 days century⁻¹ for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per

  4. CEC/USDOE workshop on uncertainty analysis

    International Nuclear Information System (INIS)

    Elderkin, C.E.; Kelly, G.N.

    1990-07-01

    Any measured or assessed quantity contains uncertainty. The quantitative estimation of such uncertainty is becoming increasingly important, especially in assuring that safety requirements are met in design, regulation, and operation of nuclear installations. The CEC/USDOE Workshop on Uncertainty Analysis, held in Santa Fe, New Mexico, on November 13 through 16, 1989, was organized jointly by the Commission of European Communities (CEC's) Radiation Protection Research program, dealing with uncertainties throughout the field of consequence assessment, and DOE's Atmospheric Studies in Complex Terrain (ASCOT) program, concerned with the particular uncertainties in time and space variant transport and dispersion. The workshop brought together US and European scientists who have been developing or applying uncertainty analysis methodologies, conducted in a variety of contexts, often with incomplete knowledge of the work of others in this area. Thus, it was timely to exchange views and experience, identify limitations of approaches to uncertainty and possible improvements, and enhance the interface between developers and users of uncertainty analysis methods. Furthermore, the workshop considered the extent to which consistent, rigorous methods could be used in various applications within consequence assessment. 3 refs

  5. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species, and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties at high (>9-10) and low pH; the titration data were fitted with models of monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.
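
    The Monte Carlo propagation step can be sketched as follows (the paper's measurement model is not reproduced; the simple electrode model pH = (E - E0)/s and all numbers are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(3)
        n = 20000

        E = rng.normal(-120.0, 0.2, size=n)   # measured potential (mV) with read noise
        E0 = rng.normal(395.0, 0.5, size=n)   # calibration intercept (mV)
        s = rng.normal(-59.16, 0.15, size=n)  # calibration slope (mV per pH unit)

        pH = (E - E0) / s                     # propagate all three uncertainties at once
        print(f"pH = {pH.mean():.3f} +/- {pH.std(ddof=1):.3f} (1 sigma)")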

  6. Believable statements of uncertainty and believable science

    International Nuclear Information System (INIS)

    Lindstrom, R.M.

    2017-01-01

    Nearly 50 years ago, two landmark papers appeared that should have cured the problem of ambiguous uncertainty statements in published data. Eisenhart's paper in Science called for statistically meaningful numbers, and Currie's Analytical Chemistry paper revealed the wide range in common definitions of detection limit. Confusion and worse can result when uncertainties are misinterpreted or ignored. The recent stories of cold fusion, variable radioactive decay, and piezonuclear reactions provide cautionary examples in which prior probability has been neglected. We show examples from our laboratory and others to illustrate the fact that uncertainty depends on both statistical and scientific judgment. (author)

  7. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict

  8. Uncertainty in project phases: A framework for organisational change management

    DEFF Research Database (Denmark)

    Kreye, Melanie; Balangalibun, Sarah

    2015-01-01

    Uncertainty is an integral challenge when managing organisational change projects (OCPs). Current literature highlights the importance of uncertainty; however, it falls short of giving insights into the nature of uncertainty and suggestions for managing it. Specifically, no insights exist on how uncertainty develops over the different phases of OCPs. This paper presents case-based evidence on different sources of uncertainty in OCPs and how these develop over the different project phases. The results showed some surprising findings, as the majority of the uncertainty did not manifest itself in the early stage of the change project but was delayed until later phases. Furthermore, the sources of uncertainty were found to be predominantly within the organisation that initiated the change project and connected to the project scope. Based on these findings, propositions for future research are defined.

  9. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
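
    A minimal sketch of how such a per-cell uncertainty surface could be propagated into a coastal application (a toy grid and a hypothetical flood level, not the NCEI production code):

        import numpy as np

        rng = np.random.default_rng(9)

        dem = np.array([[0.2, 0.6, 1.4],
                        [0.1, 0.9, 1.8],
                        [0.4, 1.1, 2.3]])       # cell elevations (m, toy values)
        sigma = np.array([[0.15, 0.15, 0.30],
                          [0.10, 0.20, 0.30],
                          [0.15, 0.20, 0.40]])  # per-cell standard uncertainty (m)

        flood_level = 1.0  # m above datum
        realisations = dem + sigma * rng.standard_normal((5000, *dem.shape))
        p_inundated = (realisations < flood_level).mean(axis=0)
        print(np.round(p_inundated, 2))  # probability of inundation per cell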

  10. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
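
    A minimal sketch of the propagation idea with the python uncertainties package mentioned above (the profile shape, parameter values and covariance are invented stand-ins, not OMFIT internals):

        import numpy as np
        from uncertainties import correlated_values, umath

        # Pretend these came from a tanh-like pedestal fit: height, width, position
        popt = np.array([1.0, 0.05, 0.95])
        pcov = np.array([[1.0e-4, 2.0e-6, 0.0],
                         [2.0e-6, 4.0e-6, 0.0],
                         [0.0,    0.0,    1.0e-6]])
        height, width, center = correlated_values(popt, pcov)  # correlated ufloats

        def profile(x):
            # simplified stand-in for a modified-tanh profile
            return 0.5 * height * (1 - umath.tanh((x - center) / width))

        x = 0.97
        value = profile(x)
        grad = (profile(x + 1e-4) - profile(x - 1e-4)) / 2e-4  # correlations carried through
        print(f"T({x}) = {value}")
        print(f"dT/dx  = {grad}")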

  11. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models

  12. Interactions between perceived uncertainty types in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2018-01-01

    ...to avoid business failure. A conceptual framework of four uncertainty types is investigated: environmental, technological, organisational, and relational uncertainty. We present insights from four empirical cases of service dyads collected via multiple sources of evidence, including 54 semi-structured interviews, observations, and secondary data. The cases show seven interaction paths with direct knock-on effects between two uncertainty types and indirect knock-on effects between three or four uncertainty types. The findings suggest a causal chain from environmental, technological, and organisational to relational uncertainty. This research contributes to the servitization literature by (i) confirming the existence of uncertainty types, (ii) providing an in-depth characterisation of technological uncertainty, and (iii) showing the interaction paths between four uncertainty types in the form of a causal chain.

  13. Uncertainty In Measuring Noise Parameters Of a Communication Receiver

    International Nuclear Information System (INIS)

    Korcz, Karol; Palczynska, Beata; Spiralski, Ludwik

    2005-01-01

    The paper presents a method for assessing uncertainty in measuring the usable sensitivity Es of a communication receiver. The influence of the partial uncertainties of measuring the noise factor F and the energy pass band Δf of the receiver on the combined standard uncertainty is analyzed. A method is proposed to assess the uncertainty in measuring the noise factor on the basis of the systematic component of uncertainty, assuming that the main source of measurement uncertainty is the hardware of the measuring system. The uncertainty in measuring the pass band of the receiver is assessed under the assumption that the input quantities of the measurement equation, which are successive discrete values of the spectral power density of the noise at the output of the receiver, are not correlated. The results of the analyses of the individual uncertainty components of measuring the sensitivity, carried out for a typical communication receiver, are presented.
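
    Under a common link-budget form of the usable sensitivity, Es = -174 dBm/Hz + F + 10*log10(Δf) + SNRmin (an assumed model, since the paper's measurement equation is not reproduced; all values are illustrative), the combined standard uncertainty for uncorrelated inputs follows directly:

        import math

        F, u_F = 8.0, 0.5         # noise figure (dB) and its standard uncertainty
        df, u_df = 2700.0, 100.0  # energy pass band (Hz) and its standard uncertainty
        snr_min = 12.0            # required output SNR (dB), treated as exact here

        Es = -174.0 + F + 10.0 * math.log10(df) + snr_min

        c_df = 10.0 / (df * math.log(10.0))  # sensitivity coefficient dEs/d(df)
        u_Es = math.sqrt(u_F ** 2 + (c_df * u_df) ** 2)
        print(f"Es = {Es:.2f} dBm, u(Es) = {u_Es:.2f} dB")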

  14. Treatment of uncertainties in the geologic disposal of radioactive waste

    International Nuclear Information System (INIS)

    Cranwell, R.M.

    1985-01-01

    Uncertainty in the analysis of geologic waste disposal is generally considered to have three primary components: (1) computer code/model uncertainty, (2) model parameter uncertainty, and (3) scenario uncertainty. Computer code/model uncertainty arises from problems associated with determination of appropriate parameters for use in model construction, mathematical formulation of models, and numerical techniques used in conjunction with the mathematical formulation of models. Model parameter uncertainty arises from problems associated with selection of appropriate values for model input, data interpretation and possible misuse of data, and variation of data. Scenario uncertainty arises from problems associated with the 'completeness' of scenarios, the definition of parameters which describe scenarios, and the rate or probability of scenario occurrence. The preceding sources of uncertainty are discussed below.

  15. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is considered possible. The objective of the stability analysis is then to estimate the failure probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in situ survey), improving the measurement methods or evaluating the calculation procedure with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis usually restricts itself to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely possibility distributions (e

  16. Monte Carlo eigenfunction strategies and uncertainties

    International Nuclear Information System (INIS)

    Gast, R.C.; Candelore, N.R.

    1974-01-01

    Comparisons of convergence rates for several possible eigenfunction source strategies led to the selection of the ''straight'' analog of the analytic power method as the source strategy for Monte Carlo eigenfunction calculations. To ensure a fair-game strategy, the number of histories per iteration increases with increasing iteration number. The estimate of eigenfunction uncertainty is obtained from a modification of a proposal by D. B. MacMillan and involves only estimates of the usual purely statistical component of uncertainty and a serial correlation coefficient of lag one. 14 references. (U.S.)
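
    One standard way to use a lag-one serial correlation coefficient (a sketch of the idea, not necessarily MacMillan's exact formula) is to inflate the naive variance of the mean by the AR(1) factor (1 + rho1)/(1 - rho1):

        import numpy as np

        rng = np.random.default_rng(11)

        # Fake a serially correlated sequence of per-iteration eigenvalue estimates
        n, rho_true = 400, 0.4
        eps = rng.normal(0.0, 1.0e-3, size=n)
        k = np.empty(n)
        k[0] = 1.0 + eps[0]
        for i in range(1, n):
            k[i] = 1.0 + rho_true * (k[i - 1] - 1.0) + eps[i]

        x = k - k.mean()
        rho1 = (x[:-1] @ x[1:]) / (x @ x)  # lag-one serial correlation estimate
        var_naive = k.var(ddof=1) / n      # assumes independent iterations
        var_corr = var_naive * (1.0 + rho1) / (1.0 - rho1)
        print(f"rho1 = {rho1:.2f}")
        print(f"sd of mean: naive {np.sqrt(var_naive):.2e}, corrected {np.sqrt(var_corr):.2e}")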

  17. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices with respect to the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650.5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  18. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices with respect to the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed using an IFA-650.5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
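
    The statistical-analysis step can be sketched as follows (a hypothetical stand-in for the Dakota/FRAPTRAN coupling; the parameter names, ranges and response function are invented for illustration): inputs are drawn with Latin hypercube sampling, and rank correlation indices against an output such as peak cladding temperature identify the dominant inputs.

        import numpy as np
        from scipy.stats import qmc, spearmanr

        names = ["clad_thickness", "gap_size", "rod_power", "clad_conductivity"]

        sampler = qmc.LatinHypercube(d=len(names), seed=5)
        u = sampler.random(n=100)
        x = 0.97 + 0.06 * u  # map unit hypercube to +/-3 % perturbation factors

        def peak_clad_temperature(row):
            """Placeholder for one FRAPTRAN run; an arbitrary nonlinear response."""
            return 1100.0 * row[2] / (row[0] * row[3]) + 50.0 * row[1]

        y = np.array([peak_clad_temperature(r) for r in x])

        for j, name in enumerate(names):
            rho, _ = spearmanr(x[:, j], y)
            print(f"{name:18s} rank correlation with PCT: {rho:+.2f}")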

  19. Production scheduling of a lignite mine under quality and reserves uncertainty

    International Nuclear Information System (INIS)

    Galetakis, Michael; Roumpos, Christos; Alevizos, George; Vamvuka, Despina

    2011-01-01

    The effect of uncertainty sources on the stochastic optimization of the combined project of a new surface lignite mine exploitation and power plant operation for electricity generation is investigated. The major sources of uncertainty considered are the reserves and the quality of the lignite. Since probability distribution functions for these uncertainties were estimated during the detailed exploration phase of the deposit, the overall goal is to determine the optimal capacity of the power plant and consequently the optimal production rate of the mine over time. The optimization objective selected is the maximization of the net present value of the project. Emphasis is placed on the sensitivity analysis for the investigation of the effect of quality and reserves uncertainty on project optimization, on the mathematical formulation of the risk attitude strategy, and on increasing the efficiency of the optimization process by creating a limited set of feasible solutions through empirical rules. The developed methodology was applied to determine the optimal annual production rate of a new surface lignite mine in the area of Ptolemais–Amynteon in Northern Greece. - Highlights: ► Quality and reserves uncertainty affects considerably the production scheduling. ► Stochastic optimization is greatly accelerated by incorporating Taylor's rule. ► Decisions can be made considering different risk level attitudes.

  20. Reprint of: Production scheduling of a lignite mine under quality and reserves uncertainty

    International Nuclear Information System (INIS)

    Galetakis, Michael; Roumpos, Christos; Alevizos, George; Vamvuka, Despina

    2012-01-01

    The effect of uncertainty sources on the stochastic optimization of the combined project of a new surface lignite mine exploitation and power plant operation for electricity generation is investigated. The major sources of uncertainty considered are the reserves and the quality of the lignite. Since probability distribution functions for these uncertainties were estimated during the detailed exploration phase of the deposit, the overall goal is to determine the optimal capacity of the power plant and consequently the optimal production rate of the mine over time. The optimization objective selected is the maximization of the net present value of the project. Emphasis is placed on the sensitivity analysis for the investigation of the effect of quality and reserves uncertainty on project optimization, on the mathematical formulation of the risk attitude strategy, and on increasing the efficiency of the optimization process by creating a limited set of feasible solutions through empirical rules. The developed methodology was applied to determine the optimal annual production rate of a new surface lignite mine in the area of Ptolemais–Amynteon in Northern Greece. - Highlights: ► Quality and reserves uncertainty affects considerably the production scheduling. ► Stochastic optimization is greatly accelerated by incorporating Taylor's rule. ► Decisions can be made considering different risk level attitudes.

  1. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select su