WorldWideScience

Sample records for studies uncertainty evaluation

  1. Study on uncertainty evaluation system for the safety evaluation of interim spent fuel storage facility

    Kim, Myung Hyeon; Shin, Myeong Won; Rhy, Seok Jin; Cho, Dong Keon; Park, Dong Hwan [Kyunghee Univ., Seoul (Korea, Republic of)]; Cheong, Beom Jin [Ministry of Science and Technology, Gwacheon (Korea, Republic of)]

    1998-03-15

    The main objective is to develop technical standards for the operation of the interim spent fuel storage facility and to develop a draft of the technical criteria to be legislated. Another objective is to define an uncertainty evaluation system for burnup credit application in criticality analysis and to investigate the applicability of this topic to future regulatory activity. The work comprises the following tasks: investigate the state of the art of operational criteria for interim wet storage of spent fuel; collect the relevant laws, decrees, notices and standards related to the operation of storage facilities and study the legislation system; develop a draft of the technical standards and criteria to be legislated; define an evaluation system for the uncertainty analysis and review the state of the art in the field of criticality safety analysis; and develop an uncertainty evaluation system for criticality analysis with burnup credit and investigate the applicability as well as the benefits of this policy.

  2. Evaluating prediction uncertainty

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology, which includes an independent validation step, is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented.
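
    In the spirit of the variance-ratio indicators described above, the following minimal sketch (not the paper's code; the toy model, sample sizes, and the simple conditional-mean estimator are assumptions for illustration) estimates Var(E[Y|Xi])/Var(Y) for each input of a two-input model from Latin hypercube samples; a ratio near 1 flags a dominant input.

```python
# Hedged sketch: variance-ratio importance indicators from stratified
# (Latin hypercube) sampling. Toy model and sizes are invented.
import numpy as np

rng = np.random.default_rng(1)

def model(x):                       # toy model: input 0 dominates
    return x[..., 0] + 0.1 * x[..., 1] + 0.05 * x[..., 0] * x[..., 1]

def latin_hypercube(n, d, rng):
    """n stratified points in [0, 1]^d (one point per stratum, per dim)."""
    cells = np.array([rng.permutation(n) for _ in range(d)]).T  # (n, d)
    return (cells + rng.random((n, d))) / n

n, d, r = 50, 2, 20                 # strata, inputs, replicates
total = model(latin_hypercube(n * r, d, rng)).var()

for i in range(d):
    # estimate Var(E[Y|X_i]): fix X_i at stratum midpoints, average the rest
    cond_means = np.empty(n)
    for k, v in enumerate((np.arange(n) + 0.5) / n):
        x = rng.random((r, d))
        x[:, i] = v
        cond_means[k] = model(x).mean()
    # estimates are noisy for small r; values near 1 mark dominant inputs
    print(f"input {i}: variance ratio ~ {cond_means.var() / total:.2f}")
```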

  3. Improving uncertainty evaluation of process models by using pedigree analysis. A case study on CO2 capture with monoethanolamine

    van der Spek, Mijndert; Ramirez, Andrea; Faaij, André

    2016-01-01

    This article aims to improve uncertainty evaluation of process models by combining a quantitative uncertainty evaluation method (data validation) with a qualitative uncertainty evaluation method (pedigree analysis). The approach is tested on a case study of monoethanolamine-based postcombustion CO2

  4. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. Also, we seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Then, we conduct Monte Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: (1) uncertainty in HETT is relatively small for early times and increases with transit time; (2) uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; (3) introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and (4) hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model (`virtual reality') is then developed based on that conceptual model
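
    As a hedged illustration of the 'virtual reality' comparison described above (all numbers invented), the sketch below scores an ensemble of simplified-model predictions against a synthetic truth and separates predictive variance from predictive bias, echoing the finding that more data can shrink variance but not bias.

```python
# Hedged sketch of comparing ensemble predictions against a synthetic
# "virtual reality"; values are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

truth = 120.0                       # virtual-reality transit time (h), hypothetical
# ensembles from a structurally poor model: offset (bias) + spread (variance)
ensemble_small = truth + 25.0 + rng.normal(0.0, 30.0, size=200)   # little data
ensemble_large = truth + 25.0 + rng.normal(0.0, 10.0, size=200)   # more data

for name, ens in [("few data", ensemble_small), ("more data", ensemble_large)]:
    bias = ens.mean() - truth
    var = ens.var()
    mse = np.mean((ens - truth) ** 2)
    print(f"{name:9s}: bias={bias:6.1f}  var={var:7.1f}  mse={mse:7.1f}")
# More data shrinks the predictive variance, but the bias from the poor
# model structure remains, matching the abstract's conclusion.
```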

  5. Study on uncertainty evaluation methodology related to hydrological parameter of regional groundwater flow analysis model

    Sakai, Ryutaro; Munakata, Masahiro; Ohoka, Masao; Kameya, Hiroshi

    2009-11-01

    In the safety assessment for geological disposal of radioactive waste, it is important to develop a methodology for long-term estimation of regional groundwater flow, from data acquisition to numerical analyses. The uncertainties associated with estimation of regional groundwater flow include those concerning parameters and those concerning hydrogeological evolution. The uncertainties of parameters include measurement errors and heterogeneity. The authors discussed the uncertainties of hydraulic conductivity, a significant parameter for regional groundwater flow analysis. This study suggests that hydraulic conductivities of rock mass are controlled by rock characteristics such as fractures and porosity and by test conditions such as hydraulic gradient, water quality and water temperature, and that hydraulic conductivity can vary by more than a factor of ten owing to differences in test conditions (such as hydraulic gradient) or in rock type (such as rock fractures and porosity). In addition, this study demonstrated that confining pressure changes caused by uplift and subsidence, and changes of hydraulic gradient under the long-term evolution of the hydrogeological environment, could produce variations of more than a factor of ten in hydraulic conductivity. It was also shown that the effect of water-quality change on hydraulic conductivity is not negligible, and that the replacement of fresh water and saline water caused by sea-level change could change current hydraulic conductivities by a factor of about 0.6 in the case of the Horonobe site. (author)

  6. Davis-Besse uncertainty study

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results.

  7. Development of Evaluation Code for MUF Uncertainty

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is under normal operating conditions. Furthermore, many measurements are made with different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system, for pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from an IAEA reference. MUF uncertainty can be calculated simply and quickly with this evaluation code, which is built on a graphical user interface for user-friendliness. The code is also expected to make sensitivity analysis of the MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate conventional safeguards systems as well as to develop new systems for facilities under development.
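
    A minimal sketch of the underlying arithmetic, assuming a single MBA with four independent measurement terms (stream values and error fractions are hypothetical); a real safeguards evaluation would also track correlated, systematic error components across KMPs.

```python
# Minimal sketch of a MUF balance and first-order uncertainty propagation.
# All numbers are invented; only random (independent) errors are modelled.
import math

# (mass of nuclear material in kg, relative random error) per measurement
beginning_inventory = (100.0, 0.01)
inputs              = (250.0, 0.005)
outputs             = (240.0, 0.005)
ending_inventory    = (109.0, 0.01)

def muf(bi, inp, out, ei):
    return bi[0] + inp[0] - out[0] - ei[0]

def muf_sigma(*terms):
    # independent terms: variances add regardless of the +/- sign
    return math.sqrt(sum((m * rel) ** 2 for m, rel in terms))

m = muf(beginning_inventory, inputs, outputs, ending_inventory)
s = muf_sigma(beginning_inventory, inputs, outputs, ending_inventory)
print(f"MUF = {m:.2f} kg, sigma_MUF = {s:.2f} kg")
print("alarm (|MUF| > 3 sigma)?", abs(m) > 3 * s)
```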

  9. Systematic Evaluation of Uncertainty in Material Flow Analysis

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2014-01-01

    Material flow analysis (MFA) is a tool to investigate material flows and stocks in defined systems as a basis for resource management or environmental pollution control. Because of the diverse nature of sources and the varying quality and availability of data, MFA results are inherently uncertain... Uncertainty analyses have received increasing attention in recent MFA studies, but systematic approaches for selection of appropriate uncertainty tools are missing. This article reviews existing literature related to handling of uncertainty in MFA studies and evaluates current practice of uncertainty analysis...) and exploratory MFA (identification of critical parameters and system behavior). Whereas mathematically simpler concepts focusing on data uncertainty characterization are appropriate for descriptive MFAs, statistical approaches enabling more-rigorous evaluation of uncertainty and model sensitivity are needed...

  10. Routine for uncertainties evaluation: study case in mammography; Roteiro para avaliacao de incertezas: caso estudo em mamografia

    Peixoto, J.G.P. [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: guiherm@ird.gov.br; Almeida, C.E.V de [Universidade do Estado do Rio de Janeiro (UERJ/LCR), RJ (Brazil). Lab. de Ciencias Radiologicas

    2005-03-15

    This paper gives an orientation for the identification of uncertainties in the measurement object and of the influence quantities which directly affect the results of measurements. The input quantities presented in this work apply exclusively to this case study, and their use depends on the professional's choice of those input quantities.

  11. Uncertainty during breast diagnostic evaluation: state of the science.

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relation to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983-2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship of inner strength and uncertainty. Nurses can be invaluable in assisting women in coping with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on uncertainty experienced by women undergoing a breast diagnostic evaluation.

  12. Plurality of Type A evaluations of uncertainty

    Possolo, Antonio; Pintar, Adam L.

    2017-10-01

    The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.
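
    To illustrate the plurality the abstract describes, here are two defensible Type A evaluations of the same invented grouped data: one assumes independent, identically distributed observations; the other adds a between-group (random-effects) component. Both are legitimate choices, yet they yield different standard uncertainties.

```python
# Sketch of two defensible Type A evaluations of the same data
# (hypothetical mass-fraction readings grouped by day, balanced design).
import numpy as np

days = [np.array([5.21, 5.24, 5.19]),
        np.array([5.30, 5.28, 5.33]),
        np.array([5.22, 5.25, 5.21])]
x = np.concatenate(days)
n = x.size

# Evaluation 1: i.i.d. model, u = s / sqrt(n)  (GUM 4.2)
u1 = x.std(ddof=1) / np.sqrt(n)

# Evaluation 2: one-way random-effects model with a between-day component
k = len(days)
m = days[0].size
day_means = np.array([d.mean() for d in days])
msb = m * day_means.var(ddof=1)                  # between-day mean square
msw = np.mean([d.var(ddof=1) for d in days])     # within-day mean square
s_between2 = max(0.0, (msb - msw) / m)           # between-day variance comp.
u2 = np.sqrt(s_between2 / k + msw / n)           # var. of the grand mean

print(f"estimate = {x.mean():.3f}")
print(f"u (iid model)            = {u1:.4f}")
print(f"u (random-effects model) = {u2:.4f}  # typically larger")
```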

  13. Evaluation of uncertainty of adaptive radiation therapy

    Garcia Molla, R.; Gomez Martin, C.; Vidueira, L.; Juan-Senabre, X.; Garcia Gomez, R.

    2013-01-01

    This work is part of the tests to be performed for the acceptance of adaptive radiation therapy in clinical practice. The uncertainties of adaptive radiotherapy, into which the study is divided, fall into two large parts: dosimetry on the CBCT and RDI. The uncertainties of each stage are quantified, and from their total an action level may be obtained above which it would be reasonable to adapt the plan. (Author)

  14. Uncertainties in Transport Project Evaluation: Editorial

    Salling, Kim Bang; Nielsen, Otto Anker

    2015-01-01

    This special issue of the European Journal of Transport Infrastructure Research (EJTIR), containing five scientific papers, is the result of an open call for papers at the 1st International Conference on Uncertainties in Transport Project Evaluation that took place at the Technical University of Denmark in September 2013. The conference was held under the auspices of the project ‘Uncertainties in transport project evaluation’ (UNITE), a research project (2009-2014) financed by the Danish Strategic Research Agency. UNITE was coordinated by the Department of Transport...

  15. Uncertainty Evaluation for SMART Synthesized Power Distribution

    Cho, J. Y.; Song, J. S.; Lee, C. C.; Park, S. Y.; Kim, K. Y.; Lee, K. H.

    2010-07-01

    This report performs the uncertainty analysis for the SMART synthesized power distribution generated by the SSUN (SMART core SUpporting system coupled by Nuclear design code) code. SSUN runs coupled with the MASTER neutronics code and generates the core 3-D synthesis power distribution by using DPCM3D. The MASTER code provides the DPCM3D constants to the SSUN code for the current core states. The uncertainties evaluated in this report take the form of 95%/95% probability/confidence one-sided tolerance limits and can be used in conjunction with Technical Specification limits on these quantities to establish appropriate LCO (Limiting Conditions of Operation) and LSSS (Limiting Safety System Settings) limits. This report is applicable to SMART nuclear reactors using fixed rhodium detector systems. The unknown true power distribution must be given for the uncertainty evaluation of the synthesis power distribution. This report produces virtual distributions for the true power distribution by imposing the CASMO-3/MASTER uncertainty on the MASTER power distribution. Detector signals are generated from these virtual distributions, and the DPCM3D constants from the MASTER power distribution. The SSUN code synthesizes the core 3-D power distribution by using these detector signals and the DPCM3D constants. The following summarizes the uncertainty evaluation procedure for the synthesis power distribution. (1) Generation of 3-D power distribution by MASTER -> determination of the DPCM3D constants. (2) Generation of virtual power distribution (assumed to be the true power distribution) -> generation of detector signals. (3) Generation of synthesis power distribution. (4) Uncertainty evaluation for the synthesis power distribution. A Chi-Square normality test rejects the hypothesis of a normal distribution for the synthesis power error distribution. Therefore, the Kruskal-Wallis test and non-parametric statistics are used for data pooling and the tolerance limits.
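
    Where the abstract invokes non-parametric statistics for the 95%/95% limits, a first-order one-sided tolerance limit can be read off an order statistic (Wilks' method). The sketch below uses a synthetic error sample, not SMART data.

```python
# Sketch: non-parametric one-sided 95%/95% tolerance limit from order
# statistics (Wilks' method). The power-error sample below is synthetic.
import numpy as np

def wilks_sample_size(gamma=0.95, beta=0.95):
    """Smallest n with P(max of n samples >= gamma-quantile) >= beta."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n                          # 59 for 95%/95%, first order

rng = np.random.default_rng(7)
n = wilks_sample_size()               # 59
errors = rng.normal(0.0, 2.0, n)      # synthetic power-distribution errors (%)

limit = errors.max()                  # first-order one-sided tolerance limit
print(f"n = {n}, 95%/95% one-sided tolerance limit = {limit:.2f} %")
```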

  16. A Bayesian approach for evaluation of the effect of water quality model parameter uncertainty on TMDLs: A case study of Miyun Reservoir

    Liang, Shidong; Jia, Haifeng; Xu, Changqing; Xu, Te; Melching, Charles

    2016-01-01

    ...) determination. The sources of uncertainty are discussed and ways to reduce the uncertainties are proposed. - Highlights: • Effect of water quality model parameter uncertainty on TMDLs was evaluated. • A Bayesian approach was used for the model parameter uncertainty analysis. • DREAM algorithm, a multi-chain MCMC, was used as the Bayesian approach. • Miyun Reservoir, the most important drinking water source for Beijing, was studied. • Wide ranges of allowable loads were obtained through uncertainty propagation.
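
    A hedged sketch of the Bayesian parameter-uncertainty idea: the study uses the multi-chain DREAM algorithm, whereas this toy substitutes a plain single-chain Metropolis sampler, an invented first-order decay model, and synthetic data.

```python
# Toy Bayesian calibration of one water-quality parameter by MCMC.
# Model, data, prior, and tuning are all invented stand-ins.
import numpy as np

rng = np.random.default_rng(3)

def model(k, t):                      # toy first-order decay of a pollutant
    return 10.0 * np.exp(-k * t)

t_obs = np.linspace(1, 10, 8)
y_obs = model(0.3, t_obs) + rng.normal(0, 0.3, t_obs.size)  # synthetic data

def log_post(k):                      # flat prior on k > 0, Gaussian likelihood
    if k <= 0:
        return -np.inf
    return -0.5 * np.sum((y_obs - model(k, t_obs)) ** 2) / 0.3 ** 2

k, lp = 0.5, log_post(0.5)
chain = []
for _ in range(20000):
    k_new = k + rng.normal(0, 0.02)            # random-walk proposal
    lp_new = log_post(k_new)
    if np.log(rng.random()) < lp_new - lp:     # Metropolis accept/reject
        k, lp = k_new, lp_new
    chain.append(k)

post = np.array(chain[5000:])                  # discard burn-in
print(f"k: mean={post.mean():.3f}, 95% interval="
      f"({np.quantile(post, 0.025):.3f}, {np.quantile(post, 0.975):.3f})")
# The posterior spread would then be propagated to the allowable loads.
```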

  17. Uncertainty Evaluation of Best Estimate Calculation Results

    Glaeser, H.

    2006-01-01

    Efforts are underway in Germany to perform analyses using best estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) recently issued a recommendation to perform uncertainty analysis in loss of coolant accident (LOCA) safety analyses. A more general requirement is included in a draft revision of the German Nuclear Regulation, which is an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the recommendation of the German RSK, the following deterministic requirements still have to be applied when performing safety analyses for LOCA in licensing: most unfavourable single failure; unavailability due to preventive maintenance; break location; break size and break type; double ended break, 100 percent through 200 percent; large, medium and small break; loss of off-site power; core power (at accident initiation the most unfavourable conditions and values have to be assumed which may occur under normal operation, taking into account the set-points of integral power and power density control; measurement and calibration errors can be considered statistically); time of fuel cycle. Analysis using best estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant and fuel parameters and decay heat. This is especially the case when approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analysis have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  18. Approach to uncertainty evaluation for safety analysis

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety has generally been verified and confirmed through accident simulations using computer codes, because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, owing to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by a sufficient safety margin obtained through the conservative assumptions and models applied to those simulations. Recently, however, best-estimate analysis based on realistic assumptions and models, supported by accumulated insights, has become possible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of the analysis results. This paper introduces an approach to evaluate the uncertainty of accident simulations and their results. (Note: This research was done not in the Japan Nuclear Energy Safety Organization but in the Tokyo Institute of Technology.) (author)

  19. A study on evaluation strategies in dimensional X-ray computed tomography by estimation of measurement uncertainties

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2012-01-01

    Computed tomography entered the industrial world in the 1980s as a technique for nondestructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates... measurement results using different measuring strategies applied in different inspection software packages for volume and surface data analysis. The strategy influence is determined by calculating the measurement uncertainty. This investigation includes measurements of two industrial items, an aluminium pipe... connector and a plastic toggle, a hearing aid component. These are measured using a commercial CT scanner. Traceability is transferred using tactile and optical coordinate measuring machines, which are used to produce reference measurements. Results show that measurements of diameter for both parts resulted...

  20. Supporting Qualified Database for Uncertainty Evaluation

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; Lisovyy, O.; D'Auria, F.

    2013-01-01

    Uncertainty evaluation constitutes a key feature of BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging from one side suitable experimental data and on the other side qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentations, the data acquisition system, the evaluation of pressure losses, the physical properties of the material and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady state conditions and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulics Aspects (RTA); 4. The EH (Engineering Handbook) of the input nodalization

  2. Approximate Bayesian evaluations of measurement uncertainty

    Possolo, Antonio; Bodnar, Olha

    2018-04-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) includes formulas that produce an estimate of a scalar output quantity that is a function of several input quantities, and an approximate evaluation of the associated standard uncertainty. This contribution presents approximate, Bayesian counterparts of those formulas for the case where the output quantity is a parameter of the joint probability distribution of the input quantities, also taking into account any information about the value of the output quantity available prior to measurement expressed in the form of a probability distribution on the set of possible values for the measurand. The approximate Bayesian estimates and uncertainty evaluations that we present have a long history and illustrious pedigree, and provide sufficiently accurate approximations in many applications, yet are very easy to implement in practice. Differently from exact Bayesian estimates, which involve either (analytical or numerical) integrations, or Markov Chain Monte Carlo sampling, the approximations that we describe involve only numerical optimization and simple algebra. Therefore, they make Bayesian methods widely accessible to metrologists. We illustrate the application of the proposed techniques in several instances of measurement: isotopic ratio of silver in a commercial silver nitrate; odds of cryptosporidiosis in AIDS patients; height of a manometer column; mass fraction of chromium in a reference material; and potential-difference in a Zener voltage standard.
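
    A minimal sketch of the Laplace-style recipe the abstract describes (numerical optimization plus simple algebra): find the posterior mode, then take the standard uncertainty from the curvature there. The data, prior, and measurement standard deviation below are invented.

```python
# Sketch of an approximate (Laplace) Bayesian evaluation: posterior mode
# by numerical optimization, uncertainty from the curvature at the mode.
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([10.12, 10.07, 10.15, 10.09, 10.11])   # indications, hypothetical
s = 0.04                                            # known measurement sd
mu0, tau = 10.00, 0.10                              # Gaussian prior on measurand

def neg_log_post(mu):
    return (0.5 * np.sum((x - mu) ** 2) / s**2      # likelihood
            + 0.5 * (mu - mu0) ** 2 / tau**2)       # prior

mu_hat = minimize_scalar(neg_log_post).x            # posterior mode

h = 1e-4                                            # numerical 2nd derivative
curv = (neg_log_post(mu_hat + h) - 2 * neg_log_post(mu_hat)
        + neg_log_post(mu_hat - h)) / h**2
u = 1.0 / np.sqrt(curv)                             # Laplace std. uncertainty

print(f"estimate = {mu_hat:.4f}, u = {u:.4f}")
# Conjugate check: the posterior here is exactly Gaussian, so the Laplace
# result coincides with the exact Bayesian answer.
```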

  3. Uncertainty sources in radiopharmaceuticals clinical studies

    Degenhardt, Aemilie Louize; Oliveira, Silvia Maria Velasques de

    2014-01-01

    Radiopharmaceuticals should be approved for use based on evaluation of their quality, safety and efficacy. Clinical studies are designed to verify the pharmacodynamic, pharmacological and clinical effects in humans and are required for assuring safety and efficacy. Bayesian analysis has been used for evaluating the effectiveness of clinical studies. This work aims to identify the uncertainties associated with the production of the radionuclide and the labelling of the radiopharmaceutical, as well as with radiopharmaceutical administration and the acquisition and processing of scintigraphy images. For the development of clinical studies in the country, the metrological chain shall assure the traceability of the surveys performed in all phases. (author)

  4. Evaluation of uncertainty and detection limits in radioactivity measurements

    Herranz, M. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain); Idoeta, R. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)], E-mail: raquel.idoeta@ehu.es; Legarda, F. [Universidad del Pais Vasco/Euskal Herriko Unibertsitatea, Escuela Tecnica Superior de Ingenieria de Bilbao, Alda. Urquijo, s/n, 48013 Bilbao (Spain)

    2008-10-01

    The uncertainty associated with the assessment of the radioactive content of any sample depends on the net counting rate registered during the measuring process and on the different weighting factors needed to transform this counting rate into activity, activity per unit mass or activity concentration. This work analyses the standard uncertainties in these weighting factors as well as their contribution to the uncertainty in the activity reported for three typical determinations for environmental radioactivity measurements in the laboratory. It also studies the corresponding characteristic limits and their dependence on the standard uncertainty related to those weighting factors, offering an analysis of the effectiveness of the simplified characteristic limits as evaluated by various measuring software and laboratories.
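
    A simplified sketch of counting-statistics characteristic limits in the spirit of ISO 11929 (an assumption here; the paper's exact formulation may differ, and all counts, times, and calibration values below are invented).

```python
# Sketch: activity, decision threshold and detection limit for a simple
# gross/background counting measurement. Numbers are invented.
import math

k = 1.645                        # k_alpha = k_beta for alpha = beta = 0.05
t_g, t_b = 3600.0, 3600.0        # gross and background counting times (s)
n_g, n_b = 420.0, 380.0          # gross and background counts
w, u_w_rel = 2.0, 0.05           # calibration factor (Bq per count/s), rel. unc.

r_b = n_b / t_b
a = w * (n_g / t_g - r_b)                               # activity estimate (Bq)
u_a = math.sqrt(w**2 * (n_g / t_g**2 + n_b / t_b**2) + (a * u_w_rel) ** 2)

# decision threshold: k times the uncertainty when the true activity is zero
a_star = k * w * math.sqrt(r_b * (1.0 / t_g + 1.0 / t_b))

# detection limit from the implicit equation a# = a* + k*u(a#), by iteration
a_hash = a_star
for _ in range(50):
    u_hash = math.sqrt(w**2 * ((a_hash / w + r_b) / t_g + n_b / t_b**2)
                       + (a_hash * u_w_rel) ** 2)
    a_hash = a_star + k * u_hash

print(f"a = {a:.4f} Bq, u(a) = {u_a:.4f} Bq")
print(f"decision threshold a* = {a_star:.4f} Bq")
print(f"detection limit a#    = {a_hash:.4f} Bq")
```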

  6. ESFR core optimization and uncertainty studies

    Rineiski, A.; Vezzoni, B.; Zhang, D.; Marchetti, M.; Gabrielli, F.; Maschek, W.; Chen, X.-N.; Buiron, L.; Krepel, J.; Sun, K.; Mikityuk, K.; Polidoro, F.; Rochman, D.; Koning, A.J.; DaCruz, D.F.; Tsige-Tamirat, H.; Sunderland, R.

    2015-01-01

    In the European Sodium Fast Reactor (ESFR) project supported by EURATOM in 2008-2012, a concept for a large 3600 MWth sodium-cooled fast reactor design was investigated. In particular, reference core designs with oxide and carbide fuel were optimized to improve their safety parameters. Uncertainties in these parameters were evaluated for the oxide option. Core modifications were performed first to reduce the sodium void reactivity effect. Introduction of a large sodium plenum with an absorber layer above the core and a lower axial fertile blanket improves the total sodium void effect appreciably, bringing it close to zero for a core with fresh fuel, in line with results obtained worldwide, while not substantially influencing other core physics parameters. Therefore an optimized configuration, CONF2, with a sodium plenum and a lower blanket was established first and used as a basis for further studies in view of the deterioration of safety parameters during reactor operation. Further options studied were an inner fertile blanket, introduction of moderator pins, a smaller core height, and special designs for pins, such as 'empty' pins, and subassemblies. These special designs were proposed to facilitate melted fuel relocation in order to avoid core re-criticality under severe accident conditions. In the paper, further CONF2 modifications are compared in terms of safety and fuel balance. They may bring further improvements in safety, but their accurate assessment requires additional studies, including transient analyses. Uncertainty studies were performed by employing a so-called Total Monte Carlo method, in which a large number of nuclear data files is produced for single isotopes and then used in Monte Carlo calculations. The uncertainties in the criticality, sodium void and Doppler effects, and the effective delayed neutron fraction due to uncertainties in basic nuclear data were assessed for an ESFR core. They prove the applicability of the available nuclear data for ESFR.

  7. Evaluation of uncertainties in benefit-cost studies of electrical power plants. II. Development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant. Final report

    Sullivan, W.G.

    1977-07-01

    Steam-electric generation plants are evaluated on a benefit-cost basis. Non-economic factors in the development and application of a procedure for quantifying environmental uncertainties of a nuclear power plant are discussed. By comparing the monetary costs of a particular power plant assessed in Part 1 with the non-monetary values arrived at in Part 2, and using an evaluation procedure developed in this study, a proposed power plant can be selected as a preferred alternative. This procedure enables policymakers to identify the incremental advantages and disadvantages of different power plants in view of their geographic locations. The report presents the evaluation procedure on a task-by-task basis and shows how it can be applied to a particular power plant. Because of the lack of objective data, it draws heavily on subjectively derived inputs from individuals who are knowledgeable about the plant being investigated. An abbreviated study at another power plant demonstrated the transferability of the general evaluation procedure. Included in the appendices are techniques for developing scoring functions and a user's manual for the Fortran IV program.

  8. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses the relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty, and the discussion focuses on the practical applicability of the results.

  9. Report on the uncertainty methods study

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind, and the participants did not use experimental measurements from the test as input, apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time.

  10. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results. However, it requires a long time because of the repetitive computation of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express by probabilistic distributions. In order to reduce the computation time and to quantify the uncertainties of top events when there exist basic events whose uncertainties are difficult to express by probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident after a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code was implemented and tested on the fault tree of a radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by probabilistic uncertainty propagation.
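
    A hedged sketch of fuzzy (alpha-cut) uncertainty propagation through a miniature fault tree; the gate structure, triangular membership functions, and probabilities are invented and far simpler than the LLOCA tree in the paper.

```python
# Sketch: alpha-cut interval propagation of triangular fuzzy basic-event
# probabilities through a tiny fault tree, TOP = A AND (B OR C).
import numpy as np

# triangular fuzzy probabilities: (low, mode, high), all invented
A = (1e-3, 2e-3, 4e-3)
B = (5e-4, 1e-3, 2e-3)
C = (1e-4, 3e-4, 6e-4)

def alpha_cut(tri, alpha):
    lo, mode, hi = tri
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def or_gate(p, q):                  # 1-(1-p)(1-q), monotone: endpoint-wise
    return tuple(1 - (1 - a) * (1 - b) for a, b in zip(p, q))

def and_gate(p, q):                 # monotone, so endpoints multiply
    return (p[0] * q[0], p[1] * q[1])

for alpha in (0.0, 0.5, 1.0):
    a, b, c = (alpha_cut(e, alpha) for e in (A, B, C))
    top = and_gate(a, or_gate(b, c))
    print(f"alpha={alpha:3.1f}: TOP in [{top[0]:.2e}, {top[1]:.2e}]")
# alpha = 1 recovers the point estimate; smaller alpha gives wider intervals.
```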

  11. Some sources of the underestimation of evaluated cross section uncertainties

    Badikov, S.A.; Gai, E.V.

    2003-01-01

    The problem of the underestimation of evaluated cross-section uncertainties is addressed. Two basic sources of the underestimation of evaluated cross-section uncertainties - a) inconsistency between declared and observable experimental uncertainties and b) inadequacy between applied statistical models and processed experimental data - are considered. Both sources of underestimation are mainly a consequence of the existence of uncertainties unrecognized by experimenters. A model of a 'constant shift' is proposed for taking unrecognized experimental uncertainties into account. The model is applied to the statistical analysis of 238U(n,f)/235U(n,f) reaction cross-section ratio measurements. It is demonstrated that multiplication by sqrt(χ²) as an instrument for correcting underestimated evaluated cross-section uncertainties fails in the case of correlated measurements. It is shown that an arbitrary assignment of uncertainties and correlations in a simple least-squares fit of two correlated measurements of an unknown mean leads to physically incorrect evaluated results. (author)
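
    For concreteness, the sqrt(χ²) inflation criticized in the abstract looks like the following for a weighted mean of independent measurements (data invented); the paper's point is that this recipe breaks down once the measurements are correlated.

```python
# Sketch of the sqrt(chi2) (Birge-ratio) inflation for a weighted mean.
# Data are invented; valid only for independent measurements.
import numpy as np

x = np.array([1.02, 0.97, 1.10, 0.90])   # measured ratios
u = np.array([0.02, 0.02, 0.03, 0.03])   # declared standard uncertainties

w = 1.0 / u**2
mean = np.sum(w * x) / np.sum(w)
u_int = 1.0 / np.sqrt(np.sum(w))                     # internal uncertainty
chi2 = np.sum(w * (x - mean) ** 2) / (x.size - 1)    # reduced chi-square

birge = np.sqrt(chi2)
u_ext = u_int * max(1.0, birge)                      # inflate if chi2 > 1

print(f"mean = {mean:.4f}, u_internal = {u_int:.4f}")
print(f"Birge ratio = {birge:.2f}, inflated u = {u_ext:.4f}")
```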

  12. An evaluation of uncertainties in radioecological models

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for freshwater fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of 131I2. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP)
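
    A small Monte Carlo sketch of a multiplicative chain of lognormal parameters (medians and geometric standard deviations invented, not the paper's values) reproduces the qualitative result that the 99th percentile sits far above the median.

```python
# Sketch: uncertainty of a multiplicative chain model with lognormal
# parameters; all medians and geometric standard deviations are invented.
import numpy as np

rng = np.random.default_rng(4)
N = 200_000

def lognorm(median, gsd, size):
    # numpy parameterizes by the mean and sd of log(X)
    return rng.lognormal(np.log(median), np.log(gsd), size)

# normalized toy chain: air concentration -> pasture -> cow milk -> dose
dose = (lognorm(1.0, 1.5, N)       # interception/air factor
        * lognorm(1.0, 1.6, N)     # feed intake factor
        * lognorm(1.0, 1.6, N)     # milk transfer coefficient
        * lognorm(1.0, 1.7, N))    # dose conversion factor

med, p99 = np.quantile(dose, [0.5, 0.99])
print(f"median = {med:.2f}, 99th percentile = {p99:.2f}, ratio = {p99/med:.1f}")
# A product of independent lognormals is lognormal; with these spreads the
# 99th percentile lands close to a factor of ten above the median.
```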

  13. Evaluation of uncertainty in dosimetry of irradiator system

    Santos, Gelson P.; Potiens, Maria P.A.; Vivolo, Vitor

    2005-01-01

    This paper describes the study of uncertainties in the dosimetry of the STS 0B85 irradiator system of the LCI IPEN/CNEN-SP. This study is relevant for determining the best measurement capability when the laboratory performs routine calibrations of radiation measuring instruments used in radiation protection. It is also a requirement for obtaining the accreditation of the laboratory by INMETRO. For this dosimetry, a reference system of the laboratory was used, composed of an electrometer and a 1-litre spherical ionization chamber. Measurements were made at five distances, selected so as to include the whole range of the optical bench tests, and using three attenuating filters so as to extend the measurement capability. The quantity used for the evaluation was the air kerma rate for 137Cs and 60Co beams. Four series of measurements were carried out. The inverse square law was verified for these series together with their sets of uncertainties. The unfiltered, one-filter and two-filter series showed good agreement with the inverse square law, and the maximum uncertainty obtained was approximately 1.7%. In the series with all the filters there was a larger deviation from the inverse square law and a wide increase in the uncertainty of measurements at the end of the optical bench.
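
    The inverse-square-law check described here can be sketched as a log-log fit of kerma rate against distance; the distances and rates below are invented, not the LCI data.

```python
# Sketch of an inverse-square-law check on air-kerma rates measured along
# an optical bench; numbers are hypothetical.
import numpy as np

d = np.array([1.0, 1.5, 2.0, 2.5, 3.0])            # distance (m)
k_rate = np.array([40.1, 17.9, 10.0, 6.38, 4.47])  # air kerma rate, invented

# fit log K = log C - n log d; the inverse square law predicts n = 2
slope, intercept = np.polyfit(np.log(d), np.log(k_rate), 1)
print(f"fitted exponent n = {-slope:.3f} (inverse square law: 2)")

# equivalent check: K * d^2 should be constant
print("K * d^2 =", np.round(k_rate * d**2, 2))
```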

  14. Uncertainty analysis for Ulysses safety evaluation report

    Frank, M.V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermal Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  15. Approaches to Evaluating Probability of Collision Uncertainty

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
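
    A hedged sketch of the resampling idea: an inner Monte Carlo evaluates Pc in the two-dimensional encounter plane, and an outer loop varies uncertain inputs (here a covariance scale factor and the hard-body radius). The geometry and distributions are invented, not an operational formulation.

```python
# Sketch: probability density of Pc via resampling of uncertain inputs
# around an inner Monte Carlo Pc estimate. All numbers are invented.
import numpy as np

rng = np.random.default_rng(5)

def pc_mc(miss, cov, hbr, n=20_000):
    """Monte Carlo Pc: fraction of relative positions within the HBR."""
    pts = rng.multivariate_normal(miss, cov, size=n)
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

miss = np.array([250.0, 100.0])              # mean miss vector (m)
cov = np.array([[200.0**2, 0.0],
                [0.0, 120.0**2]])            # combined covariance (m^2)

pcs = []
for _ in range(200):                         # outer loop: input uncertainty
    scale = rng.lognormal(0.0, 0.3)          # covariance realism factor
    hbr = rng.uniform(15.0, 25.0)            # hard-body radius (m)
    pcs.append(pc_mc(miss, scale * cov, hbr))

pcs = np.array(pcs)
print(f"Pc point estimate ~ {pc_mc(miss, cov, 20.0, n=200_000):.2e}")
print(f"Pc spread: 5th={np.quantile(pcs, 0.05):.2e}, "
      f"95th={np.quantile(pcs, 0.95):.2e}")
```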

  16. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.
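
    A minimal GUM-Supplement-1-style Monte Carlo propagation for a non-linear vapour-pressure model (the Antoine equation; the coefficients and uncertainties below are illustrative only, not reference data).

```python
# Sketch of GUM-S1 Monte Carlo propagation through a non-linear model:
# log10(p) = A - B / (C + T). Values are invented for illustration.
import numpy as np

rng = np.random.default_rng(6)
N = 500_000

# Antoine coefficients (p in kPa, T in deg C) with standard uncertainties
A, uA = 7.20, 0.02
B, uB = 1730.0, 5.0
C, uC = 233.0, 1.0
T, uT = 25.0, 0.05      # measured temperature

a = rng.normal(A, uA, N)
b = rng.normal(B, uB, N)
c = rng.normal(C, uC, N)
t = rng.normal(T, uT, N)

p = 10.0 ** (a - b / (c + t))       # propagate the joint sample
p_hat, u_p = p.mean(), p.std(ddof=1)
lo, hi = np.quantile(p, [0.025, 0.975])

print(f"p = {p_hat:.3f} kPa, u(p) = {u_p:.3f} kPa")
print(f"95 % coverage interval: [{lo:.3f}, {hi:.3f}] kPa")
```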

  17. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  18. Classification and moral evaluation of uncertainties in engineering modeling.

    Murphy, Colleen; Gardoni, Paolo; Harris, Charles E

    2011-09-01

    Engineers must deal with risks and uncertainties as a part of their professional work and, in particular, uncertainties are inherent to engineering models. Models play a central role in engineering. Models often represent an abstract and idealized version of the mathematical properties of a target. Using models, engineers can investigate and acquire understanding of how an object or phenomenon will perform under specified conditions. This paper defines the different stages of the modeling process in engineering, classifies the various sources of uncertainty that arise in each stage, and discusses the categories into which these uncertainties fall. The paper then considers the way uncertainty and modeling are approached in science and the criteria for evaluating scientific hypotheses, in order to highlight the very different criteria appropriate for the development of models and the treatment of the inherent uncertainties in engineering. Finally, the paper puts forward nine guidelines for the treatment of uncertainty in engineering modeling.

  19. Evaluation of cutting force uncertainty components in turning

    Axinte, Dragos Aurelian; Belluco, Walter; De Chiffre, Leonardo

    2000-01-01

    A procedure is proposed for the evaluation of those uncertainty components of a single cutting force measurement in turning that are related to the contributions of the dynamometer calibration and the cutting process itself. Based on an empirical model including errors from both sources..., the uncertainty for a single measurement of cutting force is presented, and expressions for the expected uncertainty vs. cutting parameters are proposed. This approach gives the possibility of evaluating cutting force uncertainty components in turning, for a defined range of cutting parameters, based on few...
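
    A minimal sketch of combining the two named contributions in quadrature; the component values are invented placeholders, not the paper's empirical model.

```python
# Sketch: combined standard uncertainty of a single cutting-force
# measurement from calibration and process contributions. Values invented.
import math

F = 850.0            # measured main cutting force (N)

u_cal = 6.0          # calibration contribution (N), e.g. from a certificate
u_proc_rel = 0.02    # process repeatability, relative (tool wear, chip flow)
u_proc = u_proc_rel * F

u_c = math.sqrt(u_cal**2 + u_proc**2)   # combined standard uncertainty
U = 2.0 * u_c                           # expanded uncertainty, k = 2

print(f"F = {F:.0f} N, u_c = {u_c:.1f} N, U(k=2) = {U:.1f} N")
```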

  20. Evaluation of nuclear data and their uncertainties

    Story, J.S.

    1984-01-01

    Some topics studied within the Winfrith Nuclear Data Group in recent years, and still of current importance, are briefly reviewed. Moderator cross-sections: criteria to be met for reactor applications are listed; thermal neutron scattering theory is summarized, with the approximations used to facilitate computation; neutron age data test stringently the accuracy of epithermal cross-sections; a modification of the CFS effective range treatment for S-wave scatter by H is presented, and new calculations with up-to-date slow neutron scattering data are advocated. Use of multilevel resonance formalisms: the top bound resonance should be included explicitly in calculations; additive statistical terms are given to allow for 'distant' negative and positive resonances, in both MLBW and R-M formalisms; formulae are presented for estimating R-M level shifts for l>0 resonances. Resonance mean spacings: the Dyson-Mehta optimum estimator is utilised in a method which updates the staircase plot. Resonances of 56Fe have been resolved to approx. 800 keV, over which range the level density for given Jπ should increase 2-fold; this variation is allowed for in the mean spacing calculations. Fission-product decay power: the present status of integral data and summation calculations for 235U and 239Pu fissions is summarized, with a variety of intercomparisons including 239Pu/235U ratios. Data uncertainties are considered, but the sequence of data on Γγ for the 27.8 keV resonance of 56Fe provided a cautionary example. (author)

  1. CREOLE experiment study on the reactivity temperature coefficient with sensitivity and uncertainty analysis using the MCNP5 code and different neutron cross section evaluations

    Boulaich, Y.; El Bardouni, T.; Erradi, L.; Chakir, E.; Boukhal, H.; Nacir, B.; El Younoussi, C.; El Bakkari, B.; Merroun, O.; Zoubair, M.

    2011-01-01

    Highlights: → In the present work, we have analyzed the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous-energy code MCNP5 and the most recently updated nuclear data evaluations. → Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values. → In order to identify the source of the relatively large discrepancy in the case of the ENDF-BVII nuclear data evaluation, the keff discrepancy between ENDF-BVII and JENDL3.3 was decomposed by using a sensitivity and uncertainty analysis technique. - Abstract: In the present work, we analyze the CREOLE experiment on the reactivity temperature coefficient (RTC) by using the three-dimensional continuous-energy code MCNP5 and the most recently updated nuclear data evaluations. This experiment, performed in the EOLE critical facility located at CEA/Cadarache, was mainly dedicated to RTC studies for both UO2 and UO2-PuO2 PWR-type lattices covering the whole temperature range from 20 °C to 300 °C. We have developed an accurate 3D model of the EOLE reactor using the MCNP5 Monte Carlo code, which guarantees a high level of fidelity in the description of the different configurations at various temperatures, taking into account their consequences on neutron cross-section data and all thermal expansion effects. In this case, the remaining error between calculation and experiment is attributed mainly to uncertainties in nuclear data. Our own cross-section library was constructed by using the NJOY99.259 code with point-wise nuclear data based on the ENDF-BVII, JEFF3.1 and JENDL3.3 evaluation files. The MCNP model was validated through the axial and radial fission rate measurements at room and hot temperatures. Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values; the discrepancy is

  2. Evaluation of Uncertainty in Precipitation Datasets for New Mexico, USA

    Besha, A. A.; Steele, C. M.; Fernald, A.

    2014-12-01

    Climate change, population growth and other factors are endangering water availability and sustainability in semiarid/arid areas, particularly in the southwestern United States. Wide coverage of spatial and temporal measurements of precipitation is key for regional water budget analysis and hydrological operations, which are themselves valuable tools for water resource planning and management. Rain gauge measurements are usually reliable and accurate at a point. They measure rainfall continuously, but spatial sampling is limited. Ground-based radar and satellite remotely sensed precipitation have wide spatial and temporal coverage. However, these measurements are indirect and subject to errors because of equipment, meteorological variability, the heterogeneity of the land surface itself and lack of regular recording. This study seeks to understand precipitation uncertainty and, in doing so, lessen uncertainty propagation into hydrological applications and operations. We reviewed, compared and evaluated the TRMM (Tropical Rainfall Measuring Mission) precipitation products, NOAA's (National Oceanic and Atmospheric Administration) Global Precipitation Climatology Centre (GPCC) monthly precipitation dataset, PRISM (Parameter-elevation Regression on Independent Slopes Model) data and data from individual climate stations including Cooperative Observer Program (COOP), Remote Automated Weather Stations (RAWS), Soil Climate Analysis Network (SCAN) and Snowpack Telemetry (SNOTEL) stations. Though not yet finalized, this study finds that the uncertainty within precipitation estimate datasets is influenced by regional topography, season, climate and precipitation rate. Ongoing work aims to further evaluate precipitation datasets based on the relative influence of these phenomena so that we can identify the optimum datasets for input to statewide water budget analysis.

  3. Probabilistic evaluation of uncertainties and risks in aerospace components

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade was calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed, and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations of the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
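
    As a hedged illustration of the kind of probabilistic simulation described above, the following Python sketch estimates a failure probability by Monte Carlo sampling of a simple stress-versus-strength limit state (the distributions and values are hypothetical, not the NASA Lewis models):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000

    # Hypothetical limit state for a blade-like component:
    # failure occurs when effective stress exceeds material strength.
    stress = rng.lognormal(mean=np.log(400.0), sigma=0.10, size=n)   # MPa
    strength = rng.normal(loc=600.0, scale=40.0, size=n)             # MPa

    p_f = np.mean(stress > strength)
    # Standard error of the Monte Carlo estimate of a probability
    se = np.sqrt(p_f * (1.0 - p_f) / n)
    print(f"P(failure) ~= {p_f:.2e} +/- {se:.1e}")
    ```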

  4. Uncertainty evaluation methods for waste package performance assessment

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide "substantially complete containment" (SCC) during the containment period. The term "SCC" is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the "containment" requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  5. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations is quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR) experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology for a small break (SB) LOCA in a PWR of B and W design, using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  6. Survey and Evaluate Uncertainty Quantification Methodologies

    Lin, Guang; Engel, David W.; Eslinger, Paul W.

    2012-02-01

    The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and academic institutions that will develop and deploy state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technologies from discovery to development, demonstration, and ultimately the widespread deployment to hundreds of power plants. The CCSI Toolset will provide end users in industry with a comprehensive, integrated suite of scientifically validated models with uncertainty quantification, optimization, risk analysis and decision making capabilities. The CCSI Toolset will incorporate commercial and open-source software currently in use by industry and will also develop new software tools as necessary to fill technology gaps identified during execution of the project. The CCSI Toolset will (1) enable promising concepts to be more quickly identified through rapid computational screening of devices and processes; (2) reduce the time to design and troubleshoot new devices and processes; (3) quantify the technical risk in taking technology from laboratory-scale to commercial-scale; and (4) stabilize deployment costs more quickly by replacing some of the physical operational tests with virtual power plant simulations. The goal of CCSI is to deliver a toolset that can simulate the scale-up of a broad set of new carbon capture technologies from laboratory scale to full commercial scale. To provide a framework around which the toolset can be developed and demonstrated, we will focus on three Industrial Challenge Problems (ICPs) related to carbon capture technologies relevant to U.S. pulverized coal (PC) power plants. Post combustion capture by solid sorbents is the technology focus of the initial ICP (referred to as ICP A). The goal of the uncertainty quantification (UQ) task (Task 6) is to provide a set of capabilities to the user community for the quantification of uncertainties associated with the carbon

  7. Uncertainty in BMP evaluation and optimization for watershed management

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    The use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, the use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management, and we have applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent the various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  8. Evaluation of Uncertainties in the Determination of Phosphorus by RNAA

    Rick L. Paul

    2000-01-01

    A radiochemical neutron activation analysis (RNAA) procedure for the determination of phosphorus in metals and other materials has been developed and critically evaluated. Uncertainties evaluated as type A include those arising from measurement replication, yield determination, neutron self-shielding, irradiation geometry, measurement of the quantity for concentration normalization (sample mass, area, etc.), and analysis of standards. Uncertainties evaluated as type B include those arising from beta contamination corrections, beta decay curve fitting, and beta self-absorption corrections. The evaluation of uncertainties in the determination of phosphorus is illustrated for three different materials in Table I. The metal standard reference materials (SRMs) 2175 and 861 were analyzed for value assignment of phosphorus; implanted silicon was analyzed to evaluate the technique for certification of phosphorus. The most significant difference in the error evaluation of the three materials lies in the type B uncertainties. The relatively uncomplicated matrix of the high-purity silicon allows virtually complete purification of phosphorus from other beta emitters; hence, minimal contamination correction is needed. Furthermore, because the chemistry is less rigorous, the carrier yield is more reproducible, and self-absorption corrections are less significant. Improvements in the chemical purification procedures for phosphorus in complex matrices will decrease the type B uncertainties for all samples. Uncertainties in the determination of carrier yield, the most significant type A error in the analysis of the silicon, also need to be evaluated more rigorously and minimized in the future
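
    The combination of such components follows the usual GUM root-sum-square rule. A minimal Python sketch, assuming a purely hypothetical budget of relative standard uncertainties (the labels echo the abstract; the numbers are invented):

    ```python
    import numpy as np

    # Hypothetical uncertainty budget (relative standard uncertainties, %)
    type_a = {"replication": 0.8, "carrier yield": 1.5, "standards": 0.6}
    type_b = {"beta contamination": 1.0, "decay-curve fit": 0.7,
              "self-absorption": 0.9}

    # GUM: the combined standard uncertainty is the root sum of squares of
    # the components; the expanded uncertainty uses a coverage factor k
    # (k = 2 for approximately 95 % coverage).
    u_c = np.sqrt(sum(u ** 2 for u in {**type_a, **type_b}.values()))
    U = 2.0 * u_c
    print(f"u_c = {u_c:.2f} %, U (k=2) = {U:.2f} %")
    ```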

  9. Geostatistical evaluation of travel time uncertainties

    Devary, J.L.

    1983-08-01

    Data on potentiometric head and hydraulic conductivity, gathered from the Wolfcamp Formation of the Permian System, have exhibited tremendous spatial variability as a result of heterogeneities in the media and the presence of petroleum and natural gas deposits. Geostatistical data analysis and error propagation techniques (kriging and conditional simulation) were applied to determine the effect of potentiometric head uncertainties on radionuclide travel paths and travel times through the Wolfcamp Formation. Block-average kriging was utilized to remove measurement error from the potentiometric head data. The travel time calculations have been enhanced by the use of an inverse technique to determine the relative hydraulic conductivity along travel paths. In this way, the spatial variability of the hydraulic conductivity corresponding to streamline convergence and divergence may be included in the analysis. 22 references, 11 figures, 1 table

  10. Uncertainty Evaluation of Residential Central Air-conditioning Test System

    Li, Haoxue

    2018-04-01

    According to national standards, property tests of air-conditioning units are required. However, test results can be influenced by the precision of the apparatus or by measurement errors. Therefore, an uncertainty evaluation of the property tests should be conducted. In this paper, the uncertainties are calculated for the property tests of a Xinfei 13.6 kW residential central air-conditioning unit. The evaluation result shows that the property tests are credible.

  11. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    A. Chenarani

    2017-10-01

    Full Text Available The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
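
    A minimal sketch of an entropy-based uncertainty index in Python, assuming a hypothetical activity with discrete duration outcomes (the distributions are invented; the paper's exact formulation may differ):

    ```python
    import numpy as np

    def entropy(p):
        """Shannon entropy (bits) of a discrete outcome distribution."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    # Hypothetical activity: probabilities of finishing in 3/4/5/6 months.
    baseline = [0.10, 0.60, 0.25, 0.05]
    # After a risk event, outcomes become more spread out (less information),
    # so the entropy-based uncertainty index increases.
    with_risk = [0.25, 0.30, 0.25, 0.20]

    print(f"baseline uncertainty index : {entropy(baseline):.3f} bits")
    print(f"after risk event           : {entropy(with_risk):.3f} bits")
    ```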

  12. Evaluation of advanced coal gasification combined-cycle systems under uncertainty

    Frey, H.C.; Rubin, E.S.

    1992-01-01

    Advanced integrated gasification combined cycle (IGCC) systems have not been commercially demonstrated, and uncertainties remain regarding their commercial-scale performance and cost. Therefore, a probabilistic evaluation method has been developed and applied to explicitly consider these uncertainties. The insights afforded by this method are illustrated for an IGCC design featuring a fixed-bed gasifier and a hot gas cleanup system. Detailed case studies are conducted to characterize uncertainties in key measures of process performance and cost, evaluate design trade-offs under uncertainty, identify research priorities, evaluate the potential benefits of additional research, compare results for different uncertainty assumptions, and compare the advanced IGCC system to a conventional system under uncertainty. The implications of probabilistic results for research planning and technology selection are discussed in this paper

  13. Study of reactivity feedbacks in a sodium-cooled fast reactor: new methodology based on perturbation theory for evaluating neutronic uncertainties

    Bouret, Cyrille

    2014-01-01

    Fast reactors (FRs) can give value to the plutonium produced by existing light water reactors and allow the transmutation of a significant part of the final nuclear waste. These features offer industrial prospects for this technology, and new projects are currently being studied around the world, such as the ASTRID prototype in France. Future FRs will also have to satisfy new requirements in terms of competitiveness, safety and reliability. In this context, the new core concept envisaged for ASTRID incorporates innovative features that improve the safety of the reactor in case of accident. The proposed design achieves a sodium voiding effect close to zero: it includes a fertile plate in the middle of the core and a sodium plenum in the upper part in order to increase neutron leakage in case of sodium voiding. This heterogeneous design represents a challenge for the calculation tools and methods used so far to evaluate neutronic parameters in traditional homogeneous cores. These methods have been improved over the course of the thesis to rigorously treat neutron streaming, especially at the interfaces between media. These enhancements consisted in the development of a specific analysis methodology based on perturbation theory and using a modern three-dimensional Sn transport solver. This work has allowed, on the one hand, a reduction of the bias on static neutronic parameters in comparison with Monte Carlo methods and, on the other hand, more accurate spatial distributions of neutronic effects, including the reactivity feedback coefficients used for transient analysis. The analysis of the core behavior during transients has also allowed estimating the impact of these improved reactivity feedback coefficient assessments. In conjunction with this work, innovative methods based on the evaluation of local sensitivity coefficients have been proposed to assess the uncertainties associated with local reactivity effects. These uncertainties include the correlations between the different

  14. Evaluation of uncertainties in the calibration of radiation survey meter

    Potiens, M.P.A.; Santos, G.P.

    2006-01-01

    In order to meet the requirements of ISO 17025, the quantification of the expanded uncertainties of experimental data in the calibration of survey meters must be carried out using well defined concepts, like those expressed in the 'ISO-Guide to the Expression of Uncertainty in Measurement'. The calibration procedure for gamma ray survey meters involves two values whose uncertainties must be clearly known: the measurements of the instrument under calibration and the conventional true values of a quantity. Considering the continuous improvement of calibration methods and set-ups, it is necessary to evaluate periodically the uncertainties involved in the procedures. In this work it is shown how the measurement uncertainties of an individual calibration can be estimated and how the result can be generalized to be valid for other radiation survey meters. (authors)

  15. Uncertainty Evaluation with Multi-Dimensional Model of LBLOCA in OPR1000 Plant

    Kim, Jieun; Oh, Deog Yeon; Seul, Kwang-Won; Lee, Jin Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    KINS has used KINS-REM (KINS Realistic Evaluation Methodology), which was developed for Best-Estimate (BE) calculation and uncertainty quantification for regulatory audit. This methodology has been improved continuously by numerous studies, for example of uncertainty parameters and uncertainty ranges. In this study, to evaluate the applicability of the improved KINS-REM to the OPR1000 plant, an uncertainty evaluation with a multi-dimensional model for confirming multi-dimensional phenomena was conducted with the MARS-KS code. The reactor vessel was modeled using the MULTID component of the MARS-KS code, and a total of 29 uncertainty parameters were considered in 124 sampled calculations. Through the 124 calculations, run with the Mosaique program and the MARS-KS code, the peak cladding temperature (PCT) was calculated and the final PCT was determined by the 3rd-order Wilks' formula. The uncertainty parameters with a strong influence were identified by Pearson coefficient analysis; they were mostly related to plant operation and fuel material properties. The results of the 124 calculations and the sensitivity analysis show that the improved KINS-REM can reasonably be applied to uncertainty evaluations with multi-dimensional model calculations of OPR1000 plants.
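
    For reference, the 124 runs correspond to the 3rd-order one-sided Wilks criterion at the 95%/95% level. A small Python sketch that reproduces this sample size and the way the PCT95/95 is read off (the `pct` array is hypothetical):

    ```python
    from scipy.stats import binom

    def wilks_sample_size(gamma=0.95, beta=0.95, order=3):
        """Smallest N such that the order-th largest of N code runs is a
        one-sided gamma/beta tolerance bound (Wilks' formula)."""
        n = order
        while True:
            # K = number of runs exceeding the gamma-quantile ~ Binomial(N, 1-gamma);
            # the order-th largest run bounds the quantile iff K >= order.
            confidence = 1.0 - binom.cdf(order - 1, n, 1.0 - gamma)
            if confidence >= beta:
                return n
            n += 1

    print(wilks_sample_size(order=1))  # 59
    print(wilks_sample_size(order=3))  # 124 runs, as used in the study

    # With 124 sampled PCT results in a hypothetical array `pct`,
    # the 95%/95% estimate is the 3rd-highest value:
    # pct_95_95 = sorted(pct)[-3]
    ```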

  16. Sensitivity, uncertainty assessment, and target accuracies related to radiotoxicity evaluation

    Palmiotti, G.; Salvatores, M.; Hill, R.N.

    1994-01-01

    Time-dependent sensitivity techniques, which have been used in the past for standard reactor applications, are adapted to calculate the impact of data uncertainties and to estimate target data accuracies in radiotoxicity evaluations. The methodology is applied to different strategies of radioactive waste management connected with the European Fast Reactor and the Integral Fast Reactor fuel cycles. Results are provided in terms of sensitivity coefficients of basic data (cross sections and decay constants), uncertainties of global radiotoxicity at different times of storing after discharge, and target data accuracies needed to satisfy maximum uncertainty limits

  17. Evaluation of uncertainty sources and propagation from irradiance sensors to PV yield

    Mariottini, Francesco; Gottschalg, Ralph; Betts, Tom; Zhu, Jiang

    2018-01-01

    This work quantifies the uncertainties of a pyranometer. Sensitivity to errors is analysed with regard to the effects generated by adopting different time resolutions. Estimation of the irradiance measurand and its error is extended throughout an annual data set. This study represents an attempt to provide a more exhaustive overview of both systematic (i.e. physical) and random uncertainties in the evaluation of pyranometer measurements. Starting from expanded uncertainty in a monitored ...

  18. Review of studies related to uncertainty in risk analysis

    Rish, W.R.; Marnicio, R.J.

    1988-08-01

    The Environmental Protection Agency's Office of Radiation Programs (ORP) is responsible for regulating on a national level the risks associated with technological sources of ionizing radiation in the environment. A critical activity of the ORP is analyzing and evaluating risk. The ORP believes that the analysis of uncertainty should be an integral part of any risk assessment; therefore, the ORP has initiated a project to develop a framework for the treatment of uncertainty in risk analysis. Summaries of recent studies in five areas of study are presented.

  19. Economic evaluation of private power production under uncertainties

    Weiguo Xing; Wu, F.F. [University of Hong Kong (China). Centre for Electrical Energy Systems

    2003-02-01

    Private power production is becoming an increasingly important source of electricity generation. In developing countries, the build-operate-transfer (BOT) arrangement has emerged as a dominant form of private investment. Pricing private power production at its avoided cost is the breakeven point for the utility in an economic evaluation, and uncertainties must be taken into account. In this paper, an approach is proposed for calculating the breakeven cost to the utility of a BOT power plant whose contract lasts for 10-25 years. The proposed approach requires the computation of production costs from long-term generation expansion planning (GEP) under future uncertainties. To facilitate the inclusion of the constraints introduced by BOT plants, as well as the uncertainties, in GEP, a genetic algorithm is utilized. The breakeven cost is a useful measure in the economic evaluation of BOT power plants. An example is presented to illustrate the economic evaluation of BOT plants using the concept of breakeven cost. (author)

  1. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  2. Evaluation of peaking factors uncertainty for CASMO-3

    Kim, Kang Suk; Song, Jae Seung; Kim, Yong Rae; Ji, Seong Kyun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    This document evaluates the pin-to-box factor uncertainty based on CASMO-3 with the 40-group J-library. Five CE criticals performed by Westinghouse, two by B and W and four RPI criticals were analyzed, using cross sections from CASMO-3. DOT was used for the core calculation. This is one of a series of efforts to verify the ADONIS procedure, which is a new core design package under development by KAERI. The expected outcome of this analysis is a CASMO-3 pin peak uncertainty applicable to CE type fuel assembly design. The evaluated uncertainty of peaking factors for CASMO-3 was 1.863%. 21 tabs., 23 figs., 12 refs. (Author)

  3. Uncertainties

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  4. Uncertainty

    Silva, T.A. da

    1988-01-01

    A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and the International Committee for Weights and Measures (CIPM) is shown for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]

  5. Evaluating the uncertainty of input quantities in measurement models

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in

  6. Contribution to uncertainties evaluation for fast reactors neutronic cross sections

    Privas, Edwin

    2015-01-01

    The thesis has been motivated by a wish to increase the knowledge of uncertainties on nuclear data, for safety criteria. It targets the cross sections required by core calculations for sodium-cooled fast reactors (SFR), and new tools to evaluate them. The main objective of this work is to provide new tools in order to create coherent evaluated files, with reliable and mastered uncertainties. To address these problems, several methods have been implemented within the CONRAD code, which is developed at CEA Cadarache. After a summary of all the elements required to understand the evaluation world, stochastic methods are presented as a means of solving the Bayesian inference. They give the evaluator more information about the probability density and can also be used as validation tools. The algorithms have been successfully tested, despite long calculation times. Then, microscopic constraints have been implemented in CONRAD. They are defined as new information that should be taken into account during the evaluation process. An algorithm has been developed in order to solve, for example, continuity issues between two energy domains, with the Lagrange multiplier formalism. Another method is given by using a marginalization procedure, in order either to complete an existing evaluation with new covariances or to add systematic uncertainty on an experiment described by two theories. The algorithms perform well on examples, such as the 238U total cross section. The last parts focus on integral data feedback, using methods of integral data assimilation to reduce the uncertainties on cross sections. This work ends with uncertainty reduction on key nuclear reactions, such as the capture and fission cross sections of 238U and 239Pu, thanks to the PROFIL and PROFIL-2 experiments in Phenix and the Jezebel benchmark. (author) [fr]

  7. Maximum respiratory pressure measuring system: calibration and evaluation of uncertainty

    Ferreira, J.L.; Pereira, N.C.; Oliveira Júnior, M.; Vasconcelos, F.H.; Parreira, V.F.; Tierra-Criollo, C.J.

    2010-01-01

    The objective of this paper is to present a methodology for the evaluation of uncertainties in the measurement results obtained during the calibration of a digital manovacuometer prototype (DM) incorporating a load-cell pressure sensor. Calibration curves were obtained for both pressure

  8. Uncertainty analysis in estimating Japanese ingestion of global fallout Cs-137 using health risk evaluation model

    Shimada, Yoko; Morisawa, Shinsuke

    1998-01-01

    Most model estimates of environmental contamination include some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty was analyzed in a model for evaluating the ingestion of radionuclides caused by long-term global low-level radioactive contamination, using various uncertainty analysis methods: the percentile estimate, robustness analysis and the fuzzy estimate. The model is mainly composed of five sub-models, which include their own uncertainty; we also analyzed that uncertainty. The major findings obtained in this study include that the possibility of a discrepancy between the value predicted by the model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; and the reliability of the model depends decisively on the following environmental factors: the direct foliar absorption coefficient, the transfer factor of radionuclides from the stratosphere down to the troposphere, the residual rate after food processing and cooking, and the transfer factor of radionuclides in the ocean and sedimentation in the ocean. (author)

  9. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes

  10. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    Glaeser, H.G.

    2004-01-01

    The first formulation of the USA Code of Federal Regulations (CFR) 10CFR50, with applicable sections specific to NPP licensing requirements, was released in 1976. Over a decade later, 10CFR50.46 allowed the use of BE codes instead of conservative code models, but the uncertainties have to be identified and quantified. Guidelines were released that described interpretations developed over the intervening years that are applicable. Other countries established similar conservative procedures and acceptance criteria. Because conservative methods were used to calculate the peak values of key parameters, such as peak clad temperature (PCT), it was always acknowledged that a large margin existed between the 'conservative' calculated value and the 'true' value. Besides the USA, regulation in other countries, such as Germany, allowed the state of science and technology to be applied in licensing, i.e. increasing experimental evidence and progress in code development over time could be used. There was no requirement to apply a pure evaluation methodology with licensed assumptions and frozen codes. The thermal-hydraulic system codes became more and more best-estimate codes based on comprehensive validation. This development was and is possible because the rules and guidelines provide the necessary latitude to consider further developments in safety technology. Best estimate codes are allowed to be used in licensing in combination with conservative initial and boundary conditions. However, uncertainty quantification is not required. Since some of the initial and boundary conditions are more conservative compared with those used internationally (e.g. 106% reactor power instead of 102%, a single failure plus a non-availability due to preventive maintenance is assumed, etc.), it is claimed that the uncertainties of code models are covered. Since many utilities apply for power increases, calculation results come closer to some licensing criteria. The situation in German licensing

  11. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE

    Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize the uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
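
    A minimal Python sketch of the two ideas above: mapping a geometric mean and geometric standard deviation to a lognormal PDF, and a second-order Monte Carlo loop separating parametric uncertainty from inter-individual variability (all parameter values and the dose model are hypothetical, not taken from SYMBIOSE):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # A transfer parameter reported only by its geometric mean and geometric
    # standard deviation (hypothetical values) maps directly to a lognormal:
    gm, gsd = 0.02, 2.5                  # e.g. a soil-to-plant transfer factor
    mu, sigma = np.log(gm), np.log(gsd)

    # Second-order (two-dimensional) Monte Carlo: the outer loop samples
    # parametric uncertainty, the inner loop inter-individual variability.
    n_outer, n_inner = 200, 500
    doses = np.empty((n_outer, n_inner))
    for i in range(n_outer):
        tf = rng.lognormal(mu, sigma)                        # uncertain parameter
        intake = rng.lognormal(np.log(300.0), 0.4, n_inner)  # variable intake
        doses[i] = tf * intake                               # simplified dose model

    # Variability: percentiles across individuals; uncertainty: spread over loops
    p95_per_loop = np.percentile(doses, 95, axis=1)
    print(f"95th-percentile dose: median {np.median(p95_per_loop):.2f}, "
          f"90% interval [{np.percentile(p95_per_loop, 5):.2f}, "
          f"{np.percentile(p95_per_loop, 95):.2f}]")
    ```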

  12. Evaluation of uncertainty in the measurement of environmental electromagnetic fields

    Vulevic, B.; Osmokrovic, P.

    2010-01-01

    With regard to non-ionising radiation protection, the relationship between human exposure to electromagnetic fields and health is controversial. Electromagnetic fields have become omnipresent in the daily environment. This paper assesses the problem of how to compare a measurement result with a limit fixed by the standard for human exposure to electric, magnetic and electromagnetic fields (0 Hz-300 GHz). The purpose of the paper is to present the basic information needed for the appropriate evaluation of measurement uncertainty. (authors)

  13. Study on Uncertainty and Contextual Modelling

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 1 (2007), s. 12-15 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Knowledge * contextual modelling * temporal modelling * uncertainty * knowledge management Subject RIV: BD - Theory of Information

  14. Uncertainty Evaluation of a Postulated LBLOCA for APR+ using KINS Realistic Evaluation Methodology and MARS-KS

    Hwang, Min Jeong; Marigomena, Ralph; Yoo, Tae Ho; Kim, Y. S.; Sim, S. K. [Environment and Energy Technology, Inc., Daejeon (Korea, Republic of); Bang, Young Seok [KINS, Daejeon (Korea, Republic of)

    2014-05-15

    As a part of its regulatory safety research, the Korea Institute of Nuclear Safety (KINS) also developed a best estimate safety analysis regulatory audit code, MARS-KS, to realistically predict and better understand the physical phenomena of design basis accidents. KINS improved its uncertainty propagation methodology using MARS-KS and applied the improved uncertainty evaluation method to the Shinkori Units 3 and 4 LBLOCA. This study evaluates the uncertainty propagation of a postulated LBLOCA and quantifies the safety margin, using KINS-REM and the MARS-KS code, for the APR+ (Advanced Power Reactor Plus) Standard Safety Analysis Report (SSAR), which is under regulatory review by KINS for its design approval. The KINS-REM LBLOCA realistic evaluation methodology was used for the regulatory assessment of the APR+ LBLOCA with MARS-KS, to evaluate the propagation of the uncertainty variables as well as to assess the safety margin during the limiting case of the APR+ double-ended guillotine cold leg LBLOCA. The uncertainty evaluation for the APR+ LBLOCA shows that the reflood PCT with an upper limit of 95% probability at the 95% confidence level is 1363.2 K, which is higher than the blowdown PCT95/95 of 1275.3 K. The result shows that the current evaluation of the APR+ LBLOCA PCT is within the ECCS acceptance criterion of 1477 K.

  15. CITRICULTURE ECONOMIC AND FINANCIAL EVALUATION UNDER CONDITIONS OF UNCERTAINTY

    DANILO SIMÕES

    2015-12-01

    Full Text Available ABSTRACT Citriculture involves several environmental risks, such as weather changes and pests, as well as considerable financial risk, mainly due to the period of return on the initial investment. This study was motivated by the need to assess the risks of a business activity such as citriculture. Our objective was to build a stochastic simulation model for the economic and financial analysis of an orange producer in the Midwest region of the state of Sao Paulo, under conditions of uncertainty. The parameters used were the Net Present Value (NPV), the Modified Internal Rate of Return (MIRR), and the Discounted Payback. To evaluate the risk conditions we built a probabilistic model with pseudorandom numbers generated by the Monte Carlo method. The results showed that the activity analyzed carries a 42.8% risk of a negative NPV; however, the yield assessed by the MIRR was 7.7%, higher than the yield from the reapplication of the positive cash flows. The financial investment pays for itself after the fourteenth year of activity.
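
    A hedged Python sketch of this kind of Monte Carlo NPV analysis (the cash-flow distribution, investment and discount rate are invented, so the risk figure will differ from the study's 42.8%):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n, years, rate = 100_000, 15, 0.10
    invest = 500_000.0                      # hypothetical initial investment

    # Uncertain yearly net cash flow (hypothetical distribution)
    cash = rng.normal(loc=80_000.0, scale=40_000.0, size=(n, years))
    discount = (1.0 + rate) ** -np.arange(1, years + 1)

    npv = cash @ discount - invest
    print(f"P(NPV < 0) = {np.mean(npv < 0):.1%}")
    print(f"median NPV = {np.median(npv):,.0f}")
    ```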

  16. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-01-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of pipes in BWR and PWR systems. For each parameter, the sensitivity of the conditional probability of failure, and hence its importance for probabilistic leak-before-break evaluations, was determined

  17. Research on uncertainty evaluation measure and method of voltage sag severity

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an unavoidable and serious power quality problem in power systems. This paper provides a general summary and review of the concepts, indices and evaluation methods for voltage sag severity. Considering the complexity and uncertainty of the influencing factors, the degree of damage, and the characteristics and requirements of voltage sag severity on the source, network and load sides, the measurement concepts and their conditions of existence, as well as the evaluation indices and methods of voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose ideas and methods for severity research based on advanced uncertainty theory and uncertainty measures. This study may be considered a valuable guide for researchers who are interested in the domain of voltage sag severity.

  18. Determination of a PWR key neutron parameters uncertainties and conformity studies applications

    Bernard, D.

    2002-01-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. Uncertainty sources have many origins: technological origins for fabrication parameters and physical origins for nuclear data. First, each uncertainty contribution is calculated and, finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory in the case of step 0 and by direct calculations in the case of irradiation problems. One application of neutronic conformity concerned precision adjustments of fabrication and nuclear data targets. Statistical (uncertainties) and deterministic (deviations) approaches were studied. Then the uncertainties of key slab neutron parameters were reduced and nuclear performance was optimised. (author)

  19. Evaluation of uncertainties in the calibration of radiation personal monitor with Cesium-137 source

    Mirapalheta, Tatiane; Alexandre, Anderson; Costa, Camila; Batista, Gilmar; Paulino, Thyago; Albuquerque, Marcos; Universidade do Estado do Rio de Janeiro

    2016-01-01

    This work presents the entire calibration process of an individual monitor used for radiation protection in health care, correlating these measurements with their associated uncertainties. The results show an expanded uncertainty of 5.81% for dose rate measurements and an expanded uncertainty of 5.61% for integrated dose measurements; these uncertainties were evaluated as type A and type B, together with their components. (author)

  1. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The uncertainty evaluation with a statistical method is performed by repeating the transport calculation with direct sampling of the perturbed nuclear data; a reliable uncertainty result can then be obtained by analyzing the results of the numerous transport calculations. A known problem in uncertainty analysis with the statistical approach is that sampling the cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data from a lognormal distribution is proposed. Criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors, and a stochastic cross section sampling and writing program was developed. For the sensitivity and uncertainty analysis, cross section sampling was pursued with both the normal and lognormal distributions. The uncertainties caused by the covariance of the (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
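
    A minimal Python sketch of the lognormal sampling idea, matching the lognormal to a given mean and relative standard deviation so that sampled cross sections are positive by construction (the values are hypothetical):

    ```python
    import numpy as np

    def sample_xs_lognormal(mean, rel_std, size, rng):
        """Sample a cross section from a lognormal matched to the given mean
        and relative standard deviation (never negative by construction)."""
        sigma2 = np.log(1.0 + rel_std ** 2)
        mu = np.log(mean) - 0.5 * sigma2
        return rng.lognormal(mu, np.sqrt(sigma2), size)

    rng = np.random.default_rng(3)
    mean, rel_std = 2.7, 0.5    # hypothetical cross section, 50 % uncertainty

    normal = rng.normal(mean, rel_std * mean, 100_000)
    lognorm = sample_xs_lognormal(mean, rel_std, 100_000, rng)

    print(f"negative samples (normal)    : {(normal < 0).mean():.2%}")
    print(f"negative samples (lognormal) : {(lognorm < 0).mean():.2%}")  # 0 %
    print(f"lognormal sample mean        : {lognorm.mean():.3f}")        # ~2.7
    ```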

  2. Calculation of the uncertainty of HP(10) evaluation for a thermoluminescent dosimetry system

    Ferreira, M.S.; Silva, E.R.; Mauricio, C.L.P.

    2016-01-01

    Full interpretation of a dose assessment can only be performed when the uncertainty of the measurement is known. The aim of this study is to calculate the uncertainty of the TL dosimetry system of the LDF/IRD for the evaluation of HP(10) for photons. This was done by experimental measurements, extraction of information from documents, and calculation of uncertainties based on the ISO GUM. Energy and angular dependence is the most important contribution to the combined standard uncertainty uc(y) and the expanded uncertainty U. For 10 mSv, uc(y) = 1.99 mSv and U = 3.98 mSv were obtained for a 95% coverage interval. (author)

  3. Uncertainty evaluation for ordinary least-square fitting with arbitrary order polynomial in joule balance method

    You, Qiang; Xu, JinXin; Wang, Gang; Zhang, Zhonghua

    2016-01-01

    Ordinary least-squares fitting with polynomials is used in both the dynamic phase of the watt balance method and the weighing phase of the joule balance method, but little research has been conducted to evaluate the uncertainty of the fitted data in these electrical balance methods. In this paper, a matrix-calculation method for evaluating the uncertainty of polynomial fitting data is derived, and the properties of this method are studied by simulation. Based on this, two further methods are proposed. One is used to find the optimal fitting order for the watt or joule balance methods; its accuracy and the factors affecting it are examined with simulations. The other is used to evaluate the uncertainty of the integral of the fitting data for the joule balance, which is demonstrated with an experiment on the NIM-1 joule balance. (paper)
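
    A hedged Python sketch of the standard matrix formulation for ordinary least-squares fitting uncertainty, computing the coefficient covariance (X^T X)^-1 s^2 and the pointwise uncertainty of the fitted curve (the data are synthetic; this is not the paper's code):

    ```python
    import numpy as np

    def polyfit_with_uncertainty(x, y, order):
        """OLS polynomial fit returning coefficients and their covariance."""
        X = np.vander(x, order + 1)                 # design matrix
        coef, res, *_ = np.linalg.lstsq(X, y, rcond=None)
        dof = len(x) - (order + 1)
        s2 = res[0] / dof                           # residual variance estimate
        cov = s2 * np.linalg.inv(X.T @ X)           # covariance of coefficients
        return coef, cov

    # Hypothetical noisy data around a quadratic
    rng = np.random.default_rng(5)
    x = np.linspace(0.0, 1.0, 50)
    y = 1.0 + 2.0 * x - 3.0 * x ** 2 + rng.normal(0, 0.05, x.size)

    coef, cov = polyfit_with_uncertainty(x, y, order=2)
    # Uncertainty of the fitted curve at each x: u(x) = sqrt(diag(X cov X^T))
    X = np.vander(x, 3)
    u_fit = np.sqrt(np.einsum("ij,jk,ik->i", X, cov, X))
    print("coefficients:", np.round(coef, 3))
    print("max fit uncertainty:", u_fit.max())
    ```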

  4. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-01

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for Nevada and the Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling, by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. The DVRFS calibration results for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate between these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties
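
    A minimal Python sketch of how information-criterion values can be turned into posterior model weights for Bayesian model averaging (the KIC values and priors below are invented for illustration):

    ```python
    import numpy as np

    def model_weights(ic_values, priors=None):
        """Posterior model probabilities from information-criterion values
        (smaller is better), optionally weighted by prior probabilities."""
        ic = np.asarray(ic_values, dtype=float)
        priors = np.ones_like(ic) if priors is None else np.asarray(priors)
        like = np.exp(-0.5 * (ic - ic.min()))   # relative model likelihoods
        w = like * priors
        return w / w.sum()

    # Hypothetical KIC values for five recharge models and elicited priors
    kic = [2130.4, 2127.8, 2135.1, 2128.9, 2140.2]
    prior = [0.25, 0.20, 0.20, 0.20, 0.15]
    w = model_weights(kic, prior)
    print("posterior weights:", np.round(w, 3))

    # BMA prediction: the weighted average of each model's head prediction,
    # h_bma = sum(w_k * h_k); the predictive variance adds the between-model
    # spread to each model's own predictive variance.
    ```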

  5. Evaluation of thermal-hydraulic parameter uncertainties in a TRIGA research reactor

    Mesquita, Amir Z.; Costa, Antonio C.L.; Ladeira, Luiz C.D.; Rezende, Hugo C.; Palma, Daniel A.P.

    2015-01-01

    Experimental studies have been performed in the TRIGA research nuclear reactor of CDTN/CNEN to determine its thermal hydraulic parameters. Fuel-to-coolant heat transfer patterns must be evaluated as a function of reactor power in order to assess the thermal hydraulic performance of the core. The heat generated by nuclear fission in the reactor core is transferred from the fuel elements to the cooling system through the fuel-cladding (gap) and cladding-to-coolant interfaces. As the reactor core power increases, the heat transfer regime from the fuel cladding to the coolant changes from single-phase natural convection to subcooled nucleate boiling. This paper presents the uncertainty analysis of the results of the thermal hydraulic experiments performed. The methodology used to evaluate the propagation of uncertainty in the results was based on the pioneering article of Kline and McClintock, with the propagation of uncertainties based on the specification of the uncertainties in the various primary measurements. The uncertainty in the thermal hydraulic parameters of the CDTN TRIGA fuel element is determined, basically, by the uncertainty in the reactor's thermal power. (author)
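
    A minimal Python sketch of Kline-McClintock root-sum-square propagation, applied to a simplified thermal power balance Q = mdot * cp * dT (all values and uncertainties are hypothetical, not the CDTN TRIGA data):

    ```python
    import numpy as np

    # Kline & McClintock: w_Q = sqrt(sum((dQ/dx_i * w_i)^2)) for independent
    # primary measurements x_i with uncertainties w_i.
    mdot, w_mdot = 6.5, 0.10      # coolant flow, kg/s (hypothetical)
    cp, w_cp = 4180.0, 20.0       # specific heat, J/(kg K)
    dT, w_dT = 3.2, 0.15          # temperature rise, K

    Q = mdot * cp * dT
    w_Q = np.sqrt((cp * dT * w_mdot) ** 2 +
                  (mdot * dT * w_cp) ** 2 +
                  (mdot * cp * w_dT) ** 2)
    print(f"Q = {Q / 1e3:.1f} kW +/- {w_Q / 1e3:.1f} kW "
          f"({100 * w_Q / Q:.1f} % relative uncertainty)")
    ```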

  6. Uncertainty Evaluation of the SFR Subchannel Thermal-Hydraulic Modeling Using a Hot Channel Factors Analysis

    Choi, Sun Rock; Cho, Chung Ho; Kim, Sang Ji

    2011-01-01

    In SFR core analysis, a hot channel factors (HCF) method is most commonly used to evaluate uncertainty; it was employed in early designs such as the CRBRP and IFR. Alternatively, the improved thermal design procedure (ITDP) can calculate the overall uncertainty based on the root-sum-square technique and sensitivity analyses of the design parameters. The Monte Carlo method (MCM) is also employed to estimate uncertainties: all the input uncertainties are randomly sampled according to their probability density functions and the resulting distribution of the output quantity is analyzed. Since an uncertainty analysis is basically calculated from the temperature distribution in a subassembly, the core thermal-hydraulic modeling greatly affects the resulting uncertainty. At KAERI, the SLTHEN and MATRA-LMR codes have been utilized to analyze SFR core thermal-hydraulics. The SLTHEN (steady-state LMR core thermal hydraulics analysis code based on the ENERGY model) code is a modified version of the SUPERENERGY2 code, which conducts a multi-assembly, steady-state calculation based on a simplified ENERGY model. The detailed subchannel analysis code MATRA-LMR (Multichannel Analyzer for Steady-State and Transients in Rod Arrays for Liquid Metal Reactors), an LMR version of MATRA, was also developed specifically for SFR core thermal-hydraulic analysis. This paper describes comparative studies of core thermal-hydraulic models. A subchannel analysis and hot channel factors based uncertainty evaluation system is established to estimate the core thermofluidic uncertainties using the MATRA-LMR code, and the results are compared to those of the SLTHEN code

  8. Evaluation of uncertainties of key neutron parameters of PWR-type reactors with slab fuel, application to neutronic conformity

    Bernard, D.

    2001-12-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. These uncertainties have several origins: technological, for fabrication parameters, and physical, for nuclear data. First, each uncertainty contribution is calculated; finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by generalized perturbation theory at step 0 and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of precision targets for fabrication and nuclear data. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of key neutronic slab parameters were thereby reduced, and nuclear performance was optimized. (author)

  9. New product development projects evaluation under time uncertainty

    Thiago Augusto de Oliveira Silva

    2009-12-01

    Development time is one of the key factors contributing to the success of new product development. In spite of that, the impact of time uncertainty on development has been little exploited in decision-support models for evaluating this kind of project. In this context, the objective of the present paper is to evaluate the development process of new technologies under time uncertainty. We introduce a model which captures this source of uncertainty and develop an algorithm to evaluate projects that incorporates Monte Carlo simulation and dynamic programming. The novelty in our approach is to thoroughly blend stochastic time with a formal approach to the problem that preserves the Markov property. We base our model on the distinction between the decision epoch and the stochastic time. We discuss and illustrate the applicability of our model through an empirical example.
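
    A deliberately simplified sketch of the simulation side of such an evaluation follows (Python). It replaces the paper's dynamic program with a naive continue/abandon heuristic at each decision epoch; the lognormal stage durations, stage costs, payoff, and discount rate are all assumed for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        r = 0.10                      # discount rate (assumed)
        payoff = 100.0                # value on completion (assumed)
        stage_cost = [15.0, 20.0]     # cost to start each development stage (assumed)
        n_sims = 50_000

        values = []
        for _ in range(n_sims):
            t, v, done = 0.0, 0.0, True
            for c in stage_cost:
                # decision epoch: continue only if a rough continuation value is positive
                if payoff * np.exp(-r * (t + 1.0)) - c < 0:
                    done = False
                    break
                v -= c * np.exp(-r * t)                    # pay the stage cost now
                t += rng.lognormal(mean=0.0, sigma=0.5)    # stochastic stage duration
            if done:
                v += payoff * np.exp(-r * t)               # discounted completion payoff
            values.append(v)
        print(np.mean(values))        # Monte Carlo estimate of project value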

  10. Evaluating Sketchiness as a Visual Variable for the Depiction of Qualitative Uncertainty

    Boukhelifa, Nadia; Bezerianos, Anastasia; Isenberg, Tobias; Fekete, Jean-Daniel

    2012-01-01

    We report on results of a series of user studies on the perception of four visual variables that are commonly used in the literature to depict uncertainty. To the best of our knowledge, we provide the first formal evaluation of the use of these variables to facilitate an easier reading of

  11. Methodology evaluation of innovative projects under risk and uncertainty

    2012-09-01

    This article deals with problems connected with the assessment of innovative projects under risk and uncertainty, and with topical issues in the evaluation of innovative projects at the present stage of development of the Russian economy. Using the example of the "crossing the river" problem, it considers the possibility of applying hierarchical models to such assessments. It then compares the priorities of different groups of factors, calculated from the overall costs and benefits. The paper provides a rationale for the combined use of four aspects of a decision: the beneficial ones (benefits and opportunities) and the negative ones (costs and risks) that may bear on the decision in question.

  12. Uncertainty evaluation in the self-alignment test of the upper plate of a press

    Lourenço, Alexandre S; E Sousa, J Alves

    2015-01-01

    This paper describes a method to evaluate the uncertainty of the self-alignment test of the upper plate of a press according to EN 12390-4:2000. The method, the algorithms, and the sources of uncertainty are described.

  13. A comparative experimental evaluation of uncertainty estimation methods for two-component PIV

    Boomsma, Aaron; Bhattacharya, Sayantan; Troolin, Dan; Pothos, Stamatios; Vlachos, Pavlos

    2016-09-01

    Uncertainty quantification in planar particle image velocimetry (PIV) measurement is critical for proper assessment of the quality and significance of reported results. New uncertainty estimation methods have recently been introduced, generating interest in their applicability and utility. The present study compares and contrasts current methods across two separate experiments and three software packages in order to provide a diversified assessment of the methods. We evaluated the performance of four uncertainty estimation methods: primary peak ratio (PPR), mutual information (MI), image matching (IM), and correlation statistics (CS). The PPR method was implemented and tested in two processing codes, using in-house open source PIV processing software (PRANA, Purdue University) and Insight4G (TSI, Inc.). The MI method was evaluated in PRANA, as was the IM method. The CS method was evaluated using DaVis (LaVision, GmbH). Utilizing two PIV systems for high- and low-resolution measurements and a laser Doppler velocimetry (LDV) system, data were acquired in a total of three cases: a jet flow and a cylinder in cross flow at two Reynolds numbers. LDV measurements were used to establish a point validation against which the high-resolution PIV measurements were validated. Subsequently, the high-resolution PIV measurements were used as a reference against which the low-resolution PIV data were assessed for error and uncertainty. We compared error and uncertainty distributions, spatially varying RMS error and RMS uncertainty, and standard uncertainty coverages. We observed that, qualitatively, each method responded to spatially varying error (i.e. higher error regions resulted in higher uncertainty predictions in that region). However, the PPR and MI methods demonstrated reduced uncertainty dynamic range response. In contrast, the IM and CS methods showed better response, but under-predicted the uncertainty ranges. The standard coverages (68% confidence interval) ranged from

  15. Evaluation of kinetic uncertainty in numerical models of petroleum generation

    Peters, K.E.; Walters, C.C.; Mankiewicz, P.J.

    2006-01-01

    Oil-prone marine petroleum source rocks contain type I or type II kerogen having Rock-Eval pyrolysis hydrogen indices greater than 600 or 300-600 mg hydrocarbon/g total organic carbon (HI, mg HC/g TOC), respectively. Samples from 29 marine source rocks worldwide that contain mainly type II kerogen (HI = 230-786 mg HC/g TOC) were subjected to open-system programmed pyrolysis to determine the activation energy distributions for petroleum generation. Assuming a burial heating rate of 1 °C/m.y. for each measured activation energy distribution, the calculated average temperature for 50% fractional conversion of the kerogen in the samples to petroleum is approximately 136 ± 7 °C, but the range spans about 30 °C (~121-151 °C). Fifty-two outcrop samples of thermally immature Jurassic Oxford Clay Formation were collected from five locations in the United Kingdom to determine the variations of kinetic response for one source rock unit. The samples contain mainly type I or type II kerogens (HI = 230-774 mg HC/g TOC). At a heating rate of 1 °C/m.y., the calculated temperatures for 50% fractional conversion of the Oxford Clay kerogens to petroleum differ by as much as 23 °C (127-150 °C). The data indicate that kerogen type, as defined by hydrogen index, is not systematically linked to kinetic response, and that default kinetics for the thermal decomposition of type I or type II kerogen can introduce unacceptable errors into numerical simulations. Furthermore, custom kinetics based on one or a few samples may be inadequate to account for variations in organofacies within a source rock. We propose three methods to evaluate the uncertainty contributed by kerogen kinetics to numerical simulations: (1) use the average kinetic distribution for multiple samples of source rock and the standard deviation for each activation energy in that distribution; (2) use source rock kinetics determined at several locations to describe different parts of the study area; and (3) use a weighted
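
    A small worked sketch of the underlying kinetic calculation (Python): first-order decomposition with a discrete activation-energy distribution, integrated along a 1 °C/m.y. burial heating ramp. The frequency factor, energies, and weights below are illustrative assumptions, not the paper's measured kinetics, but they yield a 50% conversion temperature of the same order as the values reported above.

        import numpy as np

        R = 1.987e-3                  # gas constant [kcal/(mol K)]
        A = 1.0e14 * 3.156e13         # frequency factor: 1e14 1/s converted to 1/m.y.
        beta = 1.0                    # heating rate [K/m.y.]
        E = np.arange(48.0, 62.0, 2.0)                            # energies [kcal/mol]
        f = np.array([0.05, 0.15, 0.25, 0.25, 0.15, 0.10, 0.05])  # channel weights

        dT = 0.5
        T = np.arange(273.15, 523.15, dT)                       # ramp, 0 to 250 C [K]
        k = A / beta * np.exp(-E[:, None] / (R * T[None, :]))   # rate per kelvin of burial
        x = 1.0 - np.exp(-np.cumsum(k, axis=1) * dT)            # conversion per channel
        total = f @ x                                           # weighted total conversion
        T50 = T[np.searchsorted(total, 0.5)] - 273.15
        print(f"temperature at 50% conversion ~ {T50:.0f} C")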

  16. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  17. Measurement Uncertainty Evaluation in Dimensional X-ray Computed Tomography Using the Bootstrap Method

    Hiller, Jochen; Genta, Gianfranco; Barbato, Giulio

    2014-01-01

    measurement processes, e.g., with tactile systems, also due to factors related to systematic errors, mainly caused by specific CT image characteristics. In this paper we propose a simulation-based framework for measurement uncertainty evaluation in dimensional CT using the bootstrap method. In a case study...... the problem concerning measurement uncertainties was addressed with bootstrap and successfully applied to ball-bar CT measurements. Results obtained enabled extension to more complex shapes such as actual industrial components as we show by tests on a hollow cylinder workpiece....
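
    A minimal sketch of a bootstrap uncertainty evaluation is shown below (Python); the repeated diameter measurements are invented for illustration, and the procedure is far simpler than the simulation framework proposed in the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        # Hypothetical repeated CT diameter measurements of a sphere [mm]
        d = np.array([9.998, 10.003, 10.001, 9.997, 10.004, 10.000, 9.999, 10.002])

        B = 10_000   # number of bootstrap resamples
        boot_means = np.array([rng.choice(d, size=d.size, replace=True).mean()
                               for _ in range(B)])
        u = boot_means.std(ddof=1)                        # bootstrap standard uncertainty
        lo, hi = np.percentile(boot_means, [2.5, 97.5])   # 95% percentile interval
        print(f"u = {u:.4f} mm, 95% interval [{lo:.4f}, {hi:.4f}] mm")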

  18. Thermal-Hydraulic Analysis for SBLOCA in OPR1000 and Evaluation of Uncertainty for PSA

    Kim, Tae Jin; Park, Goon Cherl

    2012-01-01

    Probabilistic Safety Assessment (PSA) is a mathematical tool to evaluate numerical estimates of risk for nuclear power plants (NPPs). But PSA has problems with quality and reliability, since the quantification of uncertainties from thermal-hydraulic (TH) analysis has not been included in the quantification of overall uncertainties in PSA. Previous research showed that quantifying the uncertainties from best-estimate LBLOCA analysis can improve PSA quality by modifying the core damage frequency (CDF) from the existing PSA report. Based on the same concept, this study considers the quantification of SBLOCA analysis results. In this study, however, operator error parameters are also included, in addition to the phenomenon parameters considered in the LBLOCA analysis.

  19. The characterisation and evaluation of uncertainty in probabilistic risk analysis

    Parry, G.W.; Winter, P.W.

    1980-10-01

    The sources of uncertainty in probabilistic risk analysis are discussed using the event/fault tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which is, at present, unquantifiable, using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events and a short review is given with some discussion on the representation of ignorance. (author)
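
    One standard Bayesian treatment of rare events, of the kind argued for above, is the conjugate Gamma-Poisson update of an event rate. A minimal sketch (Python), with invented prior parameters and data:

        from scipy import stats

        # Gamma(shape a, rate b) prior on the rate; Poisson likelihood for k events
        # observed over exposure T. Values below are illustrative.
        a0, b0 = 0.5, 10.0      # vague prior, appropriate for rare events
        k, T = 1, 5000.0        # one event in 5000 component-hours

        a1, b1 = a0 + k, b0 + T                   # conjugate posterior parameters
        post = stats.gamma(a=a1, scale=1.0 / b1)
        print(post.mean(), post.interval(0.90))   # posterior mean rate, 90% interval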

  20. Uncertainty evaluation of a modified elimination weighing for source preparation

    Cacais, F.L.; Loayza, V.M., E-mail: facacais@gmail.com [Instituto Nacional de Metrologia, Qualidade e Tecnologia, (INMETRO), Rio de Janeiro, RJ (Brazil); Delgado, J.U. [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Lab. de Metrologia das Radiações Ionizantes

    2017-07-01

    Some modifications to the elimination weighing method for radioactive source preparation made it possible to correct weighing results without non-linearity problems, to assign an uncertainty contribution to the correction of the same order as the drop mass uncertainty, and to check weighing variability in serial source preparation. The analysis focused on establishing the achievable weighing accuracy; the uncertainty estimated by the Monte Carlo method for the mass of a 20 mg drop was at most 0.06%. (author)

  1. Evaluating Prognostics Performance for Algorithms Incorporating Uncertainty Estimates

    National Aeronautics and Space Administration — Uncertainty Representation and Management (URM) are an integral part of prognostic system development. As capabilities of prediction algorithms evolve, research...

  2. A method based on Monte Carlo simulations and voxelized anatomical atlases to evaluate and correct uncertainties on radiotracer accumulation quantitation in beta microprobe studies in the rat brain

    Pain, F.; Dhenain, M.; Gurden, H.; Routier, A. L.; Lefebvre, F.; Mastrippolito, R.; Lanièce, P.

    2008-10-01

    The β-microprobe is a simple and versatile technique complementary to small animal positron emission tomography (PET). It relies on local measurements of the concentration of positron-labeled molecules. So far, it has been successfully used in anesthetized rats for pharmacokinetics experiments and for the study of brain energetic metabolism. However, the ability of the technique to provide accurate quantitative measurements using 18F, 11C and 15O tracers is likely to suffer from the contribution of the 511 keV gamma-ray background to the signal and from the contribution of positrons from brain loci surrounding the locus of interest. The aim of the present paper is to provide a method of evaluating several parameters that are expected to affect the quantification of recordings performed in vivo with this methodology. We have developed realistic voxelized phantoms of the rat whole body and brain, and used them as input geometries for Monte Carlo simulations of previous β-microprobe reports. In the context of realistic experiments (binding of 11C-raclopride to D2 dopaminergic receptors in the striatum; local glucose metabolic rate measurement with 18F-FDG; and H2 15O blood flow measurements in the somatosensory cortex), we have calculated the detection efficiencies and the corresponding contribution of 511 keV gammas from accumulation in peripheral organs. We confirmed that the 511 keV gamma background does not impair quantification. To evaluate the contribution of positrons from adjacent structures, we have developed β-Assistant, a program based on a rat brain voxelized atlas and matrices of local detection efficiencies calculated by Monte Carlo simulations for several probe geometries. This program was used to calculate the 'apparent sensitivity' of the probe for each brain structure included in the detection volume. For a given localization of a probe within the brain, this allows the different sources of beta signal to be quantified. Finally, since stereotaxic accuracy is

  3. Modelling and propagation of uncertainties in the German Risk Study

    Hofer, E.; Krzykacz, B.

    1982-01-01

    Risk assessments are generally subject to uncertainty considerations because of the various estimates involved. The paper points out those estimates in the so-called Phase A of the German Risk Study for which uncertainties were quantified. It explains the probabilistic models applied in the assessment and their impact on the findings of the study. Finally, the resulting subjective confidence intervals of the study results are presented and their sensitivity to these probabilistic models is investigated.

  4. Quantifying uncertainties in precipitation: a case study from Greece

    C. Anagnostopoulou

    2008-04-01

    The main objective of the present study was the examination and quantification of the uncertainties in precipitation time series over the Greek area for a 42-year period. The uncertainty index applied to the rainfall data is a combination (total) of the departures of the rainfall season length, of the median of the accumulated percentages, and of the total amounts of rainfall. Results of the study indicated that all stations are characterized, on average, by medium to high uncertainty. The stations that presented increasing rainfall uncertainty were located mainly in the continental parts of the study region. The temporal analysis of the uncertainty index demonstrated that the greatest percentage of years, for all station time series, was characterized by low to high uncertainty (the intermediate categories of the index). Most of the results of the uncertainty index for the Greek region are similar to the corresponding results of various stations all over the European region.

  5. Evaluation of seepage and discharge uncertainty in the middle Snake River, southwestern Idaho

    Wood, Molly S.; Williams, Marshall L.; Evetts, David M.; Vidmar, Peter J.

    2014-01-01

    The U.S. Geological Survey, in cooperation with the State of Idaho, Idaho Power Company, and the Idaho Department of Water Resources, evaluated seasonal seepage gains and losses in selected reaches of the middle Snake River, Idaho, during November 2012 and July 2013, and uncertainty in measured and computed discharge at four Idaho Power Company streamgages. Results from this investigation will be used by resource managers in developing a protocol to calculate and report Adjusted Average Daily Flow at the Idaho Power Company streamgage on the Snake River below Swan Falls Dam, near Murphy, Idaho, which is the measurement point for distributing water to owners of hydropower and minimum flow water rights in the middle Snake River. The evaluated reaches of the Snake River were from King Hill to Murphy, Idaho, for the seepage studies and downstream of Lower Salmon Falls Dam to Murphy, Idaho, for evaluations of discharge uncertainty. Computed seepage was greater than cumulative measurement uncertainty for subreaches along the middle Snake River during November 2012, the non-irrigation season, but not during July 2013, the irrigation season. During the November 2012 seepage study, the subreach between King Hill and C J Strike Dam had a meaningful (greater than cumulative measurement uncertainty) seepage gain of 415 cubic feet per second (ft3/s), and the subreach between Loveridge Bridge and C J Strike Dam had a meaningful seepage gain of 217 ft3/s. The meaningful seepage gain measured in the November 2012 seepage study was expected on the basis of several small seeps and springs present along the subreach, regional groundwater table contour maps, and results of regional groundwater flow model simulations. Computed seepage along the subreach from C J Strike Dam to Murphy was less than cumulative measurement uncertainty during November 2012 and July 2013; therefore, seepage cannot be quantified with certainty along this subreach. For the uncertainty evaluation, average

  6. Interlaboratory analytical performance studies; a way to estimate measurement uncertainty

    Elżbieta Łysiak-Pastuszak

    2004-09-01

    Comparability of data collected within collaborative programmes became the key challenge of analytical chemistry in the 1990s, including monitoring of the marine environment. To obtain relevant and reliable data, the analytical process has to proceed under a well-established Quality Assurance (QA) system with external analytical proficiency tests as an inherent component. A programme called Quality Assurance in Marine Monitoring in Europe (QUASIMEME) was established in 1993 and evolved over the years as the major provider of QA proficiency tests for nutrients, trace metals and chlorinated organic compounds in marine environment studies. The article presents an evaluation of results obtained in QUASIMEME Laboratory Performance Studies by the monitoring laboratory of the Institute of Meteorology and Water Management (Gdynia, Poland) in exercises on nutrient determination in seawater. The measurement uncertainty estimated from routine internal quality control measurements and from results of analytical performance exercises is also presented in the paper.

  7. Probabilistic Accident Consequence Uncertainty - A Joint CEC/USNRC Study

    Gregory, Julie J.; Harper, Frederick T.

    1999-01-01

    The joint USNRC/CEC consequence uncertainty study was chartered after the development of two new probabilistic accident consequence codes, MACCS in the U.S. and COSYMA in Europe. Both the USNRC and CEC had a vested interest in expanding the knowledge base of the uncertainty associated with consequence modeling, and teamed up to co-sponsor a consequence uncertainty study. The information acquired from the study was expected to provide understanding of the strengths and weaknesses of current models as well as a basis for direction of future research. This paper looks at the elicitation process implemented in the joint study and discusses some of the uncertainty distributions provided by eight panels of experts from the U.S. and Europe that were convened to provide responses to the elicitation. The phenomenological areas addressed by the expert panels include atmospheric dispersion and deposition, deposited material and external doses, food chain, early health effects, late health effects and internal dosimetry.

  9. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions reduces dramatically the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern compared with the internal variability, except for the mean sea level pressure field, though its magnitude is larger all over the model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)

  10. Cross-section uncertainty study of the NET shielding blanket

    Jaeger, J.F.

    1990-11-01

    The Next European Torus (NET) is foreseen as the next step in the European development towards the controlled use of thermonuclear fusion. Detailed design of the shielding blanket protecting the peripherals, especially the superconducting coils, is well advanced. A cross-section uncertainty study, i.e. a study of the expected inaccuracy due to the nuclear cross-section data, has been done for the neutron-gamma reactions in the insulation of the coils for such a design. As an extension of previous work on the NET shielding blanket (e.g. MCNP calculations), it was deemed necessary to estimate the accuracy attainable with transport codes in view of the uncertainties in microscopic cross-sections. The code used, SENSIBL, is based on perturbation theory and uses covariance files, COVFILS-2, for the cross-section data. This necessitates forward and adjoint flux calculations with a transport code (e.g. ONEDANT, TRISM) and folding the information contained in these coupled fluxes with the accuracy estimates of the evaluators of the ENDF/B-V files. Transport calculations (P5, S12) were done with the ONEDANT code for a shielding blanket design with 714 MW plasma fusion power. Several runs were done to obtain well-converged forward and adjoint fluxes (ca. 1%). The forward and adjoint integral responses agree to 2%, which is consistent with the above accuracy. The n-γ response was chosen as it is typical of the general accuracy and is available for all materials considered. The present version of SENSIBL allows direct use of the geometric files of ONEDANT (or TRISM), which simplifies the input. Covariance data are not available at present in COVFILS-2 for all of the materials considered: only H, C, N, O, Al, Si, Fe, Ni, and Pb could be included, the big absentee being copper. The resulting uncertainty for the neutron-gamma reactions in the insulation of the coil was found to be 17%. Simulating copper by aluminium produces a negligible increase in the uncertainty, mainly

  11. Accounting for uncertainty factors in biodiversity impact assessment: lessons from a case study

    Geneletti, D.; Beinat, E.; Chung, C.F.; Fabbri, A.G.; Scholten, H.J.

    2003-01-01

    For an Environmental Impact Statement (EIS) to effectively contribute to decision-making, it must include one crucial step: the estimation of the uncertainty factors affecting the impact evaluation and of their effect on the evaluation results. Knowledge of the uncertainties better orients the strategy of the decision-makers and underlines the most critical data or methodological steps of the procedure. Accounting for uncertainty factors is particularly relevant when dealing with ecological impacts, whose forecasts are typically affected by a high degree of simplification. By means of a case study dealing with the evaluation of road alternatives, this paper explores and discusses the main uncertainties related to the typical stages of a biodiversity impact assessment: uncertainty in the data that are used, in the methodologies that are applied, and in the value judgments provided by the experts. Subsequently, the effects of such uncertainty factors are tracked back to the result of the evaluation, i.e., to the relative performance of the project alternatives under consideration. This makes it possible to test the sensitivity of the results and consequently to provide a more informative ranking of the alternatives. The paper concludes by discussing the added value for decision-making provided by uncertainty analysis within EIA.

  12. Evaluation of risk impact of changes to Completion Times addressing model and parameter uncertainties

    Martorell, S.; Martón, I.; Villamizar, M.; Sánchez, A.I.; Carlos, S.

    2014-01-01

    This paper presents an approach, and an example of its application, for evaluating the risk impact of changes to Completion Times within the licensing basis of a nuclear power plant, based on the use of Probabilistic Risk Assessment and addressing the identification, treatment, and analysis of uncertainties in an integrated manner. It allows full development of a three-tiered approach (Tiers 1-3) following the principles of risk-informed decision-making accounting for uncertainties, as proposed by many regulators. The Completion Time is the maximum outage time a safety-related equipment is allowed to be down, e.g. for corrective maintenance, which is established within the Limiting Conditions for Operation included in the Technical Specifications for operation of a nuclear power plant. The case study focuses on a Completion Time change for the accumulators system of a nuclear power plant using a Level 1 PRA, and considers several sources of model and parameter uncertainty. The results show that the risk impact of the proposed CT change, including both types of epistemic uncertainty, is small compared with the current safety goals of concern to Tier 1. Concerning Tiers 2 and 3, the results show how the use of some traditional and uncertainty importance measures helps in identifying high-risk configurations that should be avoided in NPP Technical Specifications no matter the duration of the CT (Tier 2), and other configurations that could form part of a configuration risk management program (Tier 3). - Highlights: • New approach for evaluating the risk impact of changes to Completion Times. • Integrated treatment and analysis of model and parameter uncertainties. • PSA-based application to support risk-informed decision-making. • Importance measures for the identification of risky configurations. • Management of important safety issues to accomplish safety goals.

  13. Evaluation of the uncertainty of environmental measurements of radioactivity

    Heydorn, K.

    2003-01-01

    The almost universal acceptance of the concept of uncertainty has led to its introduction into the ISO 17025 standard on general requirements for testing and calibration laboratories. This means that not only scientists, but also legislators, politicians, the general population, and perhaps even the press, expect to see all future results associated with an expression of their uncertainty. Results obtained by measurement of radioactivity have routinely been associated with an expression of their uncertainty, based on so-called counting statistics. This is calculated together with the actual result on the assumption that the number of counts observed has a Poisson distribution with equal mean and variance. Most of the nuclear scientific community has therefore assumed that it already complied with the latest ISO 17025 requirements. Counting statistics, however, express only the variability observed among repeated measurements of the same sample under the same counting conditions, which is equivalent to the term repeatability used in quantitative analysis. Many other sources of uncertainty need to be taken into account before a statement of the uncertainty of the actual result can be made. As the first link in the traceability chain, calibration is always an important uncertainty component in any kind of measurement. For radioactivity measurements in particular, counting geometry assumes the greatest importance, because it is often not possible to measure a standard and a control sample under exactly the same conditions. In the case of large samples there are additional uncertainty components associated with sample heterogeneity and its influence on self-absorption and counting efficiency. In low-level environmental measurements there is an additional risk of sample contamination, but the most important contribution to uncertainty is usually the representativity of the sample being analysed. For uniform materials this can be expressed by the
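
    As a minimal illustration of how counting statistics combine with the other components mentioned above (calibration, geometry), consider the following sketch (Python); the counts, live time, and the 3% and 5% relative components are assumptions, not values from any particular measurement.

        import numpy as np

        # Poisson counting: for N counts, u(N) = sqrt(N)
        N_gross, N_bkg, t = 4000, 400, 3600.0      # counts and live time [s] (assumed)
        net_rate = (N_gross - N_bkg) / t
        u_rate = np.sqrt(N_gross + N_bkg) / t      # variances of independent counts add

        # Combine with calibration (3%) and geometry (5%) components in quadrature
        u_rel = np.sqrt((u_rate / net_rate) ** 2 + 0.03 ** 2 + 0.05 ** 2)
        print(f"net rate = {net_rate:.4f} 1/s, combined relative u = {100 * u_rel:.1f}%")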

  14. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both observed inputs (precipitation and temperature) and streamflow observations used in the calibration of the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; thus the rating curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station

  15. Nordic reference study on uncertainty and sensitivity analysis

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

    This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic benchmark exercises and reference studies concerning critical modeling issues, such as common cause failures and human interactions, and to demonstrate the impact of the associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. The discrepancies between the groups are also substantial, but can be explained. Sensitivity analyses have been carried out concerning, e.g., the use of different CCF quantification models, alternative handling of CCF data, time windows for operator actions, time dependences in phased mission operation, the impact of state-of-knowledge dependences, and the ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper.

  16. Accounting for uncertainty in evaluating water quality impacts of urban development plan

    Zhou Jiquan; Liu Yi; Chen Jining

    2010-01-01

    The implementation of urban development plans causes land use change, which can have significant environmental impacts. In light of this, environmental concerns should be considered sufficiently at an early stage of the planning process. However, uncertainties existing in urban development plans hamper the application of strategic environmental assessment, which is applied to evaluate the environmental impacts of policies, plans and programs. This study develops an integrated assessment method based on accounting for the uncertainty of environmental impacts. The proposed method consists of four main steps: (1) designing scenarios of economic scale and industrial structure, (2) sampling possible land use layouts, (3) evaluating each sample's environmental impact, and (4) identifying environmentally sensitive industries. In doing so, the uncertainties of environmental impacts can be accounted for. Environmental risk, overall environmental pressure, and potential extreme environmental impacts of urban development plans can then be analyzed, and environmentally sensitive factors identified, especially under considerations of uncertainty. This can help decision-makers strengthen environmental considerations and take measures at an early stage of decision-making.

  17. New Monte Carlo-based method to evaluate fission fraction uncertainties for the reactor antineutrino experiment

    Ma, X.B., E-mail: maxb@ncepu.edu.cn; Qiu, R.M.; Chen, Y.X.

    2017-02-15

    Uncertainties regarding fission fractions are essential in understanding antineutrino flux predictions in reactor antineutrino experiments. A new Monte Carlo-based method to evaluate the covariance coefficients between isotopes is proposed. The covariance coefficients are found to vary with reactor burnup and may change from positive to negative because of balance effects in fissioning; for example, between 235U and 239Pu, the covariance coefficient changes from 0.15 to -0.13. Using the equation relating fission fraction and atomic density, consistent uncertainties in the fission fractions and covariance matrix were obtained. The antineutrino flux uncertainty is 0.55%, which does not vary with reactor burnup; the new value is about 8.3% smaller than the previous one. - Highlights: • The covariance coefficients between isotopes may change sign with reactor burnup because of two opposing effects. • The relation between fission fraction uncertainty and atomic density is studied for the first time. • A new MC-based method of evaluating the covariance coefficients between isotopes is proposed.
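
    The sign-changing covariance described above can be illustrated with a toy Monte Carlo (Python): perturbing isotope fission yields independently and renormalizing them to sum to one induces a negative correlation between the dominant fractions. This is a sketch of the normalization effect only, not the authors' method; the base fractions and perturbation size are assumed.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000

        # Perturb fission fractions independently, then renormalize to sum to one,
        # as physical fission fractions must.
        base = np.array([0.56, 0.30, 0.07, 0.07])   # f235, f239, f238, f241 (assumed)
        raw = rng.normal(base, 0.05 * base, size=(n, 4)).clip(min=1e-6)
        frac = raw / raw.sum(axis=1, keepdims=True)

        corr = np.corrcoef(frac.T)
        print("corr(235U, 239Pu) =", round(corr[0, 1], 3))  # negative after renormalization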

  18. Examples of measurement uncertainty evaluations in accordance with the revised GUM

    Runje, B.; Horvatic, A.; Alar, V.; Medic, S.; Bosnjakovic, A.

    2016-11-01

    The paper presents examples of the evaluation of uncertainty components in accordance with the current and revised Guide to the expression of uncertainty in measurement (GUM). In accordance with the proposed revision of the GUM, a Bayesian approach was adopted for both Type A and Type B evaluations. The law of propagation of uncertainty (LPU) and the propagation of distributions through the Monte Carlo method (MCM) were used to evaluate the associated standard uncertainties, expanded uncertainties, and coverage intervals. Furthermore, the influence of a dominant non-Gaussian input quantity and of an asymmetric distribution of the output quantity y on the evaluation of measurement uncertainty was analyzed. When the coverage interval is not probabilistically symmetric, the coverage interval for probability P is estimated from the experimental probability density function using the Monte Carlo method. Key highlights of the proposed revision of the GUM are analyzed through a set of examples.
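
    A minimal sketch (Python) contrasting the two propagation routes discussed above, the LPU with first-order sensitivities and the propagation of distributions via the MCM, for a simple quotient model with one Gaussian and one rectangular input (all values assumed):

        import numpy as np

        rng = np.random.default_rng(3)

        # Model: Y = X1/X2, X1 ~ N(10, 0.2^2), X2 rectangular on [1.9, 2.1]
        x1_mu, x1_u = 10.0, 0.2
        x2_lo, x2_hi = 1.9, 2.1
        x2_mu = 0.5 * (x2_lo + x2_hi)
        x2_u = (x2_hi - x2_lo) / np.sqrt(12)   # standard uncertainty, rectangular pdf

        # Law of propagation of uncertainty (first-order sensitivity coefficients)
        c1, c2 = 1.0 / x2_mu, -x1_mu / x2_mu ** 2
        u_lpu = np.hypot(c1 * x1_u, c2 * x2_u)

        # Propagation of distributions via the Monte Carlo method
        y = rng.normal(x1_mu, x1_u, 10 ** 6) / rng.uniform(x2_lo, x2_hi, 10 ** 6)
        lo, hi = np.percentile(y, [2.5, 97.5])   # probabilistically symmetric 95% interval
        print(f"LPU u = {u_lpu:.4f}; MCM u = {y.std(ddof=1):.4f}, 95% [{lo:.3f}, {hi:.3f}]")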

  19. Evaluation Procedures of Random Uncertainties in Theoretical Calculations of Cross Sections and Rate Coefficients

    Kokoouline, V.; Richardson, W.

    2014-01-01

    Uncertainties in theoretical calculations may include: • systematic uncertainty, due to the applicability limits of the chosen model; • random uncertainty: within a model, uncertainties of the model parameters result in uncertainties of the final results (such as cross sections). If the uncertainties of experimental and theoretical data are known, then for the purpose of data evaluation (to produce recommended data) one should combine the two data sets to produce best-guess data with the smallest possible uncertainty. In many situations, it is possible to assess the accuracy of theoretical calculations because theoretical models usually rely on parameters that are uncertain, but not completely random, i.e. the uncertainties of the parameters of the models are approximately known. If there are one or several such parameters with corresponding uncertainties, even if some or all parameters are correlated, the above approach gives a conceptually simple way to calculate uncertainties of final cross sections (uncertainty propagation). Numerically, the statistical approach to uncertainty propagation can be computationally expensive. However, in situations where uncertainties are considered to be as important as the actual cross sections (for data validation or benchmark calculations, for example), such a numerical effort is justified. Having data from different sources (say, from theory and experiment), a systematic statistical approach allows one to compare the data and produce "unbiased" evaluated data with improved uncertainties, provided the uncertainties of the initial data from the different sources are available. Without uncertainties, data evaluation/validation becomes impossible. This is the reason why theoreticians should assess the accuracy of their calculations in one way or another. A statistical and systematic approach, similar to that described above, is preferable.

  20. Uncertainties assessment for safety margins evaluation in MTR reactors core thermal-hydraulic design

    Gimenez, M.; Schlamp, M.; Vertullo, A.

    2002-01-01

    This report contains a bibliographic review and a critical analysis of the different methodologies used for uncertainty evaluation of safety-related parameters in research reactor cores. The different parameters whose uncertainties are considered are also presented and discussed, as well as their intrinsic nature, which determines how their uncertainties must be combined. Finally, a combined statistical method with direct propagation of uncertainties is proposed, together with a set of basic parameters, such as wall and DNB temperatures, CHF, PRD, and their respective ratios, for which uncertainties should be considered. (author)

  1. Evaluating uncertainties in the cross-calibration of parallel ion chambers used in electron beam radiotherapy

    Anderson, Ernani; Travassos, Paulo; Ferreira, Max da Silva; Carvalho, Samira Marques de; Silva, Michele Maria da; Peixoto, Jose Guilherme Pereira; Salmon Junior, Helio Augusto

    2015-01-01

    This study aims to estimate the combined standard uncertainty for a parallel-plate detector used for the dosimetry of electron beams in linear accelerators for radiotherapy, which was calibrated by the cross-calibration method. Keeping the combined standard uncertainty close to the uncertainty stated in the calibration certificate of the reference chamber makes it possible to establish the calibration factor of the detector. The combined standard uncertainty obtained in this study was 2.5%. (author)

  2. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Weiyao Tang

    2018-01-01

    As large-scale hydropower projects are influenced by many factors, risk evaluations are complex. This paper considers a hydropower project as a complex system from the perspective of sustainability risk and divides it into three subsystems: the natural environment subsystem, the eco-environment subsystem, and the socioeconomic subsystem. Risk-related factors and the quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods: fuzzy (e.g., national health degree, national happiness degree, protection of cultural heritage), random (e.g., groundwater levels, river width), and fuzzy-random (e.g., runoff volumes, precipitation). By calculating the sustainability risk degree of each risk-related factor, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk factors are identified and targeted in order to reduce the losses they may cause to the hydropower project. A case study of the under-construction Baihetan hydropower station demonstrates the viability of the risk-evaluation model and provides a reference for the sustainability risk evaluation of other large-scale hydropower projects.

  3. Uncertainty Evaluation of Reactivity Coefficients for a large advanced SFR Core Design

    Khamakhem, Wassim; Rimpault, Gerald

    2008-01-01

    Sodium-cooled fast reactors are currently being reshaped in order to meet Generation IV goals on economics, safety and reliability, sustainability, and proliferation resistance. Recent studies have led to large SFR cores for 3600 MWth power plants, cores which exhibit interesting features. The designs have had to balance competing aspects such as sustainability and safety characteristics: sustainability in neutronic terms translates into a positive breeding gain, and safety into rather low Na void reactivity effects. The studies were done on two SFR concepts using oxide and carbide fuels, applying the sensitivity theory in the ERANOS deterministic code system. Calculations were performed with different sodium evaluations (JEF2.2, ERALIB-1, and the more recent JEFF3.1 and ENDF/B-VII) in order to make a broad comparison. Values for the Na void reactivity effect exhibit differences as large as 14% between the sodium libraries. Uncertainties in the reactivity coefficients due to nuclear data were computed with BOLNA variance-covariance data; the Na void effect uncertainty is close to 12% at 1σ. Since these uncertainties are far beyond the target accuracy for a design achieving high performance, two directions are envisaged: the first is to perform new differential measurements; the second is to use integral experiments to improve the nuclear data set and its uncertainties, as was done in the past with ERALIB1. (authors)
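
    Sensitivity-based uncertainty propagation of this kind typically uses the "sandwich rule", var(R) = S^T V S, folding sensitivities into a cross-section covariance matrix. The sketch below (Python) illustrates the rule only; the sensitivities and covariances are invented placeholders, not BOLNA data.

        import numpy as np

        # S: relative sensitivities of the Na void effect to three cross sections
        S = np.array([0.8, -0.3, 0.5])
        rel_sd = np.array([0.05, 0.10, 0.08])   # relative 1-sigma uncertainties
        corr = np.array([[1.0, 0.2, 0.0],
                         [0.2, 1.0, -0.1],
                         [0.0, -0.1, 1.0]])
        V = np.outer(rel_sd, rel_sd) * corr     # relative covariance matrix

        u_rel = np.sqrt(S @ V @ S)              # sandwich rule
        print(f"relative uncertainty of the effect = {100 * u_rel:.1f}%")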

  4. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    Höllermann, Britta; Evers, Mariele

    2017-04-01

    Planning and decision-making under uncertainty is common in water management due to climate variability, simplified models, societal developments, and planning restrictions, to name a few. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches to acknowledge uncertainties throughout the management process. The current understanding is that science focuses more strongly on the former approach, while policy and practice more actively apply risk-based approaches to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines, how they evaluate the role of uncertainties in their decisions, and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided insight into their perspectives on uncertainty handling, allowing a comparison of strategies between science and practice as well as between different types of practitioners. Our results confirmed the practitioners' bottom-up approach, working upwards from potential measures rather than downwards from impact assessments as is common in science-based approaches. This science-practice gap may hinder effective uncertainty integration and acknowledgement in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. However, the study showed that a practitioner's level of uncertainty recognition varies with the type of employer and business unit, affecting the degree of the science-practice gap with respect to uncertainty recognition. The level of working

  5. Evaluation of measuring results, statement of uncertainty in dosimeter calibrations

    Reich, H.

    1978-05-01

    The method described starts from the requirement that the quantitative statement of a measuring result in dosimetry should contain at least three figures: 1) the measured value, or the best estimate of the quantity to be measured; 2) the uncertainty of this value, given as a range around the measured value; and 3) the confidence level of this range, i.e. the probability that the (unknown) correct value lies within the given uncertainty range. How figures 2) and 3) can be obtained and how they should be quoted in calibration certificates is the subject of these lectures. In addition, the means by which the method may be extended to determining the uncertainty of a measurement performed under conditions which deviate from the calibration conditions is briefly described. (orig.) [de]

  6. Evaluating data worth for ground-water management under uncertainty

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models, a chance-constrained ground-water management model and an integer-programming sampling network design model, to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) the optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information, i.e., the projected reduction in management costs, with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with the greatest net economic benefit for ground-water management.

  7. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty.

    Flores-Alsina, Xavier; Rodríguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2008-11-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (economic, environmental, technical, and legal) must be taken into account at the same time, i.e. the evaluation of the alternatives is a multi-criteria problem. Activated sludge models are not well characterized, and some of the parameters can present uncertainty, e.g. the influent fractions arriving at the facility and the effect of either temperature or toxic compounds on the kinetic parameters, which have a strong influence on the model predictions used during the evaluation of the alternatives and affect the resulting rank of preferences. Using a simplified version of the IWA Benchmark Simulation Model No. 2 as a case study, this article shows the variations in decision making when the uncertainty in activated sludge model (ASM) parameters is either included or not during the evaluation of WWTP control strategies. The paper comprises two main sections. First, six WWTP control strategies are evaluated using multi-criteria decision analysis with the ASM parameters set at their default values. In the following section, uncertainty is introduced: input uncertainty is characterized by probability distribution functions based on the available process knowledge, and Monte Carlo simulations are run to propagate it through the model to the different outcomes. Thus (i) the variation in the overall degree of satisfaction of the control objectives for the generated WWTP control strategies is quantified, (ii) the contributions of environmental, legal, technical and economic objectives to the existing variance are identified, and finally (iii) the influence of the relative importance of the control objectives during the selection of alternatives is analyzed. The results show that the control strategies with an external carbon source reduce the output uncertainty

  8. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty

    Maxwell, Sean L.; Rhodes, Jonathan R.; Runge, Michael C.; Possingham, Hugh P.; Ng, Chooi Fei; McDonald Madden, Eve

    2015-01-01

    Conservation decision-makers face a trade-off between spending limited funds on direct management action, or gaining new information in an attempt to improve management performance in the future. Value-of-information analysis can help to resolve this trade-off by evaluating how much management performance could improve if new information was gained. Value-of-information analysis has been used extensively in other disciplines, but there are only a few examples where it has informed conservation planning, none of which have used it to evaluate the financial value of gaining new information. We address this gap by applying value-of-information analysis to the management of a declining koala Phascolarctos cinereus population. Decision-makers responsible for managing this population face uncertainty about survival and fecundity rates, and about how habitat cover affects mortality threats. The value of gaining new information about these uncertainties was calculated using a deterministic matrix model of the koala population to find the expected population growth rate if koala mortality threats were optimally managed under alternative model hypotheses, which represented the uncertainties faced by koala managers. Gaining new information about survival and fecundity rates and the effect of habitat cover on mortality threats will do little to improve koala management. Across a range of management budgets, no more than 1.7% of the budget should be spent on resolving these uncertainties. The value of information was low because optimal management decisions were not sensitive to the uncertainties we considered. Decisions were instead driven by a substantial difference in the cost efficiency of management actions. The value of information was up to forty times higher when the cost efficiencies of different koala management actions were similar. Synthesis and applications. This study evaluates the ecological and financial benefits of gaining new information to inform a conservation
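
    The core value-of-information calculation can be illustrated with a minimal expected-value-of-perfect-information (EVPI) sketch; the hypothesis set, prior weights, and growth-rate table below are hypothetical, not the koala study's numbers:

      import numpy as np

      # rows: alternative model hypotheses; columns: management actions;
      # entries: expected population growth rate (hypothetical values)
      outcomes = np.array([[1.02, 0.97, 0.99],
                           [0.98, 1.01, 0.99],
                           [1.00, 0.96, 1.03]])
      prior = np.array([0.5, 0.3, 0.2])   # current belief over hypotheses

      best_under_uncertainty = max(prior @ outcomes[:, a] for a in range(outcomes.shape[1]))
      value_with_perfect_info = prior @ outcomes.max(axis=1)
      evpi = value_with_perfect_info - best_under_uncertainty
      print(f"EVPI = {evpi:.4f}")   # near zero -> resolving uncertainty buys little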

  9. Uncertainty evaluation in 2008 IAEA proficiency test using phosphogypsum

    Dias, Fabiana F.; Taddei, Maria Helena T.; Geraldo, Bianca; Jacomino, Vanusa M.F.; Pontedeiro, Elizabeth M.B.

    2009-01-01

    LAPOC participated in the 2008 IAEA ALMERA (Analytical Laboratories for the Measurement of Environmental Radioactivity) Proficiency Test (PT) for phosphogypsum, a NORM (Naturally Occurring Radioactive Material) from the phosphate industry that is an abundant, low-cost solid waste. Its reutilization would avoid environmental impact in the large areas where the product is stored, so research into possible uses for phosphogypsum is ever more important from economic, technological, and environmental points of view. This paper describes results from this Proficiency Test (measured radionuclides: U-234, U-238, Ra-226, Th-230, and Pb-210), as well as a short description of the nuclear analytical techniques emphasizing sources of uncertainty, namely Alpha Spectrometry (Alpha Analyst, Canberra, surface barrier detectors) and Gamma Spectrometry (Canberra, Hyper Pure Germanium Detector with 45% efficiency). Corrections for decay, reference date, and recovery were applied. As an example, results obtained for Pb-210 through the use of a specific uncertainty calculation software are presented. Each parameter whose uncertainty is quantified was carefully described, with appropriate numerical value and unit, to determine its partial contribution to the combined total uncertainty. Results from PTs provide independent information on the performance of a laboratory and have an important role in method validation, especially because they allow assessment of the method performance over an entire range of concentrations and matrices. PTs are an important tool to demonstrate equivalence of measurements, if not their metrological comparability, and to promote education and improvement of laboratory practice. (author)
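
    A GUM-style budget of this kind reduces to combining the partial contributions in quadrature. A minimal sketch with hypothetical relative standard uncertainties (not the PT's actual budget):

      from math import sqrt

      # hypothetical relative standard uncertainties for an activity measurement
      budget = {"counting statistics": 0.030,
                "detector efficiency": 0.020,
                "chemical recovery": 0.015,
                "decay/reference-date correction": 0.005,
                "sample mass": 0.002}

      u_c = sqrt(sum(u**2 for u in budget.values()))   # combined standard uncertainty
      for source, u in sorted(budget.items(), key=lambda kv: -kv[1]):
          print(f"{source:32s} {100 * (u / u_c)**2:5.1f} % of variance")
      print(f"combined: {u_c:.3f} (relative); expanded, k=2: {2 * u_c:.3f}")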

  10. Using hybrid method to evaluate the green performance in uncertainty.

    Tseng, Ming-Lang; Lan, Lawrence W; Wang, Ray; Chiu, Anthony; Cheng, Hui-Ping

    2011-04-01

    Green performance measurement is vital for enterprises in making continuous improvements to maintain sustainable competitive advantages. Evaluation of green performance, however, is a challenging task due to the complex dependences among the aspects and criteria, together with the linguistic vagueness of some qualitative information and quantitative data. To deal with this issue, this study proposes a novel approach to evaluate the dependence aspects and criteria of a firm's green performance. The rationale of the proposed approach, namely the green network balanced scorecard, is to use the balanced scorecard to combine fuzzy set theory with the analytical network process (ANP) and importance-performance analysis (IPA) methods, wherein fuzzy set theory accounts for the linguistic vagueness of qualitative criteria and ANP converts the relations among the dependence aspects and criteria into an intelligible structural model used in IPA. For the empirical case study, four dependence aspects and 34 green performance criteria for PCB firms in Taiwan were evaluated. The managerial implications are discussed.

  11. Evaluation of incremental reactivity and its uncertainty in Southern California.

    Martien, Philip T; Harley, Robert A; Milford, Jana B; Russell, Armistead G

    2003-04-15

    The incremental reactivity (IR) and relative incremental reactivity (RIR) of carbon monoxide and 30 individual volatile organic compounds (VOC) were estimated for the South Coast Air Basin using two photochemical air quality models: a 3-D, grid-based model and a vertically resolved trajectory model. Both models include an extended version of the SAPRC99 chemical mechanism. For the 3-D modeling, the decoupled direct method (DDM-3D) was used to assess reactivities. The trajectory model was applied to estimate uncertainties in reactivities due to uncertainties in chemical rate parameters, deposition parameters, and emission rates using Monte Carlo analysis with Latin hypercube sampling. For most VOC, RIRs were found to be consistent in rankings with those produced by Carter using a box model. However, 3-D simulations show that coastal regions, upwind of most of the emissions, have comparatively low IR but higher RIR than predicted by box models for C4-C5 alkenes and carbonyls that initiate the production of HOx radicals. Biogenic VOC emissions were found to have a lower RIR than predicted by box model estimates, because emissions of these VOC were mostly downwind of the areas of primary ozone production. Uncertainties in RIR of individual VOC were found to be dominated by uncertainties in the rate parameters of their primary oxidation reactions. The coefficient of variation (COV) of most RIR values ranged from 20% to 30%, whereas the COV of absolute incremental reactivity ranged from about 30% to 40%. In general, uncertainty and variability both decreased when relative rather than absolute reactivity metrics were used.

  12. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of the uncertainty of the thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of the measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, correlations between input quantities, as well as the uncertainty attributed to thermal effects, neither of which was included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and to vary widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.

  13. Evaluation of the measurement uncertainty when measuring the resistance of solid isolating materials to tracking

    Stare, E.; Beges, G.; Drnovsek, J.

    2006-07-01

    This paper presents the results of research into the measurement of the resistance of solid isolating materials to tracking. Two types of tracking index were investigated: the proof tracking index (PTI) and the comparative tracking index (CTI). Evaluation of the measurement uncertainty in a case study was performed using a test method in accordance with the IEC 60112 standard. In the scope of the tests performed here, this particular test method was used to ensure the safety of electrical appliances. According to the EN ISO/IEC 17025 standard, in the process of conformity assessment, the evaluation of the measurement uncertainty of the test method should be carried out. In the present article, possible influential parameters that are in accordance with the third and fourth editions of the standard IEC 60112 are discussed. The differences, ambiguities or lack of guidance in both editions of the standard are described in the article 'Ambiguities in technical standards—case study IEC 60112—measuring the resistance of solid isolating materials to tracking' (submitted for publication). Several hundred measurements were taken in the present experiments in order to form the basis for the results and conclusions presented. A specific problem of the test (according to the IEC 60112 standard) is the great variety of influential physical parameters (mechanical, electrical, chemical, etc) that can affect the results. The present article therefore ends with a histogram containing information on the contributions to the measurement uncertainty.

  14. Evaluation of uncertainties in selected environmental dispersion models

    Little, C.A.; Miller, C.W.

    1979-01-01

    Compliance with standards of radiation dose to the general public has necessitated the use of dispersion models to predict radionuclide concentrations in the environment due to releases from nuclear facilities. Because these models are only approximations of reality and because of inherent variations in the input parameters used in these models, their predictions are subject to uncertainty. Quantification of this uncertainty is necessary to assess the adequacy of these models for use in determining compliance with protection standards. This paper characterizes the capabilities of several dispersion models to accurately predict pollutant concentrations in environmental media. Three types of models are discussed: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations.

  15. Uncertainty and Preference Modelling for Multiple Criteria Vehicle Evaluation

    Qiuping Yang

    2010-12-01

    A general framework for vehicle assessment is proposed based on both mass survey information and the evidential reasoning (ER) approach. Several methods for uncertainty and preference modelling are developed within the framework, including the measurement of uncertainty caused by missing information, the estimation of missing information in original surveys, the use of nonlinear functions for data mapping, and the use of nonlinear functions as utility functions to combine distributed assessments into a single index. The results of the investigation show that various measures can be used to represent the different preferences of decision makers towards the same feedback from respondents. Based on the ER approach, credible and informative analysis can be conducted through a complete understanding of the assessment problem in question and a full exploration of the available information.

  16. Uncertainty evaluation of reliability of shutdown system of a medium size fast breeder reactor

    Zeliang, Chireuding; Singh, Om Pal, E-mail: singhop@iitk.ac.in; Munshi, Prabhat

    2016-11-15

    Highlights: • Uncertainty analysis of the reliability of the Shutdown System is carried out. • The Monte Carlo method of sampling is used. • The effects of various reliability improvement measures of the SDS are accounted for. - Abstract: In this paper, results are presented on the uncertainty evaluation of the reliability of the Shutdown System (SDS) of a Medium Size Fast Breeder Reactor (MSFBR). The reliability analysis builds on the results of Kumar et al. (2005). The failure rates of the components of the SDS are taken from the international literature and are assumed to follow a log-normal distribution. The fault tree method is employed to propagate the uncertainty in failure rate from the component level to the shutdown-system level. The beta factor model is used to account for different extents of diversity. The Monte Carlo sampling technique is used for the analysis. The results of the uncertainty analysis are presented in terms of the probability density function, cumulative distribution function, mean, variance, percentile values, confidence intervals, etc. It is observed that the spread in the probability distribution of the SDS failure rate is less than that of the SDS component failure rates, and ninety percent of the SDS failure rate values fall below the target value. As generic values of failure rates are used, a sensitivity analysis is performed with respect to the failure rate of the control and safety rods and the beta factor. It is found that a large increase in the failure rate of the SDS rods does not propagate proportionately to the SDS failure rate. The failure rate of the SDS is very sensitive to the beta factor for common cause failure between the two systems of the SDS. The results of the study provide insight into the propagation of uncertainty from the failure rates of SDS components to the failure rate of the shutdown system.
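
    A minimal sketch of the sampling scheme described above: component unavailabilities are drawn from a log-normal distribution and combined for a two-train system with a beta-factor common-cause term. The medians, error factor, and beta range are illustrative assumptions, not the MSFBR values:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      median, error_factor = 1e-3, 3.0                 # per-demand unavailability, assumed
      sigma = np.log(error_factor) / 1.645             # EF defined at the 95th percentile
      q = rng.lognormal(np.log(median), sigma, size=(n, 2))   # two diverse trains
      beta = rng.uniform(0.01, 0.1, size=n)            # common-cause beta factor, assumed

      # beta-factor model: common-cause part plus independent failure of both trains
      q_system = beta * q.mean(axis=1) + ((1 - beta[:, None]) * q).prod(axis=1)

      print(f"mean: {q_system.mean():.2e}")
      print("5th/50th/95th percentiles:", np.percentile(q_system, [5, 50, 95]))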

  17. The use of kragten spreadsheets for uncertainty evaluation of uranium potentiometric analysis by the Brazilian Safeguards Laboratory

    Silva, Jose Wanderley S. da; Barros, Pedro Dionisio de; Araujo, Radier Mario S. de

    2009-01-01

    In safeguards, independent analysis of the uranium content and enrichment of nuclear materials to verify operators' declarations is an important tool to evaluate the accountability system applied by nuclear installations. This determination may be performed by nondestructive (NDA) methods, generally done in the field using portable radiation detection systems, or by destructive (DA) methods, i.e. chemical analysis, when more accurate and precise results are necessary. Samples for DA analysis are collected by inspectors during safeguards inspections and sent to the Safeguards Laboratory (LASAL) of the Brazilian Nuclear Energy Commission (CNEN), where the analyses take place. The method used by LASAL for the determination of uranium in different physical and chemical forms is the Davies and Gray/NBL method, using an automatic potentiometric titrator that performs the titration of uranium(IV) by a standard solution of K2Cr2O7. Uncertainty budgets have been determined based on the concepts of the ISO 'Guide to the Expression of Uncertainty in Measurement' (GUM). In order to simplify the calculation of the uncertainty, a computational tool named the Kragten spreadsheet was used. The spreadsheet uses the concepts established by the GUM and provides results that numerically approximate those obtained by propagation of uncertainty with analytically determined sensitivity coefficients. The main parameters (input quantities) influencing the uncertainty were studied. In order to evaluate their contribution to the final uncertainty, the uncertainties of all steps of the analytical method were estimated and compiled. (author)
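
    The Kragten scheme itself is easy to reproduce outside a spreadsheet: shift each input quantity by its standard uncertainty, record the output shift, and combine the shifts in quadrature. The Python sketch below uses a toy titration-style measurement model with illustrative numbers, not the LASAL procedure:

      from math import sqrt

      def uranium_result(v_titrant, c_titrant, m_sample):
          """Toy titration-style measurement model (illustrative, not Davies and Gray/NBL)."""
          return v_titrant * c_titrant * 238.03 / m_sample

      inputs = {"v_titrant": (10.00, 0.02),     # (value, standard uncertainty), assumed
                "c_titrant": (0.0250, 0.0001),
                "m_sample": (1.0000, 0.0005)}

      x0 = {name: value for name, (value, _) in inputs.items()}
      y0 = uranium_result(**x0)

      shifts = {}
      for name, (value, u) in inputs.items():
          shifted = dict(x0, **{name: value + u})   # Kragten: shift one input by u
          shifts[name] = uranium_result(**shifted) - y0

      u_y = sqrt(sum(s**2 for s in shifts.values()))
      print(f"y = {y0:.4f}, u(y) = {u_y:.4f}")
      for name, s in shifts.items():
          print(f"  {name:10s} {100 * s**2 / u_y**2:5.1f} % of variance")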

  18. How should epistemic uncertainty in modelling water resources management problems shape evaluations of their operations?

    Dobson, B.; Pianosi, F.; Reed, P. M.; Wagener, T.

    2017-12-01

    In previous work, we have found that water supply companies are typically hesitant to use reservoir operation tools to inform their release decisions. We believe that this is, in part, due to a lack of faith in the fidelity of the optimization exercise with regard to its ability to represent the real world. In an attempt to quantify this, recent literature has studied the impact on performance of uncertainty arising in: forcing (e.g. reservoir inflows), parameters (e.g. parameters for the estimation of evaporation rate) and objectives (e.g. worst first percentile or worst case). We suggest that there is also epistemic uncertainty in the choices made during model creation, for example in the formulation of an evaporation model or the aggregation of regional storages. We create 'rival framings' (a methodology originally developed to demonstrate the impact of uncertainty arising from alternative objective formulations), each with different modelling choices, and determine their performance impacts. We identify the Pareto approximate set of policies for several candidate formulations and then make them compete with one another in a large ensemble re-evaluation in each other's modelled spaces. This enables us to distinguish the impacts of different structural changes in the model used to evaluate system performance, in an effort to generalize the validity of the optimized performance expectations.

  19. Object-oriented software for evaluating measurement uncertainty

    Hall, B. D.

    2013-05-01

    An earlier publication (Hall 2006 Metrologia 43 L56-61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language.
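
    For readers who want to experiment with the uncertain-number idea, the third-party Python package uncertainties (not the author's software) offers a similar automated propagation; the quantities and values below are illustrative:

      # pip install uncertainties -- third-party package, not the author's tools
      from uncertainties import ufloat

      v = ufloat(4.999, 0.0032)        # voltage, V (illustrative values)
      i = ufloat(0.019661, 0.0000095)  # current, A

      r = v / i    # resistance: uncertainty propagated automatically
      p = v * i    # power: correlated with r through v and i
      print(r)                      # value +/- standard uncertainty
      print(p)
      print(r.error_components())   # contribution of each input to u(r)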


  1. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    Arnold, B. W.; Gardner, P.

    2013-12-01

    Calibration of groundwater flow models for the purpose of evaluating flow and aquifer heterogeneity typically uses observations of hydraulic head in wells and appropriate boundary conditions. Environmental tracers have a wide variety of decay rates and input signals in recharge, resulting in a potentially broad source of additional information to constrain flow rates and heterogeneity. A numerical study was conducted to evaluate the reduction in uncertainty during model calibration using observations of various environmental tracers and combinations of tracers. A synthetic data set was constructed by simulating steady groundwater flow and transient tracer transport in a high-resolution, 2-D aquifer with heterogeneous permeability and porosity using the PFLOTRAN software code. Data on pressure and tracer concentration were extracted at well locations and then used as observations for automated calibration of a flow and transport model using the pilot point method and the PEST code. Optimization runs were performed to estimate parameter values of permeability at 30 pilot points in the model domain for cases using 42 observations of: 1) pressure, 2) pressure and CFC11 concentrations, 3) pressure and Ar-39 concentrations, and 4) pressure, CFC11, Ar-39, tritium, and He-3 concentrations. Results show significantly lower uncertainty, as indicated by the 95% linear confidence intervals, in permeability values at the pilot points for cases including observations of environmental tracer concentrations. The average linear uncertainty range for permeability at the pilot points using pressure observations alone is 4.6 orders of magnitude, using pressure and CFC11 concentrations is 1.6 orders of magnitude, using pressure and Ar-39 concentrations is 0.9 order of magnitude, and using pressure, CFC11, Ar-39, tritium, and He-3 concentrations is 1.0 order of magnitude. Data on Ar-39 concentrations result in the greatest parameter uncertainty reduction because its half-life of 269 years is comparable to the groundwater residence times in the model domain.

  2. Exploring the uncertainty in attributing sediment contributions in fingerprinting studies due to uncertainty in determining element concentrations in source areas.

    Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David

    2016-04-01

    One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and sampling density. Researchers have limited control over these factors, and usually sampling density tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight about the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. For this study a virtual
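
    The essence of such a 'virtual' experiment can be shown with a two-source, one-tracer sketch in which the true apportionment is fixed by construction, so the spread of the recovered proportion isolates the effect of source variability and sampling density; all numbers below are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(7)
      n_mc, n_samples = 10_000, 5           # Monte Carlo draws; samples per source area

      true_p = 0.6                          # known apportionment, fixed by construction
      mu_a, mu_b, sd = 40.0, 10.0, 4.0      # source means and within-source sd (mg/kg)
      c_mix = true_p * mu_a + (1 - true_p) * mu_b

      c_a = rng.normal(mu_a, sd, (n_mc, n_samples)).mean(axis=1)   # estimated source means
      c_b = rng.normal(mu_b, sd, (n_mc, n_samples)).mean(axis=1)
      p_hat = (c_mix - c_b) / (c_a - c_b)                          # unmixing solution

      lo, med, hi = np.percentile(p_hat, [5, 50, 95])
      print(f"true p = {true_p}; recovered: median {med:.3f}, 90% interval [{lo:.3f}, {hi:.3f}]")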

  3. Numerical Continuation Methods for Intrusive Uncertainty Quantification Studies

    Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Phipps, Eric Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-09-01

    Rigorous modeling of engineering systems relies on efficient propagation of uncertainty from input parameters to model outputs. In recent years, there has been substantial development of probabilistic polynomial chaos (PC) Uncertainty Quantification (UQ) methods, enabling studies in expensive computational models. One approach, termed "intrusive", involving reformulation of the governing equations, has been found to have superior computational performance compared to non-intrusive sampling-based methods in relevant large-scale problems, particularly in the context of emerging architectures. However, the utility of intrusive methods has been severely limited due to detrimental numerical instabilities associated with strong nonlinear physics. Previous methods for stabilizing these constructions tend to add unacceptably high computational costs, particularly in problems with many uncertain parameters. In order to address these challenges, we propose to adapt and improve numerical continuation methods for the robust time integration of intrusive PC system dynamics. We propose adaptive methods, starting with a small uncertainty for which the model has stable behavior and gradually moving to larger uncertainty where the instabilities are rampant, in a manner that provides a suitable solution.

  4. Verification and uncertainty evaluation of HELIOS/MASTER nuclear design system

    Song, Jae Seung; Kim, J. C.; Cho, B. O. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    A nuclear design system, HELIOS/MASTER, was established and core follow calculations were performed for Yonggwang Unit 1 cycles 1 through 7 and Yonggwang Unit 3 cycles 1 through 2. The accuracy of the HELIOS/MASTER system was evaluated by estimating the uncertainties of reactivity and peaking factors, and by comparing the maximum differences of the isothermal temperature coefficient, inverse boron worth, and control rod worth with the CASMO-3/MASTER uncertainties. The reactivity uncertainty was estimated to be 362 pcm, and the uncertainties of the three-dimensional, axially integrated radial, and planar peaking factors were evaluated to be 0.048, 0.034, and 0.044 in relative power units, respectively. The maximum differences of the isothermal temperature coefficient, inverse boron worth, and control rod worth were within the CASMO-3/MASTER uncertainties. 17 refs., 17 figs., 10 tabs. (Author)

  5. Monte Carlo parameter studies and uncertainty analyses with MCNP5

    Brown, F. B.; Sweezy, J. E.; Hayes, R.

    2004-01-01

    A software tool called mcnp_pstudy has been developed to automate the setup, execution, and collection of results from a series of MCNP5 Monte Carlo calculations. This tool provides a convenient means of performing parameter studies, total uncertainty analyses, parallel job execution on clusters, stochastic geometry modeling, and other types of calculations where a series of MCNP5 jobs must be performed with varying problem input specifications. (authors)

  6. Internal dose assessments: Uncertainty studies and update of ideas guidelines and databases within CONRAD project

    Marsh, J. W.; Castellani, C. M.; Hurtgen, C.; Lopez, M. A.; Andrasi, A.; Bailey, M. R.; Birchall, A.; Blanchardon, E.; Desai, A. D.; Dorrian, M. D.; Doerfel, H.; Koukouliou, V.; Luciani, A.; Malatova, I.; Molokanov, A.; Puncher, M.; Vrba, T.

    2008-01-01

    The work of Task Group 5.1 (uncertainty studies and revision of IDEAS guidelines) and Task Group 5.5 (update of IDEAS databases) of the CONRAD project is described. Scattering factor (SF) values (i.e. measurement uncertainties) have been calculated for different radionuclides and types of monitoring data using real data contained in the IDEAS Internal Contamination Database. Based upon this work and other published values, default SF values are suggested. Uncertainty studies have been carried out using both a Bayesian approach and a frequentist (classical) approach. The IDEAS guidelines have been revised in areas relating to the evaluation of an effective AMAD; guidance is given on evaluating wound cases with the NCRP wound model, and suggestions are made on the number and type of measurements required for dose assessment. (authors)

  7. Uncertainty theory

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  8. Sensitivity and uncertainty studies of the CRAC2 computer code

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1987-01-01

    The authors have studied the sensitivity of health impacts from nuclear reactor accidents, as predicted by the CRAC2 computer code, to the following sources of uncertainty: (1) the model for plume rise, (2) the model for wet deposition, (3) the meteorological bin-sampling procedure for selecting weather sequences with rain, (4) the dose conversion factors for inhalation as affected by uncertainties in the particle size of the carrier aerosol and the clearance rates of radionuclides from the respiratory tract, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for terrestrial foodchain pathways. Predicted health impacts usually showed little sensitivity to use of an alternative plume-rise model or a modified rain-bin structure in bin-sampling. Health impacts often were quite sensitive to use of an alternative wet-deposition model in single-trial runs with rain during plume passage, but were less sensitive to the model in bin-sampling runs. Uncertainties in the inhalation dose conversion factors had important effects on early injuries in single-trial runs. Latent cancer fatalities were moderately sensitive to uncertainties in the weathering half-time for ground-surface exposures, but showed little sensitivity to the transfer coefficients for terrestrial foodchain pathways. Sensitivities of CRAC2 predictions to uncertainties in the models and parameters also depended on the magnitude of the source term, and some of the effects on early health effects were comparable to those that were due only to selection of different sets of weather sequences in bin-sampling

  9. Robust uncertainty evaluation for system identification on distributed wireless platforms

    Crinière, Antoine; Döhler, Michael; Le Cam, Vincent; Mevel, Laurent

    2016-04-01

    Health monitoring of civil structures by system identification procedures from automatic control is now accepted as a valid approach. These methods provide frequencies and modeshapes from the structure over time. For continuous monitoring, the excitation of a structure is usually ambient, thus unknown and assumed to be noise. Hence, all estimates from the vibration measurements are realizations of random variables with inherent uncertainty due to (unknown) process and measurement noise and finite data length. The underlying algorithms usually run under Matlab, assuming a large memory pool and considerable computational power. Even under these premises, computational and memory usage are heavy and unrealistic for embedding in on-site sensor platforms such as the PEGASE platform. Moreover, the current push for distributed wireless systems calls for algorithmic adaptation to lower data exchanges and maximize local processing. Finally, a recent breakthrough in system identification allows us to process both frequency information and its related uncertainty together from one and only one data sequence, at the expense of a computational and memory explosion that requires even more careful attention than before. The current approach focuses on a system identification procedure called multi-setup subspace identification that processes both frequencies and their related variances from a set of interconnected wireless systems, with all computation running locally within the limited memory pool of each system before being merged on a host supervisor. Careful attention is given to data exchanges and I/O satisfying OGC standards, as well as to minimizing memory footprints and maximizing computational efficiency. These systems are built for autonomous operation in the field and could later be included in a wide distributed architecture such as the Cloud2SM project. The usefulness of these strategies is illustrated on

  10. Evaluation of measurement uncertainty for purity of a monoterpenic acid by small-scale coulometry

    Norte, L. C.; de Carvalho, E. M.; Tappin, M. R. R.; Borges, P. P.

    2018-03-01

    The purity of perillic acid (HPe), a monoterpenic acid from a natural product (NP) with anti-inflammatory and anticancer properties, was analyzed by small-scale coulometry (SSC), motivated by the low availability of HPe on the pharmaceutical market and its high cost. This work presents the evaluation of the measurement uncertainty of the purity of HPe obtained by SSC. The coulometric mean purity obtained from 5 replicates was 94.23% ± 0.88% (k = 2.06, for an approximately 95% confidence level). These studies aim, in the future, to support the production of certified reference materials from NPs.
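
    A sketch of how such an expanded uncertainty can be evaluated from replicates: a Type A component from the replicate scatter is combined with an assumed Type B component, and the coverage factor is taken from the t-distribution with the Welch-Satterthwaite effective degrees of freedom (values are illustrative, not the paper's data):

      import numpy as np
      from scipy import stats

      replicates = np.array([94.1, 94.6, 93.8, 94.4, 94.2])    # purity, % (illustrative)
      mean = replicates.mean()
      u_a = replicates.std(ddof=1) / np.sqrt(len(replicates))  # Type A, dof = n - 1
      u_b, dof_b = 0.25, 50                                    # Type B component (assumed)

      u_c = np.hypot(u_a, u_b)
      dof_eff = u_c**4 / (u_a**4 / (len(replicates) - 1) + u_b**4 / dof_b)
      k = stats.t.ppf(0.975, dof_eff)                          # ~95 % coverage factor
      print(f"purity = {mean:.2f} % +/- {k * u_c:.2f} % (k = {k:.2f})")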

  11. Sensitivity and uncertainty studies of the CRAC2 computer code

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-05-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity studies include (1) the model for plume rise, (2) the model for wet deposition, (3) the procedure for meteorological bin-sampling involving the selection of weather sequences that contain rain, (4) the dose conversion factors for inhalation as they are affected by uncertainties in the physical and chemical form of the released radionuclides, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for estimating exposures via terrestrial foodchain pathways. The sensitivity studies were performed for selected radionuclide releases, hourly meteorological data, land-use data, a fixed non-uniform population distribution, a single evacuation model, and various release heights and sensible heat rates. Two important general conclusions from the sensitivity and uncertainty studies are as follows: (1) The large effects on predicted early fatalities and early injuries that were observed in some of the sensitivity studies apparently are due in part to the presence of thresholds in the dose-response models. Thus, the observed sensitivities depend in part on the magnitude of the radionuclide releases. (2) Some of the effects on predicted early fatalities and early injuries that were observed in the sensitivity studies were comparable to effects that were due only to the selection of different sets of weather sequences in bin-sampling runs. 47 figs., 50 tabs

  12. Evaluation of uncertainty in geological framework models at Yucca Mountain, Nevada

    Bagtzoglou, A.C.; Stirewalt, G.L.; Henderson, D.B.; Seida, S.B.

    1995-01-01

    The first step towards determining compliance with the performance objectives for both the repository system and the geologic setting at Yucca Mountain requires the development of detailed geostratigraphic models. This paper proposes an approach for the evaluation of the degree of uncertainty inherent in geologic maps and associated three-dimensional geological models. Following this approach, an assessment of accuracy and completeness of the data and evaluation of conceptual uncertainties in the geological framework models can be performed

  13. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered to be negligible. Before optimization the dominant source of uncertainty was the tooling design, after optimization the dominant source was thermal expansion of the engine; meaning that no further improvement can be made without measurement in a temperature controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as the improved reliability of these products. (paper)

  14. STUDY ON MODELING AND VISUALIZING THE POSITIONAL UNCERTAINTY OF REMOTE SENSING IMAGE

    W. Jiao

    2016-06-01

    Uncertainty is inevitably introduced during the process of data acquisition. The traditional way to evaluate geometric positioning accuracy is statistical, represented by the root mean square errors (RMSEs) of control points. This measure is pointwise and discontinuous, so it is difficult to describe the spatial distribution of the error. In this paper, the error uncertainty of each control point is deduced, and a spatial distribution model of the uncertainty at any arbitrary point is established. The error model is proposed to evaluate the geometric accuracy of remote sensing images. Several visualization methods are then studied to represent the discrete and continuous data of geometric uncertainties. The experiments show that the proposed error distribution model yields results similar to the traditional RMSE method but without requiring the user to collect control points as checkpoints, and the error distribution information calculated by the model can be provided to users along with the geometric image data. Additionally, the visualization methods described in this paper effectively and objectively represent the geometric quality of the image and can help users investigate, to some extent, the sources of the image uncertainties.

  15. Evaluation of mechanical precision and alignment uncertainties for an integrated CT/LINAC system

    Court, Laurence; Rosen, Isaac; Mohan, Radhe; Dong Lei

    2003-01-01

    A new integrated CT/LINAC combination, in which the CT scanner is inside the radiation therapy treatment room and the same patient couch is used for CT scanning and treatment (after a 180-degree couch rotation), should allow for accurate correction of interfractional setup errors. The purpose of this study was to evaluate the sources of uncertainties, and to measure the overall precision of this system. The following sources of uncertainty were identified: (1) the patient couch position on the LINAC side after a rotation, (2) the patient couch position on the CT side after a rotation, (3) the patient couch position as indicated by its digital readout, (4) the difference in couch sag between the CT and LINAC positions, (5) the precision of the CT coordinates, (6) the identification of fiducial markers from CT images, (7) the alignment of contours with structures in the CT images, and (8) the alignment with setup lasers. The largest single uncertainties (one standard deviation or 1 SD) were found in couch position on the CT side after a rotation (0.5 mm in the RL direction) and the alignment of contours with the CT images (0.4 mm in the SI direction). All other sources of uncertainty are less than 0.3 mm (1 SD). The overall precision of two setup protocols was investigated in a controlled phantom study. A protocol that relies heavily on the mechanical integrity of the system, and assumes a fixed relationship between the LINAC isocenter and the CT images, gave a predicted precision (1 SD) of 0.6, 0.7, and 0.6 mm in the SI, RL and AP directions, respectively. The second protocol reduces reliance on the mechanical precision of the total system, particularly the patient couch, by using radio-opaque fiducial markers to transfer the isocenter information from the LINAC side to the CT images. This protocol gave a slightly improved predicted precision of 0.5, 0.4, and 0.4 mm in the SI, RL and AP directions, respectively. The distribution of phantom position after CT
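
    Once the individual sources are quantified, the overall precision follows from a root-sum-square combination per direction; a short sketch with placeholder 1 SD values for the eight sources, not the study's measured values:

      import numpy as np

      # placeholder 1 SD values (mm) for the eight identified sources, SI direction
      sources_si = np.array([0.3, 0.4, 0.2, 0.2, 0.2, 0.1, 0.2, 0.1])
      print(f"overall SI precision (1 SD): {np.sqrt(np.sum(sources_si**2)):.1f} mm")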

  16. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

    Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty to achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar when the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications. The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to
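
    The model-weight updating that drives the active adaptive step can be sketched as a Bayes' rule update of the weights on competing structural hypotheses; the two-model setup and numbers below are hypothetical, not the horseshoe crab framework's values:

      import numpy as np

      weights = np.array([0.5, 0.5])                 # prior weights on two models
      predicted = np.array([45_000.0, 60_000.0])     # each model's predicted abundance
      sigma, observed = 8_000.0, 52_000.0            # observation model and outcome

      likelihood = np.exp(-0.5 * ((observed - predicted) / sigma) ** 2)
      weights = weights * likelihood / np.sum(weights * likelihood)
      print("posterior model weights:", np.round(weights, 3))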

  17. Integrative evaluation for sustainable decisions of urban wastewater system management under uncertainty

    Hadjimichael, A.; Corominas, L.; Comas, J.

    2017-12-01

    With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the ways the system is interpreted stochastically and in its natural, ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework has been presented in the literature that effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions into the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of bigger volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience. Population numbers appear to affect the estimated benefits most, followed by

  18. Uncertainty evaluation for IIR (infinite impulse response) filtering using a state-space approach

    Link, Alfred; Elster, Clemens

    2009-01-01

    A novel method is proposed for evaluating the uncertainty associated with the output of a discrete-time IIR filter when the input signal is corrupted by additive noise and the filter coefficients are uncertain. This task arises, for instance, when the noise-corrupted output of a measurement system is compensated by a digital filter which has been designed on the basis of the characteristics of the measurement system. We assume that the noise is either stationary or uncorrelated, and we presume knowledge about its autocovariance function or its time-dependent variances, respectively. Uncertainty evaluation is considered in line with the 'Guide to the Expression of Uncertainty in Measurement'. A state-space representation is used to derive a calculation scheme which allows the uncertainties to be evaluated in an easy way and also enables real-time applications. The proposed procedure is illustrated by an example
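
    A minimal sketch of a state-space covariance recursion for a first-order IIR filter with white input noise; uncertain filter coefficients, treated in the paper, would add a further term, and the coefficients and noise variance here are illustrative:

      import numpy as np

      # first-order IIR: y[k] = b0*x[k] + b1*x[k-1] - a1*y[k-1]; state s = [y[k-1], x[k-1]]
      b0, b1, a1 = 0.2, 0.2, -0.6
      q = 0.01                                   # variance of the white input noise

      A = np.array([[-a1, b1], [0.0, 0.0]])
      B = np.array([[b0], [1.0]])
      C = np.array([[-a1, b1]])
      D = b0

      P = np.zeros((2, 2))                       # state covariance
      for _ in range(50):
          u_y2 = (C @ P @ C.T)[0, 0] + D**2 * q  # output variance at this step
          P = A @ P @ A.T + B @ B.T * q
      print(f"steady-state output standard uncertainty: {np.sqrt(u_y2):.4f}")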

  19. Evaluation of the uncertainties associated to the in vivo monitoring of iodine-131 in the thyroid

    Gontijo, Rodrigo Modesto Gadelha; Lucena, Eder Augusto; Dantas, Ana Leticia A.; Dantas, Bernardo Maranhao [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The internal dose from the incorporation of radionuclides by humans can be estimated by in vivo direct measurements in the human body and by in vitro analysis of biological indicators. In vivo techniques consist of the identification and quantification of radionuclides present in the whole body and in specific organs and tissues. The results obtained in measurements may present small uncertainties which are within pre-set limits in monitoring programs for occupationally exposed individuals. This study aims to evaluate the sources of uncertainty associated with the results of in vivo monitoring of iodine-131 in the thyroid. The benchmarks adopted in this study are based on the criteria suggested by the General Guide for Estimating Effective Doses from Monitoring Data (Project IDEAS/European Community). The reference values used were the ones for high-energy photons (>100 keV). Besides the parameters suggested by the IDEAS Guide, the fluctuation of the counting due to phantom repositioning, which represents the reproducibility of the counting geometry, was also evaluated. Measurements were performed at the Whole Body Counter Unit of the IRD using a 3'' x 3'' NaI(Tl) scintillation detector and a neck-thyroid phantom developed at the In Vivo Monitoring Laboratory of the IRD. This phantom contains a standard source of barium-133 added to a piece of filter paper with the dimension and shape of a thyroid gland. Scattering factors were calculated and compared in different counting geometries. The results show that the technique studied presents reproducibility equivalent to the values suggested in the IDEAS Guide and measurement uncertainties compatible with international quality standards for this type of in vivo monitoring. (author)

  20. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.

  1. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading (overly conservative or optimistic). 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failures (MTBF); (b) epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.

  2. Evaluation and uncertainty estimates of Charpy-impact data

    Stallman, F.W.

    1982-01-01

    Shifts in transition temperature and upper-shelf energy from Charpy tests are used to determine the extent of radiation embrittlement in steels. In order to determine these parameters reliably and to obtain uncertainty estimates, curve fitting procedures need to be used. The hyperbolic tangent or similar models have been proposed to fit the temperature-impact-energy curve. These models are not based on the actual fracture mechanics and are indeed poorly suited in many applications. The results may be falsified by forcing an inflexible curve through too many data points. The nonlinearity of the fit poses additional problems. In this paper, a simple linear fit is proposed. By eliminating data which are irrelevant for the determination of a given parameter, better reliability and accuracy can be achieved. Additional input parameters like fluence and irradiation temperature can be included. This is important if there is a large variation of fluence and temperature in different test specimens. The method has been tested with Charpy specimens from the NRC-HSST experiments

  3. Evaluation of Spatial Uncertainties In Modeling of Cadastral Systems

    Fathi, Morteza; Teymurian, Farideh

    2013-04-01

    Cadastre plays an essential role in sustainable development, especially in developing countries like Iran. A well-developed Cadastre results in transparency of the estate tax system, transparency of estate data, a reduction of actions before the courts, and effective management of estates, natural resources and the environment. A multipurpose Cadastre, through the gathering of other related data, has a vital role in civil, economic and social programs and projects. Cadastre has been performed in Iran for many years, but success in this program depends on correct geometric and descriptive data of estates. Since there are various sources of data with different accuracy and precision in Iran, some difficulties and uncertainties exist in modeling the geometric part of the Cadastre, such as inconsistency between the data in deeds and the Cadastral map, which causes trouble in the execution of the Cadastre and results in the loss of national and natural resources and the rights of the nation. At present, there is no uniform and effective technical method for resolving such conflicts. This article describes various aspects of such conflicts in the geometric part of the Cadastre and suggests a solution through modeling tools of GIS.

  4. Degradation and performance evaluation of PV module in desert climate conditions with estimate uncertainty in measuring

    Fezzani Amor

    2017-01-01

    The performance of a photovoltaic (PV) module is affected by outdoor conditions. Outdoor testing consists of installing a module and collecting electrical performance data and climatic data over a certain period of time. It can also include the study of long-term performance under real working conditions. Tests were operated at URAER, located in the desert region of Ghardaïa (Algeria), which is characterized by high irradiation and temperature levels. The degradation of a PV module with temperature and time of exposure to sunlight contributes significantly to the final output from the module, as the output reduces each year. This paper presents a comparative study of different methods to evaluate the degradation of a PV module after long-term exposure of more than 12 years in a desert region, and calculates the uncertainties in measuring. First, the evaluation uses three methods: visual inspection, data given by the Solmetric PVA-600 Analyzer translated to Standard Test Conditions (STC), and the translation equations of IEC 60891. Second, degradation rates are calculated for all methods. Finally, the degradation rates given by the Solmetric PVA-600 analyzer, calculated by a simulation model, and calculated by the two IEC 60891 procedures (1 and 2) are compared. A detailed uncertainty study was carried out in order to improve the procedure and the measurement instrument.

  5. Uncertainty evaluation in the chloroquine phosphate potentiometric titration: application of three different approaches.

    Rodomonte, Andrea Luca; Montinaro, Annalisa; Bartolomei, Monica

    2006-09-11

    A measurement result cannot be properly interpreted if it is not accompanied by its uncertainty. Several methods to estimate uncertainty have been developed, and three of them in particular were chosen in this work to estimate the uncertainty of the Eu. Ph. chloroquine phosphate assay, a potentiometric titration commonly used in medicinal control laboratories. The well-known error-budget approach (also called bottom-up or step-by-step) described by the ISO Guide to the Expression of Uncertainty in Measurement (GUM) was the first method chosen. It is based on the combination of uncertainty contributions derived directly from the measurement process. The second method employed was the Analytical Methods Committee top-down approach, which estimates uncertainty through the reproducibility obtained during inter-laboratory studies. Data for its application were collected in a proficiency testing study carried out by over 50 laboratories throughout Europe. The last method chosen was the one proposed by Barwick and Ellison. It uses a combination of precision, trueness and ruggedness data to estimate uncertainty; these data were collected from a validation process specifically designed for uncertainty estimation. All three approaches presented a distinctive set of advantages and drawbacks in their implementation. An expanded uncertainty of about 1% was assessed for the assay investigated.

  6. Comparison of first order analysis and Monte Carlo methods in evaluating groundwater model uncertainty: a case study from an iron ore mine in the Pilbara Region of Western Australia

    Firmani, G.; Matta, J.

    2012-04-01

    The expansion of mining in the Pilbara region of Western Australia is resulting in the need to develop better water strategies to make below-water-table resources accessible, manage surplus water and meet water demands for ore processing and construction. In all these instances, understanding the local and regional hydrogeology is fundamental to allowing sustainable mining while minimising the impacts on the environment. An understanding of the uncertainties of the hydrogeology is necessary to quantify the risks and make objective decisions rather than relying on subjective judgements. The aim of this paper is to review some of the methods proposed in the published literature and find approaches that can be practically implemented to estimate model uncertainties. In particular, this paper adopts two general probabilistic approaches that address parametric uncertainty estimation and its propagation into predictive scenarios: first order analysis and Monte Carlo simulation. An example application of the two techniques is also presented for the dewatering strategy of a large below-water-table open cut iron ore mine in the Pilbara region of Western Australia. This study demonstrates the weakness of the deterministic approach, as the coefficients of variation of some model parameters were greater than 1.0, and suggests a review of the model calibration method and conceptualisation. In propagating uncertainty into the predictive scenarios, parameters with a coefficient of variation higher than 0.25 were treated as deterministic, owing to the computational difficulty of achieving an accurate result with the Monte Carlo method. The conclusion of this case study was that first order analysis appears to be a successful and simple tool when the coefficients of variation of the calibrated parameters are less than 0.25.
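
    The contrast between the two techniques can be sketched on a toy nonlinear model (all values hypothetical): first order analysis propagates the parameter variance through a local derivative, while Monte Carlo samples the full distribution. Rerunning the sketch with cv = 1.0 reproduces the kind of divergence the study warns about.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(K):
    # Toy prediction, nonlinear in the hydraulic parameter K (illustrative only).
    return 100.0 / K

K_mean, cv = 5.0, 0.25          # coefficient of variation of the calibrated parameter
K_sd = cv * K_mean

# First order analysis: var(y) ~ (dy/dK)^2 * var(K), evaluated at the mean.
dydK = -100.0 / K_mean**2
fosm_sd = abs(dydK) * K_sd

# Monte Carlo: sample K from a lognormal (keeps it positive) and propagate.
sigma = np.sqrt(np.log(1.0 + cv**2))
mu = np.log(K_mean) - 0.5 * sigma**2
mc = model(rng.lognormal(mu, sigma, 100_000))

print(f"first-order sd = {fosm_sd:.2f}, Monte Carlo sd = {mc.std():.2f}")
```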

  7. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological change are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. A direct variance method was adopted to analyze the manner in which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM is, in general, the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the predominant uncertainty source in the projections of extreme high flow, and contributes a considerable percentage of the uncertainty in monthly streamflow projections for July-September; its effects in other cases are negligible. HM is a non-negligible uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and of ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang
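
    One simple way to attribute variance to the stages of such a modeling chain (a crude stand-in for the direct variance method, under strong simplifying assumptions) is the fraction of total variance explained by each factor's main effect over a full factorial ensemble. The array below is synthetic, with an artificially dominant CM factor:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic full-factorial ensemble of projected flows, indexed by
# (emission scenario, climate model, downscaling method, hydrological model).
proj = rng.normal(size=(3, 4, 4, 4)) + 2.0 * rng.normal(size=(1, 4, 1, 1))

total_var = proj.var()
for axis, name in enumerate(["ES", "CM", "SD", "HM"]):
    other = tuple(a for a in range(proj.ndim) if a != axis)
    # Variance of the ensemble mean taken over all the other factors.
    main_effect_var = proj.mean(axis=other).var()
    print(f"{name}: {100.0 * main_effect_var / total_var:.1f} % of total variance")
```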

  8. Evaluation method for uncertainty of effective delayed neutron fraction βeff

    Zukeran, Atsushi

    1999-01-01

    The uncertainty of the effective delayed neutron fraction βeff is evaluated in terms of three quantities: the uncertainties of the basic delayed neutron constants, the energy dependence of the delayed neutron yield ν_d^m, and the uncertainties of the fission cross sections of the fuel elements. The uncertainty of βeff due to the delayed neutron yield is expressed by a linearized formula assuming that the delayed neutron yield does not depend on the incident energy; the energy dependence is supplemented by using the detailed energy dependence proposed by D'Angelo and Filip. The third quantity, the uncertainty of the fission cross sections, is evaluated on the basis of generalized perturbation theory in relation to reaction rate ratios such as central spectral indexes or average reaction rate ratios. The resultant uncertainty of βeff is about 4% to 5%, of which the primary factor is the delayed neutron yield and the secondary one the fission cross section uncertainty, especially for 238U. The energy dependence of ν_d^m systematically reduces the magnitude of βeff by about 1.4% to 1.7%, depending on the model of the energy vs. ν_d^m correlation curve. (author)

  9. Value assignment and uncertainty evaluation for single-element reference solutions

    Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.

    2018-06-01

    A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.
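
    A drastically simplified, non-hierarchical version of such a procedure can be sketched as a precision-weighted combination, where a fixed between-method scale tau (standing in for the historically informed prior) inflates each method's variance; all numbers are hypothetical.

```python
import numpy as np

# Hypothetical method means (mg/kg) and within-method standard uncertainties.
x = np.array([1000.2, 999.1])   # gravimetry, ICP-OES
u = np.array([0.4, 0.5])
tau = 0.6                        # assumed between-method component (from history)

# With a flat prior on the measurand, each value is normal around the measurand
# with variance u_i^2 + tau^2, so the posterior mean is precision-weighted.
w = 1.0 / (u**2 + tau**2)
post_mean = np.sum(w * x) / np.sum(w)
post_sd = np.sqrt(1.0 / np.sum(w))

print(f"assigned value = {post_mean:.2f} +/- {post_sd:.2f} (1 sigma)")
```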

  10. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    Glaeser, H.

    2008-01-01

    In recent years, an increasing interest in computational reactor safety analysis has been to replace conservative evaluation model calculations by best estimate calculations supplemented by an uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best estimate thermal-hydraulic code calculations; otherwise single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented together with applications to a large break loss of coolant accident on a reference reactor as well as to an experiment simulating containment behaviour
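
    The GRS approach is commonly implemented with Wilks' order-statistics formula, which fixes the number of code runs needed for a tolerance-limit statement regardless of the number of uncertain inputs. A minimal sketch for the one-sided, first-order 95 %/95 % case:

```python
# Smallest number of runs N such that the maximum of N code results bounds the
# 95th percentile of the output with 95% confidence: 1 - gamma**N >= beta.
gamma, beta = 0.95, 0.95   # coverage, confidence
N = 1
while 1.0 - gamma**N < beta:
    N += 1
print(N)  # -> 59, the familiar run count for a one-sided 95/95 statement
```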

  11. Dealing with uncertainties in the context of post mining hazard evaluation

    Cauvin , Maxime; Salmon , Romuald; Verdel , Thierry

    2008-01-01

    Risk analyses related to past mining activity are generally performed in a context of strong uncertainty. A PhD thesis was undertaken in 2004 to develop practical ways of taking these uncertainties into account. The possibility of elaborating a more quantified evaluation of risk has also been discussed, in particular the contribution that probabilistic methods may bring to an analysis. This paper summarizes the main results of the thesis...

  12. The uncertainty evaluation of measurement for uranium in UF_6 hydrolysate by potentiometric titration

    Jiang Haiying; Cheng Ruoyu; Meng Xiujun

    2014-01-01

    Based on a mathematical model of the measurement, this paper analyses the sources of uncertainty in the determination of uranium in uranium hexafluoride hydrolysate by potentiometric titration; each uncertainty component was calculated and the expanded uncertainty was given. The evaluated uranium concentration is (158.88 ± 1.22) mgU/mL (k = 2, P = 95%). (authors)

  13. Data uncertainty impact in radiotoxicity evaluation connected to EFR and IFR systems

    Palmiotti, G.; Salvatores, M.

    1993-01-01

    Time-dependent sensitivity techniques, which have been used in the past for standard reactor applications, have been adapted to calculate the impact of data uncertainties in radiotoxicity evaluations. The methodology has been applied to different strategies of radioactive waste management connected with the EFR and IFR reactor fuel cycles. Results are provided in terms of sensitivity coefficients to basic data (cross sections and decay constants), and uncertainties on global radiotoxicity at different times of storing after discharge

  14. Inclusion of geometric uncertainties in treatment plan evaluation

    van Herk, Marcel; Remeijer, Peter; Lebesque, Joos V.

    2002-01-01

    PURPOSE: To correctly evaluate realistic treatment plans in terms of absorbed dose to the clinical target volume (CTV), equivalent uniform dose (EUD), and tumor control probability (TCP) in the presence of execution (random) and preparation (systematic) geometric errors. MATERIALS AND METHODS: The

  15. Approach and methods to evaluate the uncertainty in system thermalhydraulic calculations

    D'Auria, F.

    2004-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. An activity in progress at the International Atomic Energy Agency (IAEA) is considered. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code applications to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermal-hydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such a status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  16. Licensing in BE system code calculations. Applications and uncertainty evaluation by CIAU method

    Petruzzi, Alessandro; D'Auria, Francesco

    2007-01-01

    The evaluation of uncertainty constitutes the necessary supplement of Best Estimate (BE) calculations performed to understand accident scenarios in water cooled nuclear reactors. The need arises from the imperfection of computational tools on the one hand and from the interest in using such tools to obtain a more precise evaluation of safety margins on the other. In the present paper the approaches to uncertainty are outlined and the CIAU (Code with capability of Internal Assessment of Uncertainty) method proposed by the University of Pisa is described, including the ideas at its basis and results from applications. Two approaches are distinguished, characterized as 'propagation of code input uncertainty' and 'propagation of code output errors'. For both methods, the thermal-hydraulic code is at the centre of the process of uncertainty evaluation: in the former case the code itself is adopted to compute the error bands and to propagate the input errors; in the latter case the errors in code applications to relevant measurements are used to derive the error bands. The CIAU method exploits the idea of the 'status approach' for identifying the thermal-hydraulic conditions of an accident in any Nuclear Power Plant (NPP). Errors in predicting such a status are derived from the comparison between predicted and measured quantities and, in the stage of the application of the method, are used to compute the uncertainty. (author)

  17. Uncertainty propagation in probabilistic risk assessment: A comparative study

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

    Three uncertainty propagation techniques generally used in probabilistic risk assessment, namely the method of moments, discrete probability distributions (DPD), and Monte Carlo simulation, are compared, and conclusions are drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate approach is to propagate the uncertainty in discrete form, either by the DPD method without sampling or by Monte Carlo simulation. (orig.)
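
    The failure of the method of moments for large uncertainties can be sketched with a toy fault tree whose basic events carry lognormal unavailabilities (hypothetical medians and error factor): the skewed top-event distribution has a mean far above its median, which sampling-based propagation captures directly.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical basic-event unavailabilities: lognormal, specified by medians
# and a common error factor EF = 95th percentile / median (a usual PRA convention).
medians = np.array([1e-3, 5e-4, 2e-3])
sigma = np.log(10.0) / 1.645            # lognormal shape for EF = 10

qs = medians * rng.lognormal(0.0, sigma, size=(100_000, 3))

# Toy top event: (event 1 AND event 2) OR event 3, evaluated per sample.
cut = qs[:, 0] * qs[:, 1]
top = cut + qs[:, 2] - cut * qs[:, 2]

print(f"mean = {top.mean():.2e}, median = {np.median(top):.2e}, "
      f"95th percentile = {np.percentile(top, 95):.2e}")
```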

  18. Evaluation of Uncertainties in hydrogeological modeling and groundwater flow analyses. Model calibration

    Ijiri, Yuji; Ono, Makoto; Sugihara, Yutaka; Shimo, Michito; Yamamoto, Hajime; Fumimura, Kenichi

    2003-03-01

    This study involves the evaluation of uncertainty in hydrogeological modeling and groundwater flow analysis. Three-dimensional groundwater flow at the Shobasama site in Tono was analyzed using two continuum models and one discontinuum model. The domain of the study covered an area of four kilometers east-west and six kilometers north-south. Moreover, in order to evaluate how the uncertainties included in the modeling of the hydrogeological structure and in the results of the groundwater simulations decreased as the investigations progressed, the models built with several hydrogeological modeling and groundwater flow analysis techniques were updated and calibrated on the basis of newly acquired information and knowledge. The acquired knowledge is as follows. When the parameters and structures of the models were reset during the updates following the previous year's circumstances, no major differences in handling arose among the modeling methods. Model calibration was performed by matching numerical simulations to observations of the pressure response caused by opening and closing a packer in the MIU-2 borehole. Each analysis technique reduced the residual sum of squares between observations and numerical simulation results by adjusting hydrogeological parameters; however, each model adjusted different parameters, such as hydraulic conductivity, effective porosity, specific storage and anisotropy. When calibrating models, it is sometimes impossible to explain the phenomena by adjusting parameters alone; in such cases further investigation may be required to clarify the details of the hydrogeological structure. Comparison of the research from its beginning to this year leads to the following conclusions about the investigations: (1) transient hydraulic data are an effective means of reducing the uncertainty of the hydrogeological structure; (2) effective porosity for calculating pore water velocity of

  19. Applications of uncertainty analysis to visual evaluation of density in radiographs

    Uchida, Suguru; Ohtsuka, Akiyoshi; Fujita, Hiroshi.

    1981-01-01

    Uncertainty analysis, developed as a method of absolute judgment in psychology, is applied to a method of radiographic image evaluation with perceptual fluctuations and to an examination of the visual evaluation of density in radiographs. The subjects comprise three groups: four neurosurgeons, four radiologic technologists and four nonprofessionals. Using a five-category rating scale, each observer was directed to classify 255 radiographs presented randomly and without feedback. The characteristics of each observer and each group can be shown quantitatively by the calculated information values. It is also described how bivariate uncertainty analysis, an entropy method, can be used to calculate the degree of agreement among evaluations. (author)

  1. New developments for determination of uncertainty in phase evaluation

    Liu, Sheng

    Phase evaluation occurs mostly, but not exclusively, in interferometric applications that utilize coherent multidimensional signals to modulate the physical quantity of interest into a nonlinear form, represented by the phase repeating modulo 2π radians. In order to estimate the underlying physical quantity, the wrapped phase has to be unwrapped by an evaluation procedure usually called phase unwrapping. Phase unwrapping inevitably faces the challenge of inconsistent phase, which can introduce errors into the phase evaluation. The main objectives of this research include addressing the problem of inconsistent phase in phase unwrapping and applications in modern optical techniques. In this research, a new phase unwrapping algorithm is developed. The idea of performing phase unwrapping between regions has an advantage over conventional pixel-to-pixel unwrapping methods because the unwrapping result is made more consistent by a voting mechanism based on all 2π-discontinuity hints. Furthermore, a systematic sequence of regional unwrapping is constructed in order to achieve a globally consistent result. An implementation of the idea is illustrated in detail with step-by-step pseudocode, and the performance of the algorithm is demonstrated on real-world applications. In order to solve a phase unwrapping problem caused by depth discontinuities in 3D shape measurement, a new absolute phase coding strategy is developed. The algorithm presented has two merits: it effectively extends the coding range and it preserves the measurement sensitivity. The performance of the proposed absolute coding strategy is demonstrated by results of 3D shape measurement for objects with surface discontinuities. As a powerful tool for real-world applications, a universal software package, Optical Measurement and Evaluation Software (OMES), has been designed for the purposes of automatic measurement and quantitative evaluation in 3D shape measurement and laser interferometry
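
    For orientation, the conventional pixel-to-pixel unwrapping that such region-based methods improve upon simply removes 2π jumps between neighbouring samples; a minimal 1-D sketch on synthetic, consistent data:

```python
import numpy as np

# A smooth true phase, wrapped into (-pi, pi] as an interferometer would deliver it.
t = np.linspace(0.0, 1.0, 400)
true_phase = 18.0 * t**2
wrapped = np.angle(np.exp(1j * true_phase))

# Pixel-to-pixel unwrapping: add/subtract 2*pi wherever the jump between
# neighbours exceeds pi. Inconsistent (noisy, undersampled) data break this,
# which is what region-based voting schemes are designed to handle.
unwrapped = np.unwrap(wrapped)

print(np.allclose(unwrapped, true_phase))  # True for consistent data
```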

  2. Uncertainty and conservatism in safety evaluations based on a BEPU approach

    Yamaguchi, A.; Mizokami, S.; Kudo, Y.; Hotta, A.

    2009-01-01

    The Atomic Energy Society of Japan has published the 'Standard Method for Safety Evaluation using Best Estimate Code Based on Uncertainty and Scaling Analyses with Statistical Approach' to be applied to accidents and AOOs in the safety evaluation of LWRs. In this method, hereafter named the AESJ-SSE (Statistical Safety Evaluation) method, uncertainties are identified and quantified, and the best estimate code is then combined with an evaluation of uncertainty propagation. Uncertainties are categorized into bias and variability. In general, bias is related to our state of knowledge of the uncertainty objects (modeling, scaling, input data, etc.) while variability reflects the stochastic features involved in these objects. Considering that many kinds of uncertainties in thermal-hydraulic models and experimental databases show variabilities that are strongly influenced by our state of knowledge, it seems reasonable that these variabilities are also related to the state of knowledge. The design basis events (DBEs) that are employed for licensing analyses form a main part of the given, or prior, conservatism; the regulatory acceptance criterion is also regarded as prior conservatism. In addition to these prior conservatisms, a certain amount of posterior conservatism is added, maintaining an intimate relationship with the state of knowledge. In the AESJ-SSE method, this posterior conservatism can be incorporated into the safety evaluation in a combination of the following three ways: (1) broadening the ranges of variability relevant to uncertainty objects, (2) employing more disadvantageous biases relevant to uncertainty objects, and (3) adding an extra bias to the safety evaluation results. By making the implemented quantitative bases of uncertainties and conservatism known, the AESJ-SSE method provides a useful ground for rational decision-making. In order to seek 'the best estimation' as well as to set the analytical margin reasonably, a degree

  3. Rational consensus under uncertainty: Expert judgment in the EC-USNRC uncertainty study

    Cooke, R.; Kraan, B.; Goossens, L.

    1999-01-01

    Governmental bodies are confronted with the problem of achieving rational consensus in the face of substantial uncertainties. The area of accident consequence management for nuclear power plants affords a good example. Decisions with regard to evacuation, decontamination, and food bans must be taken on the basis of predictions of environmental transport of radioactive material, contamination through the food chain, cancer induction, and the like. These predictions use mathematical models containing scores of uncertain parameters. Decision makers want to take, and want to be perceived to take, these decisions in a rational manner. The question is, how can this be accomplished in the face of large uncertainties? Indeed, the very presence of uncertainty poses a threat to rational consensus. Decision makers will necessarily base their actions on the judgments of experts. The experts, however, will not agree among themselves, as otherwise we would not speak of large uncertainties. Any given expert's viewpoint will be favorable to the interests of some stakeholders, and hostile to the interests of others. If a decision maker bases his/her actions on the views of one single expert, then (s)he is invariably open to charges of partiality toward the interests favored by this viewpoint. An appeal to 'impartial' or 'disinterested' experts will fail for two reasons. First, experts have interests; they have jobs, mortgages and professional reputations. Second, even if expert interests could somehow be quarantined, even then the experts would disagree. Expert disagreement is not explained by diverging interests, and consensus cannot be reached by shielding the decision process from expert interests. If rational consensus requires expert agreement, then rational consensus is simply not possible in the face of uncertainty. If rational consensus under uncertainty is to be achieved, then evidently the views of a diverse set of experts must be taken into account. The question is how

  4. The grey relational approach for evaluating measurement uncertainty with poor information

    Luo, Zai; Wang, Yanqing; Zhou, Weihu; Wang, Zhongyu

    2015-01-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) is the master document for measurement uncertainty evaluation. However, the GUM may encounter problems and does not work well when the measurement data have poor information. In most cases, poor information means a small data sample and an unknown probability distribution. In these cases, the evaluation of measurement uncertainty has become a bottleneck in practical measurement. To solve this problem, a novel method called the grey relational approach (GRA), different from statistical theory, is proposed in this paper. The GRA does not require a large sample size or probability distribution information for the measurement data. Mathematically, the GRA can be divided into three parts. Firstly, according to grey relational analysis, the grey relational coefficients between the ideal and the practical measurement output series are obtained. Secondly, the weighting coefficients and the measurement expectation function are acquired based on the grey relational coefficients. Finally, the measurement uncertainty is evaluated based on grey modeling. In order to validate the performance of this method, simulation experiments were performed, and the evaluation results show that the GRA can keep the average error around 5%. In addition, the GRA was compared with the grey method, the Bessel method, and the Monte Carlo method in a real stress measurement. Both the simulation experiments and the real measurement show that the GRA is appropriate and effective for evaluating measurement uncertainty with poor information. (paper)
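
    The first step of the GRA, the grey relational coefficients between the ideal and measured series, follows the standard grey relational analysis formula xi(k) = (d_min + rho * d_max) / (d(k) + rho * d_max), with distinguishing coefficient rho = 0.5. A minimal sketch with invented series:

```python
import numpy as np

# Reference (ideal) series and two measured series (hypothetical values).
x0 = np.array([10.0, 10.2, 10.1, 10.3])
xi = np.array([[10.1, 10.3, 10.0, 10.4],
               [ 9.6, 10.9, 10.5,  9.8]])

delta = np.abs(xi - x0)          # absolute difference series
rho = 0.5                        # distinguishing coefficient
d_min, d_max = delta.min(), delta.max()

# Grey relational coefficients and grades (mean coefficient per series).
coeff = (d_min + rho * d_max) / (delta + rho * d_max)
grades = coeff.mean(axis=1)
print(grades)                    # a higher grade means closer to the reference
```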

  5. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    K. J. Franz

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply them to evaluate and compare model ensembles produced by two different parameter uncertainty estimation methods: Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffled Complex Evolution Metropolis (SCEM) algorithm. Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble, or uncertainty range, in the approach.
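
    One score from this verification family, the continuous ranked probability score (CRPS), has a simple sample-based estimator, E|X - y| - 0.5 * E|X - X'|; a minimal sketch with a synthetic ensemble (not the paper's data):

```python
import numpy as np

def crps_sample(ensemble, obs):
    """Sample-based CRPS estimator: E|X - y| - 0.5 * E|X - X'|."""
    ensemble = np.asarray(ensemble, dtype=float)
    term1 = np.abs(ensemble - obs).mean()
    term2 = np.abs(ensemble[:, None] - ensemble[None, :]).mean()
    return term1 - 0.5 * term2

# Hypothetical streamflow ensemble (m^3/s) against an observed value.
rng = np.random.default_rng(3)
ens = rng.normal(120.0, 15.0, size=50)
print(f"CRPS = {crps_sample(ens, 131.0):.2f} m^3/s")
```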

  6. A Study on the uncertainty and sensitivity in numerical simulation of parametric roll

    Choi, Ju-hyuck; Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to numerical modelling of parametric roll have been investigated by using a 6-DOFs model with nonlinear damping and roll restoring forces. At first, uncertainty on damping coefficients and its effect on the roll response is evaluated. Secondly, uncertainty due to the “effect...

  7. Parameter uncertainty in simulations of extreme precipitation and attribution studies.

    Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.

    2017-12-01

    The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. The climate models used, however, such as the Community Atmosphere Model (CAM), employ approximate physics that gives rise to "parameter uncertainty": uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations for convective processes are well known to be influential in the simulation of precipitation extremes. Towards examining the impact of this source of uncertainty on attribution studies, we investigate the importance of components, through their associated tuning parameters, of parameterisations relating to deep and shallow convection, and cloud and aerosol microphysics in CAM. We hypothesise that as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions. For example, the relative influence of deep convection should diminish as resolution approaches that at which convection can be resolved numerically (~10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. In order to mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution both in the context of attribution studies and in the progression towards models that fully resolve convection.

  8. Uncertainty Evaluation of the Thermal Expansion of Gd2O3-ZrO2 with a System Calibration Factor

    Park, Chang Je; Kang, Kweon Ho; Na, Sang Ho; Song, Kee Chan

    2007-01-01

    Both gadolinia (Gd2O3) and zirconia (ZrO2) are widely used in the nuclear industry, including as a burnable absorber and as additives in the fabrication of simulated fuel. The thermal expansion of a mixture of 20 mol% gadolinia (Gd2O3) and 80 mol% zirconia (ZrO2) was measured using a dilatometer (DIL402C) from room temperature to 1500 °C. Uncertainties in the measurement should be quantified on a statistical basis. Referring to the ISO (International Organization for Standardization) guide, the uncertainties of the thermal expansion were quantified for three parts: the initial length, the length variation, and the system calibration factor. The whole system, the dilatometer, is composed of many complex sub-systems, and it is in fact difficult to consider all the uncertainties of the sub-systems. Thus, a system calibration factor determined with a standard material was introduced for the uncertainty evaluation. In this study, a new system calibration factor was formulated in a multiplicative way, and the effect of a calibration factor with random deviation was investigated for the uncertainty evaluation of the thermal expansion

  9. Fast evaluation of theoretical uncertainties with Sherpa and MCgrid

    Bothmann, Enrico; Schumann, Steffen [II. Physikalisches Institut, Georg-August-Universitaet Goettingen (Germany); Schoenherr, Marek [Physik-Institut, Universitaet Zuerich (Switzerland)

    2016-07-01

    The determination of theoretical error estimates and PDF/αs fits requires fast evaluations of cross sections for varied QCD input parameters. These include the PDFs, the strong coupling constant αs, and the renormalization and factorization scales. Beyond leading order QCD, a full dedicated calculation for each set of parameters is often too time-consuming, certainly when performing PDF fits. In this talk we discuss two methods to overcome this issue for any NLO QCD calculation: the novel event-reweighting feature in Sherpa and the automated generation of interpolation grids using the recently introduced MCgrid interface. For the Sherpa event-reweighting we present the newly added support for the all-order PDF dependencies of parton shower emissions. Building on that, we discuss the sensitivity of high precision observables to those dependencies.

  10. MECCA coordinated research program: analysis of climate models uncertainties used for climatic changes study

    Caneill, J.Y.; Hakkarinen, C.

    1992-01-01

    An international consortium called MECCA (Model Evaluation Consortium for Climate Assessment) was created in 1991 by different partners, including electric utilities and government and academic groups, to make a supercomputer facility available to the international scientific community for climate evolution studies. The first phase of the program consists of assessing the uncertainties of climate model simulations in the framework of global climate change studies; fourteen scientific projects have been accepted on an international basis in this first phase. The second phase of the program will consist of evaluating a set of long climate simulations performed with coupled ocean/atmosphere models, in order to study the transient aspects of climate change and the associated uncertainties. Particular attention will be devoted to the consequences of these assessments for climate impact studies, and to the regional aspects of climate change

  11. Comparison of ISO-GUM and Monte Carlo Method for Evaluation of Measurement Uncertainty

    Ha, Young-Cheol; Her, Jae-Young; Lee, Seung-Jun; Lee, Kang-Jin [Korea Gas Corporation, Daegu (Korea, Republic of)

    2014-07-15

    To supplement the ISO-GUM method for the evaluation of measurement uncertainty, a simulation program using the Monte Carlo method (MCM) was developed, and the MCM and GUM methods were compared. The results are as follows: (1) even under a non-normal probability distribution of the measurement, MCM provides an accurate coverage interval; (2) even if a probability distribution that emerged from combining a few non-normal distributions looks normal, there are cases in which the actual distribution is not normal, and the non-normality can be determined from the probability distribution of the combined variance; and (3) if type-A standard uncertainties are involved in the evaluation of measurement uncertainty, GUM generally offers an undervalued coverage interval. However, this problem can be solved by the Bayesian evaluation of type-A standard uncertainty. In this case, the effective degrees of freedom for the combined variance are not required in the evaluation of expanded uncertainty, and the appropriate coverage factor for a 95% level of confidence was determined to be 1.96.
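
    The GUM-versus-MCM comparison can be sketched on a toy nonlinear model with one rectangular input (all values hypothetical): the GUM interval comes from first-order propagation with k = 1.96, the MCM interval from percentiles of the propagated samples.

```python
import numpy as np

rng = np.random.default_rng(4)

# Measurement model Y = X1 / X2: X1 normal, X2 rectangular (uniform).
x1_mean, u1 = 10.0, 0.2
x2_mean, half_width = 2.0, 0.4
u2 = half_width / np.sqrt(3.0)          # standard uncertainty of a rectangle

# GUM, first order: u(y)^2 = (dY/dX1 * u1)^2 + (dY/dX2 * u2)^2 at the means.
y0 = x1_mean / x2_mean
u_y = np.sqrt((u1 / x2_mean) ** 2 + (x1_mean * u2 / x2_mean**2) ** 2)
gum = (y0 - 1.96 * u_y, y0 + 1.96 * u_y)

# MCM: propagate the actual distributions and read off the 95 % interval.
y = rng.normal(x1_mean, u1, 1_000_000) / rng.uniform(
    x2_mean - half_width, x2_mean + half_width, 1_000_000)
mcm = tuple(np.percentile(y, [2.5, 97.5]))

print(f"GUM 95%: [{gum[0]:.3f}, {gum[1]:.3f}]  MCM 95%: [{mcm[0]:.3f}, {mcm[1]:.3f}]")
```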

  12. Uncertainty evaluation in correlated quantities: application to elemental analysis of atmospheric aerosols

    Espinosa, A.; Miranda, J.; Pineda, J. C.

    2010-01-01

    One of the aspects frequently overlooked in the evaluation of uncertainty in experimental data is the possibility that the quantities involved are correlated among themselves, due to different causes. An example is the elemental analysis of atmospheric aerosols using techniques like X-ray fluorescence (XRF) or particle induced X-ray emission (PIXE). In these cases, the measured elemental concentrations are highly correlated and are then used to obtain information about other variables, such as the contributions from emitting sources related to soil, sulfate, non-soil potassium or organic matter. This work describes, as an example, the method required to evaluate the uncertainty in variables determined from correlated quantities, for a set of atmospheric aerosol samples collected in the Metropolitan Area of the Mexico Valley and analyzed with PIXE. The work is based on the recommendations of the Guide to the Expression of Uncertainty in Measurement published by the International Organization for Standardization. (Author)
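
    The core computation is the propagation of a full covariance matrix through a linear combination, u(y)^2 = g.T C g, instead of a simple sum of squared terms. The weights, concentrations and common correlation below are invented for illustration (they are not the source-reconstruction coefficients of the study):

```python
import numpy as np

x = np.array([500.0, 300.0, 80.0])   # hypothetical concentrations (ng/m^3)
g = np.array([3.48, 1.63, 2.42])     # illustrative linear-combination weights
u = np.array([25.0, 15.0, 4.0])      # standard uncertainties of the concentrations
r = 0.8                               # assumed common correlation coefficient

corr = np.full((3, 3), r) + (1.0 - r) * np.eye(3)
cov = corr * np.outer(u, u)

u_y_corr = np.sqrt(g @ cov @ g)                 # with correlations
u_y_uncorr = np.sqrt(np.sum((g * u) ** 2))      # correlations ignored
print(f"u(y) with correlation: {u_y_corr:.1f}; ignoring it: {u_y_uncorr:.1f}")
```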

  13. An Evaluation of Uncertainty Associated to Analytical Measurements of Selected Polycyclic Aromatic Compounds in Ambient Air

    Barrado, A. I.; Garcia, S.; Perez, R. M.

    2013-01-01

    This paper presents an evaluation of the uncertainty associated with the analytical measurement of eighteen polycyclic aromatic compounds (PACs) in ambient air by liquid chromatography with fluorescence detection (HPLC/FD). The study focused on analyses of the PM10, PM2.5 and gas phase fractions. The main analytical uncertainty was estimated for eleven polycyclic aromatic hydrocarbons (PAHs), four nitro polycyclic aromatic hydrocarbons (nitro-PAHs) and two hydroxy polycyclic aromatic hydrocarbons (OH-PAHs), based on the analytical determination, reference material analysis and the extraction step. The main contributions reached 15-30% and came from the extraction of real ambient samples, those for nitro-PAHs being the highest (20-30%). The ranges and means of the PAC mass concentrations measured in the gas phase and in the PM10/PM2.5 particle fractions during a full year are also presented. Concentrations of OH-PAHs were about 2-4 orders of magnitude lower than those of their parent PAHs and comparable to those sparsely reported in the literature. (Author)

  14. A systematic approach to the modelling of measurements for uncertainty evaluation

    Sommer, K D; Weckenmann, A; Siebert, B R L

    2005-01-01

    The evaluation of measurement uncertainty is based on both the knowledge about the measuring process and the knowledge about the quantities which influence the measurement result. The knowledge about the measuring process is represented by the model equation, which expresses the interrelation between the measurand and the input quantities. Therefore, the modelling of the measurement is a key element of modern uncertainty evaluation. A modelling concept has been developed that is based on the idea of the measuring chain and manages with only a few generic model structures. From this concept, a practical stepwise procedure has been derived

  15. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark. A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added-value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference datasets, namely (1) the well-established E-OBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that have recently become available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For

  16. Evaluating Sources of Risks in Large Engineering Projects: The Roles of Equivocality and Uncertainty

    Leena Pekkinen

    2015-11-01

    Contemporary project risk management literature introduces uncertainty, i.e., the lack of information, as a fundamental basis of project risks. In this study the authors assert that equivocality, i.e., the existence of multiple and conflicting interpretations, can also serve as a basis of risks. Through an in-depth empirical investigation of a large complex engineering project, the authors identified risk sources having their bases in situations where either uncertainty or equivocality was the predominant attribute. Information processing theory proposes different managerial practices for risk management depending on whether the source of risk lies in uncertainty or in equivocality.

  17. Evaluating the Impact of Contaminant Dilution and Biodegradation in Uncertainty Quantification of Human Health Risk

    Zarlenga, Antonio; de Barros, Felipe; Fiori, Aldo

    2016-04-01

    covariance shape, reaction parameters pertaining to aerobic and anaerobic degradation processes respectively, as well as the dose-response parameters. Even though the final result assumes a relatively simple form, a few numerical quadratures are required in order to evaluate the trajectory moments of the solute plume. In order to perform a sensitivity analysis, we apply the methodology to a hypothetical case study. The scenario investigated consists of an aquifer that constitutes a water supply for a population, in which a continuous source of NAPL contaminant feeds a steady plume. The risk analysis is limited to carcinogenic compounds, for which the well-known linear relation for human risk is assumed. The analysis yields a few interesting findings: the risk distribution is strongly dependent on the pore-scale dynamics that drive dilution and mixing, and biodegradation may bring a significant reduction of the risk.

  18. Evaluation of uncertainty associated with parameters for long-term safety assessments of geological disposal

    Yamaguchi, Tetsuji; Minase, Naofumi; Iida, Yoshihisa; Tanaka, Tadao; Nakayama, Shinichi

    2005-01-01

    This paper describes the current status of our data acquisition for quantifying the uncertainties associated with parameters used in the safety assessment of groundwater scenarios for the geological disposal of radioactive wastes. First, the sources of uncertainty and the resulting priorities in data acquisition are briefly reviewed. Then, the current status of data acquisition for quantifying the uncertainties in assessing solubility, diffusivity in the bentonite buffer and distribution coefficients on rocks is introduced. The uncertainty of the solubility estimation is quantified from the uncertainty associated with the thermodynamic data and from that in estimating the groundwater chemistry. The uncertainty associated with the diffusivity in the bentonite buffer is composed of variations in relevant factors such as the porosity of the bentonite buffer, the montmorillonite content, the chemical composition of the pore water and the temperature. Uncertainties in factors such as the specific surface area of the rock and the pH, ionic strength and carbonate concentration of the groundwater make up the uncertainty of the distribution coefficients of radionuclides on rocks. Based on these investigations, the problems to be solved in future studies are summarized. (author)

  19. Replication quality assessment and uncertainty evaluation of a polymer precision injection moulded component

    Baruffi, Federico; Calaon, Matteo; Tosello, Guido

    2017-01-01

    Precision injection moulding holds a central role in manufacturing as the only replication process currently capable of accurately producing complex-shaped polymer parts integrating micrometric features in mass-scale production. In this scenario, a study on the replication quality of a polymer injection moulded precision component for telecommunication applications is presented. The effects of the process parameters on the component's dimensional variation have been investigated using a statistical approach. The replication fidelity of the produced parts has been assessed using a focus variation microscope with sub-micrometric resolution. Measurement uncertainty has then been evaluated according to the GUM, considering contributions from different combinations of process settings and mould geometries. The analysis showed that the injection moulding manufacturing process and the utilized measurement chain

  20. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially for large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards, for metre-sized assemblies, in the range of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from the ISO 10360 series of standardised tests for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and the important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  1. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and to represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.

  2. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and the interpolation of model output to instrument locations. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as that from the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed, to understand the challenges involved in model evaluation.

  3. Uncertainty sources in radiopharmaceuticals clinical studies; Fontes de incertezas em estudos clinicos com radiofarmacos

    Degenhardt, Aemilie Louize; Oliveira, Silvia Maria Velasques de, E-mail: silvia@cnen.gov.br, E-mail: amilie@bolsista.ird.gov.br [Instituto de Radioprotecao e Dosimetria, (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2014-07-01

    Radiopharmaceuticals should be approved for use on the basis of an evaluation of their quality, safety and efficacy. Clinical studies are designed to verify the pharmacodynamic, pharmacological and clinical effects in humans and are required for assuring safety and efficacy. Bayesian analysis has been used to evaluate the effectiveness of clinical studies. This work aims to identify the uncertainties associated with the production of the radionuclide and the labelling of the radiopharmaceutical, as well as with the administration of the radiopharmaceutical and the acquisition and processing of scintigraphy images. For the development of clinical studies in the country, the metrological chain shall assure the traceability of the measurements performed in all phases. (author)

  4. Uncertainty-driven nuclear data evaluation including thermal (n,α) applied to 59Ni

    Helgesson, P.; Sjöstrand, H.; Rochman, D.

    2017-11-01

    This paper presents a novel approach to the evaluation of nuclear data (ND), combining experimental data for thermal cross sections with resonance parameters and nuclear reaction modeling. The method involves sampling of various uncertain parameters, in particular uncertain components in experimental setups, and provides extensive covariance information, including consistent cross-channel correlations over the whole energy spectrum. The method is developed for, and applied to, 59Ni, but may be used as a whole, or in part, for other nuclides. 59Ni is particularly interesting since a substantial amount of 59Ni is produced in thermal nuclear reactors by neutron capture in 58Ni and since it has a non-threshold (n,α) cross section. Therefore, 59Ni makes a very important contribution to the helium production in stainless steel in a thermal reactor. However, current evaluated ND libraries contain old information for 59Ni, without any uncertainty information. The work includes a study of thermal cross section experiments and a novel combination of this experimental information, giving the full multivariate distribution of the thermal cross sections. In particular, the thermal (n,α) cross section is found to be 12.7 ± 0.7 b. This is consistent with, yet different from, currently established values. Further, the distribution of thermal cross sections is combined with reported resonance parameters, and with TENDL-2015 data, to provide full random ENDF files; all of this is done in a novel way, keeping uncertainties and correlations in mind. The random files have also been condensed into one single ENDF file with covariance information, which is now part of a beta version of JEFF 3.3. Finally, the random ENDF files have been processed and used in an MCNP model to study the helium production in stainless steel. The increase in the (n,α) rate due to 59Ni compared to fresh stainless steel is found to be a factor of 5.2 at a certain time in the reactor vessel, with a relative

  5. Uncertainty and sensitivity studies supporting the interpretation of the results of TVO I/II PRA

    Holmberg, J.

    1992-01-01

    A comprehensive Level 1 probabilistic risk assessment (PRA) has been performed for the TVO I/II nuclear power units. As a part of the PRA project, uncertainties of risk models and methods were systematically studied in order to describe them and to demonstrate their impact by way of results. The uncertainty study was divided into two phases: a qualitative and a quantitative study. The qualitative study contained identification of uncertainties and qualitative assessments of their importance. The PRA was introduced, and identified assumptions and uncertainties behind the models were documented. The most significant uncertainties were selected by importance measures or other judgements for further quantitative studies. The quantitative study included sensitivity studies and propagation of uncertainty ranges. In the sensitivity studies uncertain assumptions or parameters were varied in order to illustrate the sensitivity of the models. The propagation of the uncertainty ranges demonstrated the impact of the statistical uncertainties of the parameter values. The Monte Carlo method was used as a propagation method. The most significant uncertainties were those involved in modelling human interactions, dependences and common cause failures (CCFs), loss of coolant accident (LOCA) frequencies and pressure suppression. The qualitative mapping out of the uncertainty factors turned out to be useful in planning quantitative studies. It also served as internal review of the assumptions made in the PRA. The sensitivity studies were perhaps the most advantageous part of the quantitative study because they allowed individual analyses of the significance of uncertainty sources identified. The uncertainty study was found reasonable in systematically and critically assessing uncertainties in a risk analysis. The usefulness of this study depends on the decision maker (power company) since uncertainty studies are primarily carried out to support decision making when uncertainties are

  6. Incorporating forecast uncertainties into EENS for wind turbine studies

    Toh, G.K.; Gooi, H.B. [School of EEE, Nanyang Technological University, Singapore 639798 (Singapore)

    2011-02-15

    The rapid increase in wind power generation around the world has stimulated the development of applicable technologies to model the uncertainties of wind power, resulting from the stochastic nature of wind and fluctuations of demand, for the integration of wind turbine generators (WTGs). In this paper the load and wind power forecast errors are integrated into the expected energy not served (EENS) formulation through determination of probabilities using the normal distribution approach. The effects of forecast errors and wind energy penetration in the power system are explored. The impact of wind energy penetration on system reliability and on the total cost of energy and reserve procurement is then studied for a conventional power system. The results show a degradation of system reliability with significant wind energy penetration in the generation system. This work provides useful insight into system reliability and economics for the independent system operator (ISO) to deploy energy/reserve providers when WTGs are integrated into the existing power system. (author)
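
    To make the integration concrete, here is a minimal sketch (not the authors' formulation) of how a zero-mean normal forecast error can enter an EENS-style expected-shortfall term; the error standard deviations and the reserve margin are illustrative assumptions.

```python
import math

def expected_shortfall(margin_mw, sigma_mw):
    """Expected unserved power E[max(e - margin, 0)] for a zero-mean
    normal forecast error e ~ N(0, sigma^2), in MW."""
    z = margin_mw / sigma_mw
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return sigma_mw * pdf - margin_mw * (1.0 - cdf)

# Combined load + wind forecast error: independent errors add in quadrature
sigma = math.hypot(50.0, 80.0)   # MW; load and wind error std devs (assumed)
margin = 150.0                   # MW; scheduled reserve margin (assumed)
eens = expected_shortfall(margin, sigma) * 1.0   # over a one-hour period
print(f"EENS contribution: {eens:.2f} MWh")
```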

  7. Uncertainties in the fate of nitrogen I: An overview of sources of uncertainty illustrated with a Dutch case study

    Kroeze, C.; Aerts, R.; Breemen, van N.; Dam, van D.; Hoek, van der K.; Hofschreuder, P.; Hoosbeek, M.R.; Klein, de J.; Kros, H.; Oene, van H.; Oenema, O.; Tietema, A.; Veeren, van der R.; Verhoeven, H.; Vries, de W.

    2003-01-01

    This study focuses on the uncertainties in the fate of nitrogen (N) in the Netherlands. Nitrogen inputs into the Netherlands in products, by rivers, and by atmospheric deposition, and microbial and industrial fixation of atmospheric N2 amount to about 4450 Gg N yr-1. About 60% of this N is

  8. Comparative study of the uncertainties in parton distribution functions

    Alekhin, S.I.

    2003-01-01

    Comparison of the methods used to extract the uncertainties in parton distributions is given, including their statistical properties and practical issues of implementation. Advantages and disadvantages of the different methods are illustrated using examples based on the analysis of real data. Available PDF sets with associated uncertainties are reviewed and critically compared

  9. Annotated bibliography covering generation and use of evaluated cross section uncertainty files

    Peelle, R.W.; Burrows, T.W.

    1983-03-01

    Literature references related to definition, generation, and use of evaluated cross section uncertainty (variance-covariance) files are listed with comments intended primarily to guide the reader toward materials of immediate interest. Papers are also cited that cover covariance information for individual experiments and that relate to production and use of multigroup covariance matrices. Titles are divided among several major categories

  10. Supporting Sustainable Markets Through Life Cycle Assessment: Evaluating emerging technologies, incorporating uncertainty and the consumer perspective

    Merugula, Laura

    As civilization's collective knowledge grows, we are met with the realization that human-induced physical and biological transformations, influenced by exogenous psychosocial and economic factors, affect virtually every ecosystem on the planet. Despite improvements in energy generation and efficiencies, demand for material goods and energy services increases with no sign of slowing. Sustainable development requires a multi-pronged approach that involves reshaping demand, consumer education, sustainability-oriented policy, and supply chain management that does not serve the expansionist mentality. Thus, decision support tools are needed that inform developers, consumers, and policy-makers for short-term and long-term planning. These tools should incorporate uncertainty through quantitative methods, while qualitatively acknowledging the nature of the model as imperfect but necessary and adequate. A case study is presented of the manufacture and deployment of utility-scale wind turbines evaluated for a proposed change in blade manufacturing. It provides the first life cycle assessment (LCA) evaluating the impact of carbon nanofibers, an emerging material, proposed for integration into wind power generation systems as blade reinforcement. Few LCAs of nanoproducts are available in the scientific literature because research and development (R&D) for applications continues to outpace R&D for environmental, health, and safety (EHS) and life cycle impacts. LCAs of emerging technologies are crucial for informing developers of potential impacts, especially where market growth is swift and dissipative. A second case study is presented that evaluates consumer choice between disposable and reusable beverage cups. While there are a few studies that attempt to make the comparison using LCA, none adequately address uncertainty, nor are they representative of the typical American consumer. By disaggregating U.S. power generation into 26 subregional grid production mixes and evaluating

  11. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples.

  12. Evaluation of the uncertainty for the efficiency curve determination of 210Pb by liquid scintillation

    Sampaio, C.S.; Sousa, W.O.; Dantas, B.M.

    2014-01-01

    Methodologies for the evaluation of the uncertainties associated with the determination of the efficiency curve of 210Pb by liquid scintillation counting (LSC) are presented. No statistically significant differences were found between the uncertainties of the curves representing the net counting before and after secular equilibrium between 210Pb and 210Bi, nor between the curve obtained by counting only 210Pb and the curve obtained from the total count of 210Pb and 210Bi, for the same time interval after precipitation. (author)

  13. Data uncertainty impact in radiotoxicity evaluation connected to EFR and IFR systems

    Palmiotti, G.; Salvatores, M.

    1993-01-01

    Time-dependent sensitivity techniques, which have been used in the past for standard reactor applications, have been adapted to calculate the impact of data uncertainties in radiotoxicity evaluations. The methodology has been applied to different strategies of radioactive waste management connected with the EFR and IFR reactor fuel cycles. Results are provided in terms of sensitivity coefficients to basic data (cross sections and decay constants), and uncertainties in global radiotoxicity at different times of storage after discharge. 6 refs., 6 figs., 9 tabs

  14. A Research on Uncertainty Evaluation in Verification and Calibration on LSC facility

    Lee, Seung-Jin; Park, Eung-Seop; Kim, Hee-Gang [Yeong Gwang NPP Supervisory Center for Environment Radiation and Safety, Yeonggwang (Korea, Republic of); Han, Sang-Jun [Chosun Univ., Gwangju (Korea, Republic of)

    2007-10-15

    Compared with environmental samples existing around a Nuclear Power Plant, an uncertainty due to geometry difference exists when the calibration of a Liquid Scintillation Counter is performed using a solid H-3 standard source of 200,000 DPM (Disintegrations Per Minute). Therefore, this paper investigates the root cause of the uncertainty due to geometry difference, using a Quantulus 1220 instrument and H-3 standard sources in solid and liquid form. A Teflon vial was used as the measurement cell. The main factors judged capable of producing uncertainty from geometry difference are the plastic cell inside the Teflon vial, the activity difference, and the configuration difference of the H-3 standard source; these factors are evaluated through experiment and measurement.

  16. Application of Interval Arithmetic in the Evaluation of Transfer Capabilities by Considering the Sources of Uncertainty

    Prabha Umapathy

    2009-01-01

    Total transfer capability (TTC) is an important index in a power system with a large volume of inter-area power exchanges. This paper proposes a novel technique to determine the TTC and its confidence intervals by considering the uncertainties in the load and line parameters. The optimal power flow (OPF) method is used to obtain the TTC. Variations in the load and line parameters are incorporated using the interval arithmetic (IA) method. The IEEE 30-bus test system is used to illustrate the proposed methodology. Various uncertainties in the line, the load, and both together are incorporated in the evaluation of total transfer capability. From the results, it is observed that the solutions obtained through the proposed method provide richer information, in closed-interval form, which is useful for ensuring secure operation of the interconnected system in the presence of uncertainties in load and line parameters.
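
    As a toy illustration of the interval arithmetic idea (not the paper's OPF-based implementation), the sketch below propagates closed intervals through a simple linear flow expression; the sensitivity and injection ranges are invented assumptions.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        # Interval product: take the extremes over all endpoint combinations
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

# Illustrative: line flow = sensitivity * net injection, with parameter
# and load uncertainty expressed as closed intervals
ptdf = Interval(0.38, 0.42)        # line sensitivity range (assumed)
injection = Interval(95.0, 105.0)  # MW, uncertain net injection (assumed)
flow = ptdf * injection
print(f"line flow in [{flow.lo:.1f}, {flow.hi:.1f}] MW")
```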

  17. Epistemic Uncertainty in Evaluation of Evapotranspiration and Net Infiltration Using Analogue Meteorological Data

    B. Faybishenko

    2006-01-01

    Uncertainty is typically defined as a potential deficiency in the modeling of a physical process, owing to a lack of knowledge. Uncertainty can be categorized as aleatoric (inherent uncertainty caused by the intrinsic randomness of the system) or epistemic (uncertainty caused by using various model simplifications and their parameters). One of the main reasons for model simplifications is a limited amount of meteorological data. This paper is devoted to the quantification of the epistemic uncertainty involved in two components of the hydrologic balance: evapotranspiration and net infiltration, for interglacial (present-day), and future monsoon, glacial transition, and glacial climates at Yucca Mountain, using data from analogue meteorological stations. In particular, the author analyzes semi-empirical models used for evaluating (1) reference-surface potential evapotranspiration, including temperature-based models (Hargreaves-Samani, Thornthwaite, Hamon, Jensen-Haise, and Turc) and radiation-based models (Priestley-Taylor and Penman), and (2) surface-dependent potential evapotranspiration (Penman-Monteith and Shuttleworth-Wallace models). Evapotranspiration predictions are then used as inputs for the evaluation of net infiltration using the semi-empirical models of Budyko, Fu, Milly, Turc-Pike, and Zhang. Results show that net infiltration ranges are expected to generally increase from the present-day climate to the monsoon climate, to the glacial transition climate, and then to the glacial climate. The propagation of uncertainties through model predictions for different climates is characterized using statistical measures. Predicted evapotranspiration ranges are reasonably corroborated against the data from Class A pan evaporometers (taking into account evaporation-pan adjustment coefficients), and ranges of net infiltration predictions are corroborated against the geochemical and temperature-based estimates of groundwater recharge and percolation rates through the unsaturated
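
    Of the temperature-based models listed, Hargreaves-Samani has a particularly compact form; a minimal sketch follows, with the input values being illustrative assumptions rather than Yucca Mountain data.

```python
import math

def hargreaves_samani_pet(t_mean_c, t_max_c, t_min_c, ra_mm_day):
    """Reference potential evapotranspiration (mm/day) from the
    temperature-based Hargreaves-Samani model.
    ra_mm_day: extraterrestrial radiation expressed as equivalent
    evaporation (mm/day)."""
    return 0.0023 * ra_mm_day * (t_mean_c + 17.8) * math.sqrt(t_max_c - t_min_c)

# Illustrative values for a semi-arid site (assumed)
print(f"PET = {hargreaves_samani_pet(18.0, 27.0, 9.0, 12.5):.2f} mm/day")
```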

  18. Assessment of global phase uncertainty in case-control studies

    van Houwelingen Hans C

    2009-09-01

    Abstract Background: In haplotype-based candidate gene studies a problem is that the genotype data are unphased, which results in haplotype ambiguity. A predictability measure quantifies haplotype predictability from genotype data; it is computed for each individual haplotype, and a minimum value is suggested as a measure of global relative efficiency. Alternatively, we developed methods directly based on the information content of haplotype frequency estimates to obtain global relative efficiency measures based on A- and D-optimality, respectively. All three methods are designed for single populations; they can be applied to cases only, controls only, or the whole data set. Therefore they are not necessarily optimal for haplotype testing in case-control studies. Results: A new global relative efficiency measure was derived to maximize the power of a simple test statistic that compares haplotype frequencies in cases and controls. Application to real data showed that our proposed method gave a clear and summarizing measure for the case-control study conducted. Additionally, this measure might be used for the selection of individuals who have the highest potential for improving power by resolving phase ambiguity. Conclusion: Instead of using a relative efficiency measure for cases only, controls only, or their combined data, we link the uncertainty measure to case-control studies directly. Hence, our global efficiency measure might be useful to assess whether data are informative or have enough power for estimation of a specific haplotype risk.

  19. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows), and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, given the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique
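
    A minimal sketch of the rainfall-multiplier treatment described above, assuming synthetic rainfall, an assumed period split, and a log-normal prior on the latent multipliers; none of these stand-ins come from the Zenne study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Daily rainfall series and boundaries of independent rainfall periods
rain = rng.gamma(shape=0.4, scale=8.0, size=365)   # mm/day (synthetic)
period_edges = [0, 60, 150, 240, 365]              # assumed storm-period split

def apply_multipliers(rain, edges, multipliers):
    """Adjust rainfall with one latent multiplier per independent period,
    mimicking a multiplicative input-error treatment."""
    out = rain.copy()
    for (a, b), m in zip(zip(edges[:-1], edges[1:]), multipliers):
        out[a:b] *= m
    return out

# One candidate draw of the latent multipliers (log-normal around 1, assumed prior)
multipliers = rng.lognormal(mean=0.0, sigma=0.2, size=len(period_edges) - 1)
rain_adjusted = apply_multipliers(rain, period_edges, multipliers)
```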

  20. Characterizing interspecies uncertainty using data from studies of anti-neoplastic agents in animals and humans

    Price, Paul S.; Keenan, Russell E.; Swartout, Jeffrey C.

    2008-01-01

    For most chemicals, the Reference Dose (RfD) is based on data from animal testing. The uncertainty introduced by the use of animal models has been termed interspecies uncertainty. The magnitude of the differences between the toxicity of a chemical in humans and test animals, and its uncertainty, can be investigated by evaluating the inter-chemical variation in the ratios of the doses associated with similar toxicological endpoints in test animals and humans. This study performs such an evaluation on a data set of 64 anti-neoplastic drugs. The data set provides matched responses in humans and four species of test animals: mice, rats, monkeys, and dogs. While the data have a number of limitations, they show that when the drugs are evaluated on a body-weight basis: 1) toxicity generally increases with a species' body weight; however, humans are not always more sensitive than test animals; 2) the animal-to-human dose ratios were less than 10 for most, but not all, drugs; 3) the current practice of using data from multiple species when setting RfDs lowers the probability of having a large value for the ratio. These findings provide insight into inter-chemical variation in animal-to-human extrapolations and suggest the need for additional collection and analysis of matched toxicity data in humans and test animals

  1. Evaluating Uncertainty of Runoff Simulation using SWAT model of the Feilaixia Watershed in China Based on the GLUE Method

    Chen, X.; Huang, G.

    2017-12-01

    In recent years, distributed hydrological models have been widely used in storm water management, water resources protection, and so on. Therefore, how to evaluate the uncertainty of a model reasonably and efficiently has become a hot topic. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for the study area of China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range of the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results showed that, for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is smaller than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In the calibration and the validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. Therefore, it can be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is a way to effectively capture and evaluate the simulation uncertainty.
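
    For readers unfamiliar with GLUE, the sketch below shows the core loop on a hypothetical one-parameter model: sample the initial parameter range, keep parameter sets whose Nash-Sutcliffe efficiency exceeds a behavioral threshold, and form prediction bands from the behavioral ensemble. The model, threshold, and ranges are illustrative, not those of the Feilaixia study.

```python
import numpy as np

rng = np.random.default_rng(0)

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy stand-ins for SWAT and the gauge record: a one-parameter recession model
t = np.linspace(0.0, 10.0, 200)
def model(k):
    return 10.0 * np.exp(-k * t)
obs = model(0.35) + rng.normal(0.0, 0.3, t.size)

# GLUE: sample the initial parameter range, keep "behavioral" sets
k_samples = rng.uniform(0.05, 1.0, 10_000)   # initial range (assumed)
ns = np.array([nash_sutcliffe(obs, model(k)) for k in k_samples])
behavioral = k_samples[ns > 0.5]             # behavioral threshold (assumed)

# 95% prediction band from the behavioral ensemble
sims = np.stack([model(k) for k in behavioral])
lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)
print(f"{behavioral.size} behavioral sets; "
      f"band width at t=5: {(upper - lower)[100]:.2f}")
```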

  2. MO-C-17A-13: Uncertainty Evaluation of CT Image Deformable Registration for H and N Cancer Adaptive Radiotherapy

    Qin, A; Yan, D [William Beaumont Hospital, Royal Oak, MI (United States)

    2014-06-15

    Purpose: To evaluate uncertainties of organ-specific Deformable Image Registration (DIR) for H and N cancer Adaptive Radiation Therapy (ART). Methods: A commercial DIR evaluation tool, which includes a digital phantom library of 8 patients and the corresponding "Ground truth Deformable Vector Field" (GT-DVF), was used in the study. Each patient in the phantom library includes the GT-DVF created from a pair of CT images acquired prior to and at the end of the treatment course. Five DIR tools, including 2 commercial tools (CMT1, CMT2), 2 in-house tools (IH-FFD1, IH-FFD2), and a classic DEMON algorithm, were applied to the patient images. The resulting DVF was compared to the GT-DVF voxel by voxel. Organ-specific DVF uncertainty was calculated for 10 ROIs: Whole Body, Brain, Brain Stem, Cord, Lips, Mandible, Parotid, Esophagus and Submandibular Gland. Registration error-volume histograms were constructed for comparison. Results: The uncertainty is relatively small for brain stem, cord and lips, while large in the parotids and submandibular glands. CMT1 achieved the best overall accuracy (on whole body, mean vector error of 8 patients: 0.98±0.29 mm). For brain, mandible, right parotid, left parotid and submandibular gland, the classic DEMON algorithm achieved the lowest uncertainty (0.49±0.09, 0.51±0.16, 0.46±0.11, 0.50±0.11 and 0.69±0.47 mm, respectively). For brain stem, cord and lips, the DVF from CMT1 has the best accuracy (0.28±0.07, 0.22±0.08 and 0.27±0.12 mm, respectively). All algorithms show the largest right-parotid uncertainty for patient #7, whose images contain an artifact caused by a tooth implant. Conclusion: Uncertainty of deformable CT image registration depends strongly on the registration algorithm and is organ specific. Large uncertainty most likely appears at the location of soft-tissue organs far from the bony structures. Among all 5 DIR methods, the classic DEMON and CMT1 seem to be the best at limiting the uncertainty to within 2 mm for all OARs. Partially supported by

  3. Evaluation of pull production control strategies under uncertainty: An integrated fuzzy AHP-TOPSIS approach

    Aydin Torkabadi

    2018-03-01

    Purpose: Just-In-Time (JIT) production has continuously been considered by industrial practitioners and researchers as a leading strategy for the still-popular Lean production. Pull Production Control Policies (PPCPs) are the major enablers of JIT that locally control the level of inventory by authorizing the production in each station. Aiming to improve the PPCPs, three authorization mechanisms (Kanban, constant work-in-process (ConWIP), and a hybrid system) are evaluated under uncertainty. Design/methodology/approach: Multi-Criteria Decision Making (MCDM) methods are successful in evaluating alternatives with respect to several objectives. The proposed approach of this study applies fuzzy set theory together with an integrated Analytical Hierarchy Process (AHP) and Technique for Order Performance by Similarity to Ideal Solution (TOPSIS) method. Findings: The study finds that hybrid Kanban-ConWIP pull production control policies perform better in controlling the studied multi-layer, multi-stage manufacturing and assembly system. Practical implications: To examine the approach, a real case from the automotive electromechanical parts industry is studied. The production system consists of multiple levels of manufacturing, feeding a multi-stage assembly line with stochastic processing times to satisfy the changing demand. Originality/value: This study proposes integrated Kanban-ConWIP hybrid pull control policies and implements several alternatives on a multi-stage and multi-layer manufacturing and assembly production system. An integrated fuzzy AHP-TOPSIS method is developed to evaluate the alternatives with respect to several JIT criteria.
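
    A compact sketch of the (crisp) TOPSIS ranking step used in such evaluations; the alternatives, criteria, scores, and weights below are invented for illustration, and the paper's fuzzy AHP weighting stage is replaced here by fixed weights.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives by TOPSIS: relative closeness to the ideal solution.
    matrix: alternatives x criteria scores; benefit[j]: True if larger is better."""
    m = matrix / np.linalg.norm(matrix, axis=0)    # vector-normalize each criterion
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    nadir = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - nadir, axis=1)
    return d_neg / (d_pos + d_neg)                 # closeness in [0, 1]

# Hypothetical scores for Kanban / ConWIP / hybrid on three JIT criteria
# (WIP level, throughput, service level); numbers are illustrative only
scores = np.array([[12.0, 0.82, 0.90],
                   [10.0, 0.85, 0.92],
                   [ 9.0, 0.84, 0.95]])
weights = np.array([0.4, 0.3, 0.3])    # e.g., from an AHP weighting step
benefit = np.array([False, True, True])  # WIP is a cost criterion
print(topsis(scores, weights, benefit))  # higher closeness = preferred
```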

  4. Assessing Uncertainties in Gridded Emissions: A Case Study for Fossil Fuel Carbon Dioxide (FFCO2) Emission Data

    Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.

    2017-01-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference for deducing carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at the national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for the evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. This popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model, and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high-resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome could also help improve the knowledge of similar

  5. Propagation of uncertainties for an evaluation of the Azores-Gibraltar Fracture Zone tsunamigenic potential

    Antoshchenkova, Ekaterina; Imbert, David; Richet, Yann; Bardet, Lise; Duluc, Claire-Marie; Rebour, Vincent; Gailler, Audrey; Hébert, Hélène

    2016-04-01

    The aim of this study is to assess the tsunamigenic potential of the Azores-Gibraltar Fracture Zone (AGFZ). This work is part of the French project TANDEM (Tsunamis in the Atlantic and English ChaNnel: Definition of the Effects through numerical Modeling; www-tandem.cea.fr); special attention is paid to the French Atlantic coasts. Structurally, the AGFZ region is complex and not well understood. However, many of its faults produce earthquakes with significant vertical slip, of a type that can result in tsunami. We use the major tsunami event of the AGFZ to obtain a regional estimate of the tsunamigenic potential of this zone. The major reported event for this zone is the 1755 Lisbon event. There are large uncertainties concerning the source location and focal mechanism of this earthquake. Hence, a simple deterministic approach is not sufficient to cover, on the one hand, the whole AGFZ with its geological complexity and, on the other hand, the lack of information concerning the 1755 Lisbon tsunami. The parametric modeling environment Promethée (promethee.irsn.org/doku.php) was coupled to tsunami simulation software based on the shallow water equations for the propagation of uncertainties. Such a statistical point of view allows us to work with multiple hypotheses simultaneously. In our work we introduce the seismic source parameters in the form of distributions, thus producing a database of thousands of tsunami scenarios and tsunami wave height distributions. Exploring our tsunami scenario database, we present preliminary results for France. Tsunami wave heights (within one standard deviation of the mean) can be about 0.5 m - 1 m for the Atlantic coast and approaching 0.3 m for the English Channel.

  6. Evaluation of the uncertainty in an EBT3 film dosimetry system utilizing net optical density.

    Marroquin, Elsa Y León; Herrera González, José A; Camacho López, Miguel A; Barajas, José E Villarreal; García-Garduño, Olivia A

    2016-09-08

    Radiochromic film has become an important tool to verify dose distributions for intensity-modulated radiotherapy (IMRT) and quality assurance (QA) procedures. A new radiochromic film model, EBT3, has recently become available, whose composition and thickness of the sensitive layer are the same as those of previous EBT2 films. However, a matte polyester layer was added to EBT3 to prevent the formation of Newton's rings. Furthermore, the symmetrical design of EBT3 allows the user to eliminate side-orientation dependence. This film and the flatbed scanner, Epson Perfection V750, form a dosimetry system whose intrinsic characteristics were studied in this work. In addition, uncertainties associated with these intrinsic characteristics and the total uncertainty of the dosimetry system were determined. The analysis of the response of the radiochromic film (net optical density) and the fitting of the experimental data to a potential function yielded an uncertainty of 2.6%, 4.3%, and 4.1% for the red, green, and blue channels, respectively. In this work, the dosimetry system presents an uncertainty in resolving the dose of 1.8% for doses greater than 0.8 Gy and less than 6 Gy for the red channel. The films irradiated between 0 and 120 Gy show differences in the response when scanned in portrait or landscape mode; less uncertainty was found when using the portrait mode. The response of the film depended on the position on the bed of the scanner, contributing an uncertainty of 2% for the red, 3% for the green, and 4.5% for the blue channel when placing the film around the center of the scanner bed. Furthermore, the uniformity and reproducibility of the radiochromic film and the reproducibility of the scanner response contribute less than 1% to the overall uncertainty in dose. Finally, the total dose uncertainty was 3.2%, 4.9%, and 5.2% for the red, green, and blue channels, respectively. The above uncertainty values were obtained by minimizing the contribution to the total dose uncertainty
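
    A minimal sketch of the net-optical-density pipeline described above; the pixel values and the power-law ("potential") calibration constants a and n are illustrative assumptions, not the paper's fit.

```python
import numpy as np

def net_optical_density(pv_unexposed, pv_exposed, pv_background=0.0):
    """netOD = log10((I_unexposed - I_bg) / (I_exposed - I_bg)) from
    scanner pixel values of one color channel."""
    return np.log10((pv_unexposed - pv_background) / (pv_exposed - pv_background))

def dose_from_netod(netod, a, n):
    """Power-law calibration D = a * netOD**n fitted to calibration films;
    a and n are fit parameters (illustrative values used below)."""
    return a * netod ** n

# Illustrative 16-bit red-channel pixel values only (not the paper's data)
netod = net_optical_density(45000.0, 30000.0, 500.0)
print(f"netOD = {netod:.3f}, D = {dose_from_netod(netod, 25.0, 1.3):.2f} Gy")
```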

  8. Invited Article: Concepts and tools for the evaluation of measurement uncertainty

    Possolo, Antonio; Iyer, Hari K.

    2017-01-01

    Measurements involve comparisons of measured values with reference values traceable to measurement standards and are made to support decision-making. While the conventional definition of measurement focuses on quantitative properties (including ordinal properties), we adopt a broader view and entertain the possibility of regarding qualitative properties also as legitimate targets for measurement. A measurement result comprises the following: (i) a value that has been assigned to a property based on information derived from an experiment or computation, possibly also including information derived from other sources, and (ii) a characterization of the margin of doubt that remains about the true value of the property after taking that information into account. Measurement uncertainty is this margin of doubt, and it can be characterized by a probability distribution on the set of possible values of the property of interest. Mathematical or statistical models enable the quantification of measurement uncertainty and underlie the varied collection of methods available for uncertainty evaluation. Some of these methods have been in use for over a century (for example, as introduced by Gauss for the combination of mutually inconsistent observations or for the propagation of "errors"), while others are of fairly recent vintage (for example, Monte Carlo methods including those that involve Markov Chain Monte Carlo sampling). This contribution reviews the concepts, models, methods, and computations that are commonly used for the evaluation of measurement uncertainty, and illustrates their application in realistic examples drawn from multiple areas of science and technology, aiming to serve as a general, widely accessible reference.
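
    As a concrete instance of the Monte Carlo methods mentioned, here is a minimal propagation of state-of-knowledge distributions through a measurement model, in the spirit of GUM Supplement 1; the model Y = X1/X2 and the input distributions are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Measurement model Y = f(X1, X2) = X1 / X2 (illustrative measurand)
# State-of-knowledge distributions for the inputs (assumed):
x1 = rng.normal(10.00, 0.05, N)    # e.g., an indication, Gaussian
x2 = rng.uniform(0.98, 1.02, N)    # e.g., a correction factor, rectangular

y = x1 / x2    # propagate the full distributions, not just variances

estimate = y.mean()
std_uncertainty = y.std(ddof=1)
lo, hi = np.percentile(y, [2.5, 97.5])    # 95% coverage interval
print(f"y = {estimate:.3f}, u(y) = {std_uncertainty:.3f}, "
      f"95% interval [{lo:.3f}, {hi:.3f}]")
```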

  9. Evaluation of cross-section uncertainties using physical constraints for 238U, 239Pu

    De Saint Jean, Cyrille; Privas, Edwin; Archier, Pascal; Noguere, Gilles; Litaize, Olivier; Leconte, Pierre; Bernard, David

    2014-01-01

    Neutron-induced reactions between 0 eV and 20 MeV are evaluated from various sources of information, such as nuclear reaction models and microscopic and integral measurements. Most of the time, the evaluation work is done independently for the resolved resonance range and the continuum, giving rise to mismatches in the cross-sections, larger uncertainties at the domain boundary, and no cross-correlation between the high-energy domain and the resonance range. In addition, the use of integral experiments is sometimes related only to central values (the evaluation is 'working fine' on a dedicated set of benchmarks), and reductions of uncertainties on the cross-sections themselves are not straightforward: 'working fine' should be mathematically reflected by a reduced uncertainty. As the CIELO initiative is to bring together experts in each field to propose and discuss these matters, after presenting the status of the 238U and 239Pu cross-section covariance evaluations (for JEFF-3.2 as well as the WPEC SG34 subgroup), this paper presents several methodologies that may be used to avoid such effects on covariances. A first idea, based on the use of experiments overlapping two energy domains, appeared in the recent past; it was reviewed and extended to the use of systematic uncertainties (normalisation, for example) and to integral experiments as well. In addition, we propose a methodology taking into account physical constraints on an overlapping energy domain where both nuclear reaction models are used (continuity of both cross-sections and derivatives, for example). The use of Lagrange multipliers (related to these constraints) in a classical generalised least-squares procedure is presented, as sketched below. Some academic examples are then given for both point-wise and multi-group cross-sections to illustrate the methodologies. In addition, new results for 239Pu are presented in the resonance range and at higher energies, reducing capture and fission cross-section uncertainties by using integral experiments (JEZEBEL experiment as
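
    A sketch, in our own notation (not the paper's), of the constrained generalised least-squares step referred to above, with a linearised continuity constraint handled by Lagrange multipliers:

```latex
% Generalised least squares with a linear(ised) continuity constraint A x = b,
% e.g. matching cross-sections and their derivatives at the energy-domain boundary.
\begin{align*}
\min_{x}\; & (x - x_0)^{\mathsf T} M_0^{-1} (x - x_0)
             + \bigl(y - f(x)\bigr)^{\mathsf T} M_y^{-1} \bigl(y - f(x)\bigr)
  \quad \text{s.t.}\quad A x = b,\\[4pt]
\mathcal{L}(x,\lambda) \;=\;& (x - x_0)^{\mathsf T} M_0^{-1} (x - x_0)
             + \bigl(y - f(x)\bigr)^{\mathsf T} M_y^{-1} \bigl(y - f(x)\bigr)
             + 2\,\lambda^{\mathsf T}(A x - b),
\end{align*}
% x_0, M_0: prior parameters and covariance; y, M_y: experimental data and
% covariance; lambda: Lagrange multipliers. Setting dL/dx = 0 and dL/dlambda = 0
% yields the constrained parameter update and its (constrained) covariance.
```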

  10. Impact of geometric uncertainties on evaluation of treatment techniques for prostate cancer

    Craig, Tim; Wong, Eugene; Bauman, Glenn; Battista, Jerry; Van Dyk, Jake

    2005-01-01

    Purpose: To assess the impact of patient repositioning and internal organ motion on prostate treatment plans using three-dimensional conformal and intensity-modulated radiotherapy. Methods and materials: Four-field, six-field, and simplified intensity-modulated arc therapy plans were generated for 5 prostate cancer patients. The planning target volume was created by adding a 1-cm margin to the clinical target volume. A convolution model was used to estimate the effect of random geometric uncertainties during treatment. Dose statistics, tumor control probabilities, and normal tissue complication probabilities were compared with and without the presence of uncertainty. The impact of systematic uncertainties was also investigated. Results: Compared with the planned treatments, the delivered dose distribution with random geometric uncertainties displayed an increase in the apparent minimal dose to the prostate and seminal vesicles and a decrease in the rectal volume receiving a high dose. This increased the tumor control probabilities and decreased the normal tissue complication probabilities. Changes were seen in the percentage of prostate volume receiving 100% and 95% of the prescribed dose, and the minimal dose and tumor control probabilities for the target volume. In addition, the volume receiving at least 65 Gy, the minimal dose, and normal tissue complication probabilities changed considerably for the rectum. The simplified intensity-modulated arc therapy technique was the most sensitive to systematic errors, especially in the anterior-posterior and superior-inferior directions. Conclusion: Geometric uncertainties should be considered when evaluating treatment plans. Contrary to the widely held belief, increased conformation of the dose distribution is not always associated with increased sensitivity to random geometric uncertainties if a sufficient planning target volume margin is used. Systematic errors may have a variable effect, depending on the treatment
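
    A one-dimensional sketch of the convolution model mentioned in the abstract: the planned dose profile is blurred with a Gaussian kernel whose width is the standard deviation of the random geometric error. The profile and sigma are illustrative assumptions, not the study's data.

```python
import numpy as np

x = np.arange(-60.0, 60.0, 1.0)                  # mm
dose = np.where(np.abs(x) <= 35.0, 1.0, 0.0)     # idealised flat target dose

sigma = 3.0                                      # mm, random error SD (assumed)
kernel = np.exp(-0.5 * (x / sigma) ** 2)
kernel /= kernel.sum()                           # normalise to a probability kernel

# Expected (blurred) dose: convolution of the static dose with the error PDF
dose_expected = np.convolve(dose, kernel, mode="same")
print(f"dose at the field edge drops from {dose[x == 35.0][0]:.2f} planned "
      f"to {dose_expected[x == 35.0][0]:.2f} expected")
```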

  11. Steel bridges structural health monitoring based on operational modal analysis accommodating evaluation of uncertainty

    Saeid Jahan

    2017-11-01

    Structural damage detection is based on the premise that the dynamic response of a structure changes because of damage. Hence, it is possible to estimate the location and severity of damage from the changes in the dynamic response before and after the damage. In this study, a genetic fuzzy system has been used for bridge structural health monitoring. A key objective of using genetic algorithms is to automate the design of fuzzy systems. The method is used for damage detection of a single-span railway bridge with steel girders and of a concrete bridge. To study damage detection, numerical models of these two bridges are built with the measured dynamic characteristics. A three-dimensional finite element model and a two-dimensional girder model of the bridge have been constructed to study the usefulness of the genetic fuzzy system for damage detection and the effectiveness of the modeling. After the analysis, to account for uncertainties, the measured frequencies are contaminated with some noise and the effect of this on the performance of the damage detection method is evaluated. The present study has shown that the natural frequency has appropriate sensitivity to different damage scenarios in the structure. In addition, the natural frequency, in comparison with other modal parameters, is less affected by random noise. Increasing the number of measurement modes and using torsional modes will lead to an accurate damage diagnosis even in symmetrical structures.

  12. Information Synthesis in Uncertainty Studies: Application to the Analysis of the BEMUSE Results

    Baccou, J.; Chojnacki, E.; Destercke, S.

    2013-01-01

    mathematical framework, the more time-consuming the propagation should be. Therefore, the key point here is to construct a numerical treatment for uncertainty propagation which reduces the computational cost and can be applied to complex models used in practice. In nuclear safety studies, different uncertainty analyses using different codes and involving different experts are generally performed. Deriving benefits from these analyses appears to be a problem of information synthesis, which is the third key issue. Indeed, each uncertainty study can be viewed as an information source on quantities of interest. It is then useful to define formal methods to combine all these information sources in order to improve the reliability of the results and to detect possible conflicts (if any) between the sources. The efficiency of an uncertainty analysis requires a reliable quantification of the information associated with uncertainty sources. This quantification is addressed in the fourth key issue. It consists in exploiting the information related to available experiments and to the code/experiment comparison to infer the uncertainty attached to the code input parameters. Therefore, the crucial points are the choice of an experimental database sufficiently representative and exhaustive of the considered phenomenon, and the construction of an efficient treatment to perform this inference. The first two points have been deeply studied in the frame of the OECD BEMUSE Program. In particular, it came out that statistical approaches, based on Monte-Carlo techniques, are now sufficiently robust for the evaluation of uncertainty on a LB-LOCA transient. In this paper, we focus on the third issue and present some recent developments proposed by IRSN to derive formal tools in order to improve the reliability of an analysis involving different information sources. It is applied to draw some important conclusions from the two BEMUSE benchmarks. For the sake of completeness, we recall that the last

  13. Nuclear data requirements for the ADS conceptual design EFIT: Uncertainty and sensitivity study

    Garcia-Herranz, N., E-mail: nuria@din.upm.e [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Cabellos, O. [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid (Spain); Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Alvarez-Velarde, F. [CIEMAT (Spain); Sanz, J. [Instituto de Fusion Nuclear, Universidad Politecnica de Madrid (Spain); Departamento de Ingenieria Energetica, UNED (Spain); Gonzalez-Romero, E.M. [CIEMAT (Spain); Juan, J. [Laboratorio de Estadistica, Universidad Politecnica de Madrid (Spain)

    2010-11-15

    In this paper, we assess the impact of activation cross-section uncertainties on relevant fuel cycle parameters for a conceptual design of a modular European Facility for Industrial Transmutation (EFIT) with a 'double strata' fuel cycle. Next, the nuclear data requirements are evaluated so that the parameters can meet the assigned design target accuracies. Different discharge burn-up levels are considered: a low burn-up, corresponding to the equilibrium cycle, and a high burn-up level, simulating the effects on the fuel of the multi-recycling scenario. In order to perform this study, we propose a methodology in two steps. Firstly, we compute the uncertainties on the system parameters by using a Monte Carlo simulation, as it is considered the most reliable approach to address this problem. Secondly, the analysis of the results is performed by a sensitivity technique, in order to identify the relevant reaction channels and prioritize the data improvement needs. Cross-section uncertainties are taken from the EAF-2007/UN library since it includes data for all the actinides potentially present in the irradiated fuel. Relevant uncertainties in some of the fuel cycle parameters have been obtained, and we conclude with recommendations for future nuclear data measurement programs, beyond the specific results obtained with the present nuclear data files and the limited available covariance information. A comparison with the uncertainty and accuracy analysis recently published by the WPEC-Subgroup26 of the OECD using BOLNA covariance matrices is performed. Despite the differences in the transmuter reactor used for the analysis, some conclusions obtained by Subgroup26 are qualitatively corroborated, and improvements for additional cross sections are suggested.

  14. Lived Experiences of "Illness Uncertainty" of Iranian Cancer Patients: A Phenomenological Hermeneutic Study.

    Sajjadi, Moosa; Rassouli, Maryam; Abbaszadeh, Abbas; Brant, Jeannine; Majd, Hamid Alavi

    2016-01-01

    For cancer patients, uncertainty is a pervasive experience and a major psychological stressor that affects many aspects of their lives. Uncertainty is a multifaceted concept, and its understanding for patients depends on many factors, including factors associated with various sociocultural contexts. Unfortunately, little is known about the concept of uncertainty in Iranian society and culture. This study aimed to clarify the concept and explain lived experiences of illness uncertainty in Iranian cancer patients. In this hermeneutic phenomenological study, 8 cancer patients participated in semistructured in-depth interviews about their experiences of uncertainty in illness. Interviews continued until data saturation was reached. All interviews were recorded, transcribed, analyzed, and interpreted using 6 stages of the van Manen phenomenological approach. Seven main themes emerged from patients' experiences of illness uncertainty of cancer. Four themes contributed to uncertainty including "Complexity of Cancer," "Confusion About Cancer," "Contradictory Information," and "Unknown Future." Two themes facilitated coping with uncertainty including "Seeking Knowledge" and "Need for Spiritual Peace." One theme, "Knowledge Ambivalence," revealed the struggle between wanting to know and not wanting to know, especially if bad news was delivered. Uncertainty experience for cancer patients in different societies is largely similar. However, some experiences (eg, ambiguity in access to medical resources) seemed unique to Iranian patients. This study provided an outlook of cancer patients' experiences of illness uncertainty in Iran. Cancer patients' coping ability to deal with uncertainty can be improved.

  15. Evaluation the sources of uncertainty associated to the measurement results of in vivo monitoring of iodine 131 in the thyroid

    Gontijo, Rodrigo Modesto Gadelha

    2011-01-01

    In vivo monitoring techniques consist of the identification and quantification of radionuclides present in the whole body and in specific organs and tissues. In vivo monitoring requires the use of detectors which are sensitive to the radiation emitted by the radionuclides present in the monitored individual. The results obtained in measurements may present small uncertainties which are within pre-set limits in monitoring programs for occupationally exposed individuals. However, any device used to determine physical quantities presents uncertainties in the measured values. The total uncertainty of a measurement result is estimated from the propagation of the uncertainties associated with each parameter of the calculation. This study aims to evaluate the sources of uncertainty associated with the measurement results of in vivo monitoring of iodine-131 in the thyroid, in comparison with those suggested in the General Guide for Estimating Effective Doses from Monitoring Data (Project IDEAS/European Community). The reference values used were the ones for high-energy photons (>100 keV). The measurement uncertainties were divided into two categories: type A and type B. The type A component represents the statistical fluctuation in the counting of the standard source. Regarding type B, the following variations were considered: detector positioning over the phantom; variation of background radiation; thickness of the overlying tissue over the monitored organ; and distribution of the activity in the organ. Besides the parameters suggested by the IDEAS Guide, the fluctuation of the counting due to phantom repositioning, which represents the reproducibility of the measurement geometry, was also evaluated. Measurements were performed at the Whole Body Counter Unit of IRD using a NaI(Tl) 3x3 scintillation detector and a neck-thyroid phantom developed at LABMIVIRD. Scattering factors were calculated and compared for different counting geometries. The results of this study show that the

  16. An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.

    Aven, Terje; Renn, Ortwin

    2015-04-01

    Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments, the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation for the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given too narrow a definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.

  17. On the evaluation of uncertainties for state estimation with the Kalman filter

    Eichstädt, S; Makarava, N; Elster, C

    2016-01-01

    The Kalman filter is an established tool for the analysis of dynamic systems with normally distributed noise, and it has been successfully applied in numerous areas. It provides sequentially calculated estimates of the system states along with a corresponding covariance matrix. For nonlinear systems, the extended Kalman filter is often used. This is derived from the Kalman filter by linearization around the current estimate. A key issue in metrology is the evaluation of the uncertainty associated with the Kalman filter state estimates. The ‘Guide to the Expression of Uncertainty in Measurement’ (GUM) and its supplements serve as the de facto standard for uncertainty evaluation in metrology. We explore the relationship between the covariance matrix produced by the Kalman filter and a GUM-compliant uncertainty analysis. In addition, the results of a Bayesian analysis are considered. For the case of linear systems with known system matrices, we show that all three approaches are compatible. When the system matrices are not precisely known, however, or when the system is nonlinear, this equivalence breaks down and different results can then be reached. For precisely known nonlinear systems, though, the result of the extended Kalman filter still corresponds to the linearized uncertainty propagation of the GUM. The extended Kalman filter can suffer from linearization and convergence errors. These disadvantages can be avoided to some extent by applying Monte Carlo procedures, and we propose such a method which is GUM-compliant and can also be applied online during the estimation. We illustrate all procedures in terms of a 2D dynamic system and compare the results with those obtained by particle filtering, which has been proposed for the approximate calculation of a Bayesian solution. Finally, we give some recommendations based on our findings. (paper)
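
    For reference, a minimal linear Kalman filter predict/update cycle showing the covariance matrix whose uncertainty interpretation is discussed above; the constant-velocity system and noise levels are illustrative assumptions.

```python
import numpy as np

def kalman_step(x, P, z, F, Q, H, R):
    """One predict/update cycle of the linear Kalman filter, returning the
    state estimate and its covariance matrix."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative constant-velocity model; position measured with noise
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = 1e-4 * np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.01]])

x, P = np.zeros(2), np.eye(2)
rng = np.random.default_rng(3)
for k in range(50):
    z = np.array([0.5 * k * dt + rng.normal(0.0, 0.1)])
    x, P = kalman_step(x, P, z, F, Q, H, R)
print("state:", x, "std uncertainties:", np.sqrt(np.diag(P)))
```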

  18. Evaluation of uncertainty in the measurement of sense of natural language constructions

    Bisikalo Oleg V.

    2017-01-01

    The task of evaluating uncertainty in the measurement of sense in natural language constructions (NLCs) was researched through formalization of the notion of the language image, formalization of artificial cognitive systems (ACSs), and formalization of units of meaning. The method for measuring the sense of natural language constructions incorporates fuzzy relations of meaning, which ensures that information about the links between lemmas of the text is taken into account and permits the evaluation of two types of measurement uncertainty of sense characteristics. Using the developed application programs, experiments were conducted to investigate the proposed method for identifying informative characteristics of text. The experiments yielded parameter dependencies that allow the Pareto distribution law to be used to describe relations between lemmas; their analysis identifies the exponent of the average number of connections of the language image as the most informative characteristic of text.

  19. Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system

    Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon

    2000-06-01

    MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and via core follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation is focused on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. Also, it is verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, which is a probabilistic nuclear calculation code.

  1. Simple way to avoid underestimating uncertainties of the evaluated values for sets of consistent data: a proposal for an improvement of the evaluations

    Chechev, V.P.

    2001-01-01

    To avoid underestimating the uncertainty of the evaluated values for sets of consistent data, the following rule is proposed: if the smallest of the input measurement uncertainties (σ_min) is larger than the uncertainty obtained from statistical data processing, then σ_min should be used as the final uncertainty of the evaluated value. This rule is justified by the fact that almost any measurement is indirect and the total uncertainty of any precise measurement consists mainly of the systematic error of the measurement method. Exceptions can be made only for measured data obtained by essentially different methods (for example, half-life measurements by calorimetry and by specific activity determination)
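
    A minimal sketch of the proposed rule, assuming an inverse-variance weighted mean as the "statistical data processing" step; the half-life values and uncertainties below are hypothetical.

```python
# Minimal sketch of the sigma_min floor rule (illustrative values, not
# evaluated nuclear data): weighted mean of consistent measurements, with
# the final uncertainty floored at the smallest input uncertainty.
import numpy as np

values = np.array([12.70, 12.74, 12.72])   # hypothetical half-life data
sigmas = np.array([0.05, 0.08, 0.06])      # stated input uncertainties

w = 1.0 / sigmas**2
mean = np.sum(w * values) / np.sum(w)
sigma_stat = np.sqrt(1.0 / np.sum(w))      # uncertainty from statistics alone

# Proposed rule: never report less than the best single measurement claims
sigma_final = max(sigma_stat, sigmas.min())
print(f"evaluated value = {mean:.3f} +/- {sigma_final:.3f}")
```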

  2. Evaluation of seismic induced CDF and ΔCDF with considering the uncertainty reduction research results

    Hahm, Daegi; Choi, In Kil

    2012-01-01

    In the seismic probabilistic safety assessment (SPSA) of nuclear power plants (NPPs), an efficient and rational methodology for dealing with the uncertainty factors is required to increase the reliability of the SPSA results. To reduce the uncertainties in the SPSA approach, many research activities were performed by the Korea Atomic Energy Research Institute (KAERI) during the last five years under the mid- and long-term nuclear research and development program of the Ministry of Education, Science and Technology. These outcomes can be applied to the update or re-evaluation of previous NPP SPSA results. In this study, we applied these uncertainty reduction research results to the update of the SPSA procedure of the target reference plant, i.e., the Ulchin unit 5/6 NPP. The refined topics in the SPSA procedure are the seismic fragility, the seismic hazard, and the risk quantification. The detailed process and results are described in the following sections

  3. Evaluation of uncertainties treatment of DNBR calculation for Angra-1 reactor core

    Pontedeiro, A.C.; Galetti, M.R.S.

    1986-01-01

    The results of a DNBR sensitivity analysis for NPP Angra 1 are presented in this report. The sensitivity study was carried out using the computer code COBRAIIIP, and all sensitivity factors were calculated with the nominal condition as the reference case. These sensitivity factors were used, according to the Westinghouse 'Improved Thermal Design Procedure' methodology, to calculate a statistical uncertainty factor. In this methodology the best-estimate DNBR is penalized by the uncertainty factor and compared with a statistical limit on the minimum DNBR. Westinghouse has been using this statistical uncertainty treatment in core thermal design to obtain better operational flexibility of the plant while keeping the same design basis established in the Angra 1 FSAR methodology. (Author) [pt]

  4. Analysis of aerosol optical depth evaluation in polar regions and associated uncertainties

    P. Ortiz de Galisteo

    2008-04-01

    Several available processing algorithms used to calculate the aerosol optical depth from radiometric measurements were tested. The aim was to evaluate the uncertainties associated with the data processing in polar regions, in order to adjust the calculation methodology and to illustrate the importance of these error sources. The measurements were obtained during a sun photometer campaign in Ny-Ålesund within the framework of the POLAR-AOD project.

  5. Survey of radiofrequency radiation levels around GSM base stations and evaluation of measurement uncertainty

    Vulević Branislav D.

    2011-01-01

    This paper is a summary of broadband measurement values of radiofrequency radiation around GSM base stations in the vicinity of residential areas in Belgrade and 12 other cities in Serbia. It will be useful for determining non-ionizing radiation exposure levels of the general public in the future. A further purpose of this paper is to present basic information on the evaluation of measurement uncertainty.

  6. Evaluation of uncertainties in irradiated hardware characterization: Final report, September 30, 1986-March 31, 1987

    Bedore, N.; Levin, A.; Tuite, P.

    1987-10-01

    Waste Management Group, Inc. has evaluated the techniques used by industry to characterize and classify irradiated hardware components for disposal. This report describes the current practices used to characterize the radionuclide content of hardware components, identifies the uncertainties associated with the techniques and practices considered, and recommends areas for improvement which could reduce uncertainty. Industry uses two different characterization methods. The first uses a combination of gamma scanning, direct sampling, underwater radiation profiling and radiochemical analysis to determine radionuclide content, while the second uses a form of activation analysis in conjunction with underwater radiation profiling. Both methods involve the determination of cobalt-60 content and the determination of scaling factors for hard-to-detect Part 61 radionuclides. The accurate determination of cobalt-60 is critical, since the Part 61 activation product radionuclides which affect Part 61 classification are scaled from cobalt-60. Current uncertainties in cobalt-60 determination can be reduced by improving underwater radiation profiling equipment and techniques. The calculational techniques used for activation analysis can also be refined to reduce the uncertainties in cobalt-60 determination. 33 refs., 11 figs., 10 tabs

  7. Evaluation of long-term RD&D programs in the presence of market uncertainties

    Hazelrigg, G.A. Jr.

    1982-01-01

    Long-term research, development, and demonstration (RD&D) programs such as fusion research can span several decades, progressing through a number of discrete RD&D phases. Pursuit of a technology such as fusion does not mean commitment to the entire RD&D program, but only to the next phase of RD&D. The evaluation of a long-term RD&D program must account for the decision process to continue, modify, or discontinue the program upon completion of each RD&D phase, the technological uncertainties inherent in a long-term RD&D program, and the uncertainty inherent in the future marketplace for the technology if and when it becomes available. Presented here is a methodology that does this. An application of the methodology to fusion research is included. The example application shows that the perceived economic value of fusion research is strongly dependent on market uncertainty, with increasing market uncertainty yielding greatly increased perceived value to the research effort. 7 references, 8 figures, 2 tables

  8. Major Results of the OECD BEMUSE (Best Estimate Methods, Uncertainty and Sensitivity Evaluation) Programme

    Reventos, F.

    2008-01-01

    One of the goals of computer code models of Nuclear Power Plants (NPPs) is to demonstrate that these plants are designed to respond safely to postulated accidents. Models and codes are an approximation of the real physical behaviour occurring during a hypothetical transient, and the data used to build these models are also known only with a certain accuracy. Therefore code predictions are uncertain. The BEMUSE programme is focussed on the application of uncertainty methodologies to large break LOCAs. The programme intends to evaluate the practicability, quality and reliability of best-estimate methods including uncertainty evaluations in applications relevant to nuclear reactor safety, to develop common understanding, and to promote and facilitate their use by regulatory bodies and industry. In order to fulfil its objectives, BEMUSE is organized into two steps and six phases. The first step is devoted to the complete analysis of a LB-LOCA (L2-5) in an experimental facility (LOFT), while the second step refers to an actual Nuclear Power Plant. Both steps provide results on thermal-hydraulic best-estimate simulation as well as uncertainty and sensitivity evaluation. At the time this paper was prepared, phases I, II and III were fully completed and the corresponding reports had been issued. The Phase IV draft report is currently being reviewed, while participants are working on Phase V developments. Phase VI consists in preparing the final status report, which will summarize the most relevant results of the whole programme.

  9. Uncertainty evaluation of a regional real-time system for rain-induced landslides

    Kirschbaum, Dalia; Stanley, Thomas; Yatheendradas, Soni

    2015-04-01

    A new prototype regional model and evaluation framework has been developed over the Central America and Caribbean region using satellite-based information, including precipitation estimates, modeled soil moisture and topography, and soils data, as well as regionally available datasets such as road networks and distance to fault zones. The algorithm framework incorporates three static variables: a susceptibility map, a 24-hr rainfall triggering threshold, and an antecedent soil moisture threshold, which have been calibrated using historic landslide events. The thresholds are regionally heterogeneous and are based on the percentile distribution of the rainfall or antecedent moisture time series. A simple decision-tree algorithm framework integrates all three variables with the rainfall and soil moisture time series and generates a landslide nowcast in real time, based on the previous 24 hours, over this region. This system has been evaluated using several available landslide inventories over the Central America and Caribbean region. Spatiotemporal uncertainty and evaluation metrics of the model are presented here based on available landslide reports. This work also presents a probabilistic representation of potential landslide activity over the region, which can be used to further refine and improve the real-time landslide hazard assessment system as well as to better identify and characterize the uncertainties inherent in this type of regional approach. The landslide algorithm provides a flexible framework to improve hazard estimation and reduce uncertainty at any spatial and temporal scale.
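
    The decision-tree logic described above can be sketched roughly as follows; the threshold values, variable names and the 0.5 susceptibility cutoff are illustrative assumptions, not the calibrated regional thresholds of the actual system.

```python
# Minimal sketch of a susceptibility + rainfall + antecedent-moisture
# decision-tree nowcast; all names and numbers are illustrative placeholders.
def landslide_nowcast(susceptibility, rain_24h, rain_p95,
                      antecedent_moisture, moisture_p90):
    """Issue a nowcast when a susceptible cell exceeds both percentile
    thresholds from its local rainfall / soil-moisture climatology."""
    if susceptibility < 0.5:          # cell not susceptible: never nowcast
        return False
    if rain_24h <= rain_p95:          # 24-h rainfall below trigger percentile
        return False
    return antecedent_moisture > moisture_p90

print(landslide_nowcast(0.8, rain_24h=75.0, rain_p95=60.0,
                        antecedent_moisture=0.42, moisture_p90=0.35))  # True
```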

  10. A multi-subject evaluation of uncertainty in anatomical landmark location on shoulder kinematic description.

    Langenderfer, Joseph E; Rullkoetter, Paul J; Mell, Amy G; Laz, Peter J

    2009-04-01

    An accurate assessment of shoulder kinematics is useful for understanding healthy normal and pathological mechanics. Small variability in identifying and locating anatomical landmarks (ALs) has the potential to affect reported shoulder kinematics. The objectives of this study were to quantify the effect of landmark location variability on scapular and humeral kinematic descriptions for multiple subjects using probabilistic analysis methods, and to evaluate the consistency in results across multiple subjects. Data from 11 healthy subjects performing humeral elevation in the scapular plane were used to calculate Euler angles describing humeral and scapular kinematics. Probabilistic analyses were performed for each subject to simulate uncertainty in the locations of 13 upper-extremity ALs. For standard deviations of 4 mm in landmark location, the analysis predicted Euler angle envelopes between the 1st and 99th percentile bounds of up to 16.6 degrees. While absolute kinematics varied with the subject, the average 1-99% kinematic ranges for the motion were consistent across subjects, and sensitivity factors showed no statistically significant differences between subjects. The description of humeral kinematics was most sensitive to the location of landmarks on the thorax, while landmarks on the scapula had the greatest effect on the description of scapular elevation. The findings of this study can provide a better understanding of kinematic variability, which can aid in making accurate clinical diagnoses and refining kinematic measurement techniques.

  11. Supporting qualified database for V&V and uncertainty evaluation of best-estimate system codes

    Petruzzi, A.; D'Auria, F.

    2014-01-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. Those uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging, on the one side, suitable experimental data and, on the other side, qualified code calculation results. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can be used also in the framework of the first approach. Namely, the paper discusses the features and structure of the database, which includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (with a possible new evaluation of pressure and heat losses if needed) and the specific boundary and initial conditions; 3. The 'QP' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of the steady-state conditions, and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4. The EH (Engineering

  12. Evaluation of the combined measurement uncertainty in isotope dilution by MC-ICP-MS

    Fortunato, G.; Wunderli, S.

    2003-01-01

    The combination of metrological weighing, the measurement of isotope amount ratios by a multicollector inductively coupled plasma mass spectrometer (MC-ICP-MS) and the use of high-purity reference materials are the cornerstones for achieving improved results for the amount content of lead in wine by the reversed isotope dilution technique. Isotope dilution mass spectrometry (IDMS) and reversed IDMS have the potential to be a so-called primary method, with which close comparability and well-stated combined measurement uncertainties can be obtained. This work describes the detailed uncertainty budget determination using the ISO-GUM approach. The traces of lead in wine were separated from the matrix by ion exchange chromatography after HNO3/H2O2 microwave digestion. The thallium isotope amount ratio n(205Tl)/n(203Tl) was used to correct for mass discrimination using an exponential model approach. The corrected lead isotope amount ratio n(206Pb)/n(208Pb) for the isotopic standard SRM 981 measured in our laboratory was compared with the ratio values considered to be the least uncertain. The result was compared in a so-called pilot study 'lead in wine' organised by the CCQM (Comité Consultatif pour la Quantité de Matière, BIPM, Paris; the highest measurement authority for analytical chemical measurements). The result for the lead amount content k(Pb) and the corresponding expanded uncertainty U given by our laboratory was: k(Pb) = 1.329 x 10^-10 mol g^-1 (amount content of lead in wine); U[k(Pb)] = 1.0 x 10^-12 mol g^-1 (expanded uncertainty U = k x u_c, k = 2). The main influence parameter on the combined measurement uncertainty was determined to be the isotope amount ratio R(206,B) of the blend between the enriched spike and the sample. (orig.)
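
    The GUM (first-order) propagation underlying such a budget can be sketched with the third-party Python `uncertainties` package, which implements linear uncertainty propagation through arithmetic expressions. The simplified single-ratio IDMS equation and all input values below are illustrative assumptions, not the paper's wine data.

```python
# Minimal sketch of GUM-style (first-order) propagation through a simplified
# single-ratio IDMS equation; all inputs are illustrative, not measured data.
from uncertainties import ufloat

k_y = ufloat(2.000e-9, 2e-12)   # mol/g, spike amount content
m_y = ufloat(0.5000, 0.0002)    # g, spike mass
m_x = ufloat(5.0000, 0.0005)    # g, sample mass
R_B = ufloat(1.10, 0.004)       # blend isotope amount ratio (dominant term)
R_y = ufloat(25.0, 0.05)        # spike isotope amount ratio
R_x = ufloat(0.40, 0.002)       # sample isotope amount ratio

# Simplified IDMS measurement equation for the sample amount content
k_x = k_y * (m_y / m_x) * (R_y - R_B) / (R_B - R_x)

print(f"k(Pb) = {k_x} mol/g")                      # value with u_c
print(f"expanded U (k=2) = {2 * k_x.std_dev:.2e} mol/g")
```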

  13. Understanding uncertainty

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." (Journal of Applied Statistics). The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  14. Managing uncertainty during R&D projects: a case study

    Wouters, Marc; Roorda, Berend; Gal, Ruud

    2011-01-01

    Firms make significant investments in R&D projects, yet the economic return is often difficult to predict because of significant technological and commercial uncertainty. We present an innovative and practical method for managing R&D projects, and we discuss its application to a large R&D

  15. Bayesian uncertainty analysis for complex systems biology models: emulation, global parameter searches and evaluation of gene functions.

    Vernon, Ian; Liu, Junli; Goldstein, Michael; Rowe, James; Topping, Jen; Lindsey, Keith

    2018-01-02

    Many mathematical models have now been employed across every area of systems biology. These models increasingly involve large numbers of unknown parameters, have complex structure which can result in substantial evaluation time relative to the needs of the analysis, and need to be compared to observed data of various forms. The correct analysis of such models usually requires a global parameter search, over a high dimensional parameter space, that incorporates and respects the most important sources of uncertainty. This can be an extremely difficult task, but it is essential for any meaningful inference or prediction to be made about any biological system. It hence represents a fundamental challenge for the whole of systems biology. Bayesian statistical methodology for the uncertainty analysis of complex models is introduced, which is designed to address the high dimensional global parameter search problem. Bayesian emulators that mimic the systems biology model but which are extremely fast to evaluate are embedded within an iterative history match: an efficient method to search high dimensional spaces within a more formal statistical setting, while incorporating major sources of uncertainty. The approach is demonstrated via application to a model of hormonal crosstalk in Arabidopsis root development, which has 32 rate parameters, for which we identify the sets of rate parameter values that lead to acceptable matches between model output and observed trend data. The multiple insights into the model's structure that this analysis provides are discussed. The methodology is applied to a second related model, and the biological consequences of the resulting comparison, including the evaluation of gene functions, are described. Bayesian uncertainty analysis for complex models using both emulators and history matching is shown to be a powerful technique that can greatly aid the study of a large class of systems biology models. It both provides insight into model behaviour
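
    The core of history matching is the implausibility measure, which compares emulator output with an observation while pooling emulator, observation and model-discrepancy variances. The sketch below is a minimal illustration: the "emulator" is a stand-in analytic surrogate rather than a trained Bayesian emulator, and the variances, observation and 3-sigma cutoff are assumed values.

```python
# Minimal sketch of one history-matching wave via an implausibility cutoff;
# the emulator and all numbers are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(1)

z = 3.2            # observed value
var_obs = 0.05     # observational error variance
var_disc = 0.10    # model discrepancy variance

def emulator(x):
    """Stand-in emulator: expectation and variance of model output at x."""
    mean = x[..., 0] ** 2 + np.sin(x[..., 1])    # fast surrogate of the model
    var = 0.02 * np.ones_like(mean)              # emulator (code) uncertainty
    return mean, var

def implausibility(x):
    mean, var_em = emulator(x)
    return np.abs(z - mean) / np.sqrt(var_em + var_obs + var_disc)

# Wave 1: sample the 2-D input space, keep the "non-implausible" region
X = rng.uniform(-2, 2, size=(100_000, 2))
keep = implausibility(X) < 3.0                   # conventional 3-sigma cutoff
print(f"non-implausible fraction: {keep.mean():.3f}")
```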

  16. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
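
    The four-to-one requirement mentioned above amounts to a simple ratio check; a minimal sketch, where the function name and the numeric values are illustrative rather than taken from the original report:

```python
# Minimal sketch of the 4:1 ratio check between instrument specification and
# calibrating-standard uncertainty; names and values are illustrative.
def meets_four_to_one(instrument_spec: float, standard_uncertainty: float) -> bool:
    """True if the spec-to-standard-uncertainty ratio is at least 4:1."""
    return instrument_spec / standard_uncertainty >= 4.0

print(meets_four_to_one(0.20, 0.04))  # True: ratio is 5:1
print(meets_four_to_one(0.20, 0.07))  # False: ~2.9:1, so compute the exact
                                      # calibration uncertainty instead
```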

  17. Evaluation and analysis of uncertainty in the information seeking behavior of medical post-graduate students

    Azami Mohammad

    2018-02-01

    This study aimed to explore and analyze uncertainty in the information seeking behavior of the students of Kerman University of Medical Sciences (KUMS), based on Kuhlthau's Information Search Process model. This is an applied research study. Data were gathered using a questionnaire. The research population included 1075 graduate students of KUMS at the M.Sc. and Ph.D. levels, from which a sample size of 263 people was estimated. The studied students reported feelings broadly similar to those described by Kuhlthau in her information search process model. Among the demographic variables, only gender affected the presentation stage: women performed better in this stage. Ph.D. students performed better than master's students when selecting their research topics; these two groups showed no clear differences in the other stages. Students with previous experience of research activities performed better in the title selection, literature exploration and presentation stages and also had lower uncertainty. The students' performance in the different stages decreased as their age increased; the effect of age on performance was considerable in the literature exploration and result presentation stages. The graduate students of KUMS follow the same stages as the Kuhlthau information search process model and have similar feelings. Uncertainty was felt by the graduate students of KUMS in the different stages of information search. Factors such as age, gender, level of education and previous experience affected the decrease or increase of uncertainty in some stages.

  18. Uncertainty Determination Methodology, Sampling Maps Generation and Trend Studies with Biomass Thermogravimetric Analysis

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    This paper investigates a method for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG analysis) for several lignocellulosic materials (ground olive stone, almond shell, pine pellets and oak pellets), completing previous work of the same authors. A comparison has been made between results of TG analysis and prompt analysis. Levels of uncertainty and errors were obtained, demonstrating that properties evaluated by TG analysis were representative of the overall fuel composition, and no correlation between prompt and TG analysis exists. Additionally, a study of trends and time correlations is indicated. These results are particularly interesting for biomass energy applications. PMID:21152292

  19. Uncertainty evaluation of data and information fusion within the context of the decision loop

    De Villiers, J Pieter

    2016-07-01

    Here, the uncertainties in the combination and decision parts of the information flow are considered. The objective of this paper is to make explicit how uncertainties that arise during design combine with uncertainties during runtime, as well...

  20. Uncertainties in environmental impact assessments due to expert opinion. Case study: radioactive waste in Slovenia

    Kontic, B.; Ravnik, M.

    1998-01-01

    A comprehensive study of the sources of uncertainty in long-term environmental impact assessment (EIA) was carried out at the J. Stefan Institute in Ljubljana and the School of Environmental Sciences in Nova Gorica. The research examined two main components: first, the methodology of the preparation of an EIA, and second, the validity of expert opinion. Following the findings of the research, a survey was performed on the regulator's assessment of the acceptability of a radioactive waste repository. The components of dose evaluation in different time frames were examined in terms of susceptibility to uncertainty. The uncertainty associated with human exposure in the far future is so large that dose and risk, as individual numerical indicators of safety, should in our opinion not be used in compliance assessment for a radioactive waste repository. On the other hand, results of the calculations on the amount and activity of low- and intermediate-level waste and the spent fuel from the Krsko NPP show that the experts' understanding of the treated questions can be expressed in a transparent way, giving credible output from the models used. (author)

  1. Centralised, decentralised or hybrid sanitation systems? Economic evaluation under urban development uncertainty and phased expansion.

    Roefs, Ivar; Meulman, Brendo; Vreeburg, Jan H G; Spiller, Marc

    2017-02-01

    Sanitation systems are built to be robust, that is, they are dimensioned to cope with population growth and other variability that occurs throughout their lifetime. It was recently shown that building sanitation systems in phases is more cost effective than one robust design. This phasing can take place by building small autonomous decentralised units that operate closer to the actual demand. Research has shown that variability and uncertainty in urban development does affect the cost effectiveness of this approach. Previous studies do not, however, consider the entire sanitation system from collection to treatment. The aim of this study is to assess the economic performance of three sanitation systems with different scales and systems characteristics under a variety of urban development pathways. Three systems are studied: (I) a centralised conventional activated sludge treatment, (II) a community on site source separation grey water and black water treatment and (III) a hybrid with grey water treatment at neighbourhood scale and black water treatment off site. A modelling approach is taken that combines a simulation of greenfield urban growth, a model of the wastewater collection and treatment infrastructure design properties and a model that translates design parameters into discounted asset lifetime costs. Monte Carlo simulations are used to evaluate the economic performance under uncertain development trends. Results show that the conventional system outperforms both of the other systems when total discounted lifetime costs are assessed, because it benefits from economies of scale. However, when population growth is lower than expected, the source-separated system is more cost effective, because of reduced idle capacity. The hybrid system is not competitive under any circumstance due to the costly double piping and treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Evaluation of global fine-resolution precipitation products and their uncertainty quantification in ensemble discharge simulations

    Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.

    2016-02-01

    The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are the Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and that GSMAP-MVK+ shows a clear advantage over TRMM3B42 in relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can have a similar magnitude of contribution to discharge uncertainty as the hydrological models. A
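
    The agreement statistics named above (RB, NSE, RMSE, CC) are standard and straightforward to compute; a minimal sketch with placeholder discharge arrays rather than the study's data:

```python
# Minimal sketch of the comparison statistics RB, NSE, RMSE and CC for
# simulated vs. observed discharge; the arrays are illustrative placeholders.
import numpy as np

obs = np.array([10.0, 14.0, 30.0, 22.0, 12.0])  # observed discharge (m3/s)
sim = np.array([11.0, 13.0, 26.0, 24.0, 12.5])  # simulated discharge (m3/s)

rb   = (sim.sum() - obs.sum()) / obs.sum()                        # relative bias
nse  = 1 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2) # Nash-Sutcliffe
rmse = np.sqrt(np.mean((sim - obs)**2))                           # RMS error
cc   = np.corrcoef(sim, obs)[0, 1]                                # correlation

print(f"RB={rb:+.3f}  NSE={nse:.3f}  RMSE={rmse:.2f} m3/s  CC={cc:.3f}")
```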

  3. Evaluation of uncertainties in femtoampere current measurement for the number concentration standard of aerosol nanoparticles

    Sakurai, Hiromu; Ehara, Kensei

    2011-01-01

    We evaluated uncertainties in current measurement by the electrometer at the current level on the order of femtoamperes. The electrometer was the one used in the Faraday-cup aerosol electrometer of the Japanese national standard for number concentration of aerosol nanoparticles in which the accuracy of the absolute current is not required, but the net current which is obtained as the difference in currents under two different conditions must be measured accurately. The evaluation was done experimentally at the current level of 20 fA, which was much smaller than the intervals between the electrometer's calibration points at +1, +0.5, −0.5 and −1 pA. The slope of the response curve for the relationship between the 'true' and measured current, which is crucial in the above measurement, was evaluated locally at many different points within the ±1 pA range for deviation from the slope determined by a linear regression of the calibration data. The sum of the current induced by a flow of charged particles and a bias current from a current-source instrument was measured by the electrometer while the particle current was toggled on and off. The net particle current was obtained as the difference in the measured currents between the toggling, while at the same time the current was estimated from the particle concentration read by a condensation particle counter. The local slope was calculated as the ratio of the measured to estimated currents at each bias current setting. The standard deviation of the local slope values observed at varied bias currents was about 0.003, which was calculated by analysis of variance (ANOVA) for the treatment of the bias current. The combined standard uncertainty of the slope, which was calculated from the uncertainty of the slope by linear regression and the variability of the slope, was calculated to be about 0.004

  4. Effects of Uncertainties in Hydrological Modelling. A Case Study of a Mountainous Catchment in Southern Norway

    Engeland, Kolbjorn; Steinsland, Ingelin

    2016-04-01

    The aim of this study is to investigate how the inclusion of uncertainties in inputs and observed streamflow influences parameter estimation, streamflow predictions and model evaluation. In particular, we wanted to answer the following research questions:
    • What is the effect of including a random error in the precipitation and temperature inputs?
    • What is the effect of decreased information about precipitation, by excluding the nearest precipitation station?
    • What is the effect of the uncertainty in streamflow observations?
    • What is the effect of reduced information about the true streamflow, by using a rating curve estimated without the highest and lowest streamflow measurements?
    To answer these questions, we designed a set of calibration experiments and evaluation strategies. We used the elevation-distributed HBV model operating on daily time steps, combined with a Bayesian formulation and the MCMC routine DREAM for parameter inference. The uncertainties in inputs were represented by creating ensembles of precipitation and temperature. The precipitation ensembles were created using a meta-Gaussian random field approach. The temperature ensembles were created using 3D Bayesian kriging with random sampling of the temperature lapse rate. The streamflow ensembles were generated by a Bayesian multi-segment rating curve model. Precipitation and temperatures were randomly sampled for every day, whereas the streamflow ensembles were generated from rating curve ensembles, and the same rating curve was always used for the whole time series in a calibration or evaluation run. We chose a catchment with a meteorological station measuring precipitation and temperature, and a rating curve of relatively high quality. This allowed us to investigate and further test the effect of having less information on precipitation and streamflow during model calibration, prediction and evaluation. The results showed that including uncertainty

  5. Uncertainty Analysis and Overtopping Risk Evaluation of Maroon Dam with Monte Carlo and Latin Hypercube Methods

    J. M. Vali Samani

    2016-02-01

    Introduction: The greatest part of constructed dams belongs to embankment dams, and there are many examples of their failures throughout history. About one-third of the world's dam failures have been caused by flood overtopping, which indicates that flood overtopping is an important factor affecting reservoir projects' safety. Moreover, because of a poor understanding of the randomness of floods, reservoir water levels during flood seasons are often lowered artificially in order to avoid overtopping and protect the lives and property of downstream residents. Estimation of dam overtopping risk with regard to uncertainties is therefore essential for assessing the dam's safety. This study presents the procedure for risk evaluation of dam overtopping due to various uncertainties in inflows and the reservoir initial condition. Materials and Methods: This study aims to present a practical approach and to compare different uncertainty analysis methods in the evaluation of dam overtopping risk due to flood. For this purpose, Monte Carlo simulation and Latin hypercube sampling methods were used to calculate the overtopping risk, evaluate the uncertainty, and calculate the highest water level during different flood events. To assess these methods from a practical point of view, the Maroon dam was chosen as the case study. Figure 1 (diagram of dam overtopping risk evaluation) outlines the work procedure, which includes three parts: (1) identification and evaluation of factors affecting flood routing and dam overtopping; (2) data collection and analysis for reservoir routing and uncertainty analysis; (3) uncertainty and risk analysis. Results and Discussion: Figure 2 shows the computed overtopping risks for the Maroon Dam without considering the wind effect, for an initial water level of 504 m as an example. As shown in Figure 2, the trends of the risk curves computed by the different uncertainty analysis methods are similar.
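
    The contrast between the two sampling strategies can be sketched as follows; the "routed level" function, the input distributions and the crest elevation are toy assumptions standing in for the actual reservoir routing model:

```python
# Minimal sketch contrasting Monte Carlo and Latin hypercube sampling for an
# overtopping-risk style estimate; the surrogate model and values are toys.
import numpy as np
from scipy.stats import qmc, norm

rng = np.random.default_rng(2)
n = 2000

def max_level(inflow_peak, initial_level):
    """Toy surrogate for the routed peak reservoir level (m)."""
    return initial_level + 0.004 * inflow_peak

crest = 510.0  # m, overtopping threshold

# Monte Carlo: independent random draws of the two uncertain inputs
q_mc = norm.rvs(loc=3000, scale=600, size=n, random_state=rng)  # inflow peak
h_mc = norm.rvs(loc=497, scale=2.0, size=n, random_state=rng)   # initial level
risk_mc = np.mean(max_level(q_mc, h_mc) > crest)

# Latin hypercube: stratified uniforms mapped through the same marginals
u = qmc.LatinHypercube(d=2, seed=2).random(n)
q_lhs = norm.ppf(u[:, 0], loc=3000, scale=600)
h_lhs = norm.ppf(u[:, 1], loc=497, scale=2.0)
risk_lhs = np.mean(max_level(q_lhs, h_lhs) > crest)

print(f"overtopping risk: MC={risk_mc:.4f}, LHS={risk_lhs:.4f}")
```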

  6. Evaluation of machine learning algorithms for prediction of regions of high Reynolds averaged Navier Stokes uncertainty

    Ling, J.; Templeton, J.

    2015-08-01

    Reynolds Averaged Navier Stokes (RANS) models are widely used in industry to predict fluid flows, despite their acknowledged deficiencies. Not only do RANS models often produce inaccurate flow predictions, but there are very limited diagnostics available to assess RANS accuracy for a given flow configuration. If experimental or higher fidelity simulation results are not available for RANS validation, there is no reliable method to evaluate RANS accuracy. This paper explores the potential of utilizing machine learning algorithms to identify regions of high RANS uncertainty. Three different machine learning algorithms were evaluated: support vector machines, Adaboost decision trees, and random forests. The algorithms were trained on a database of canonical flow configurations for which validated direct numerical simulation or large eddy simulation results were available, and were used to classify RANS results on a point-by-point basis as having either high or low uncertainty, based on the breakdown of specific RANS modeling assumptions. Classifiers were developed for three different basic RANS eddy viscosity model assumptions: the isotropy of the eddy viscosity, the linearity of the Boussinesq hypothesis, and the non-negativity of the eddy viscosity. It is shown that these classifiers are able to generalize to flows substantially different from those on which they were trained. Feature selection techniques, model evaluation, and extrapolation detection are discussed in the context of turbulence modeling applications.
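
    One of the three evaluated algorithm families (random forests) can be sketched with scikit-learn; the features and the labelling rule below are synthetic placeholders, not the DNS/LES-derived training database used in the paper:

```python
# Minimal sketch of a random-forest classifier flagging points where a RANS
# modelling assumption breaks down; features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Synthetic point-wise flow features (e.g. nondimensional strain, rotation,
# pressure-gradient markers) and a binary "assumption violated?" label
# derived here from a made-up rule standing in for DNS/LES ground truth.
X = rng.normal(size=(5000, 4))
y = (X[:, 0] * X[:, 1] + 0.5 * X[:, 2] > 0.3).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Point-by-point classification of new points as high/low RANS uncertainty
print("held-out accuracy:", clf.score(X_te, y_te))
print("P(high uncertainty), first 5 points:", clf.predict_proba(X_te[:5])[:, 1])
```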

  7. The influence of uncertainties of measurements in laboratory performance evaluation by intercomparison program in radionuclide analyses of environmental samples

    Tauhata, L.; Vianna, M.E.; Oliveira, A.E. de; Clain, A.F.; Ferreira, A.C.M.; Bernardes, E.M.

    2000-01-01

    The accuracy and precision of the results of radionuclide analyses in environmental samples are widely demanded internationally because of their consequences in the decision processes coupled to the evaluation of environmental pollution, impact, and internal and external population exposure. These measurement characteristics of the laboratories can be shown clearly using intercomparison data, due to the existence of a reference value and the need for three determinations per analysis. In intercomparison studies of low-level environmental samples, accuracy in radionuclide assays has usually been the main focus of performance evaluation; it can be estimated by taking into account the deviation between the experimental laboratory mean value and the reference value. The laboratory repeatability of measurements, or their standard deviation, is seldom included in performance evaluation. In order to show the influence of the uncertainties on the performance evaluation of the laboratories, data from 22 intercomparison runs, which distributed 790 spiked environmental samples to 20 Brazilian participant laboratories, were compared, using the 'Normalised Standard Deviation' of the U.S. EPA as the statistical criterion for performance evaluation. This criterion mainly takes into account the laboratory accuracy; the same data were then re-classified using the normalised standard deviation modified by a weight factor that includes the individual laboratory uncertainty. The results show a relative decrease in laboratory performance in each radionuclide assay: 1.8% for 65Zn, 2.8% for 40K, 3.4% for 60Co, 3.7% for 134Cs, 4.0% for 137Cs, 4.4% for Th and U-nat, 4.5% for 3H, 6.3% for 133Ba, 8.6% for 90Sr, 10.6% for gross alpha, 10.9% for 106Ru, 11.1% for 226Ra, 11.5% for gross beta and 13.6% for 228Ra. The changes in the parameters of the statistical distribution function were negligible and the distribution remained Gaussian for all radionuclides analysed. Data analyses in terms of

  8. Evaluation of photonuclear reaction cross-sections using the reduction method for large systematic uncertainties

    Varlamov, V.V.; Efimkin, N.G.; Ishkhanov, B.S.; Sapunenko, V.V.

    1994-12-01

    The authors describe a method, based on the reduction method, for the evaluation of photonuclear reaction cross-sections obtained under conditions where there are large systematic uncertainties (different instrumental functions, calibration and normalization errors). The evaluation method involves using the actual instrumental function (photon spectrum) of each individual experiment to reduce the data to a representation generated by an instrumental function of better quality. The objective is to find the most reasonably achievable monoenergetic representation of the information on the reaction cross-section derived from the results of various experiments, and to take into account the calibration and normalization errors in these experiments. The method was used to obtain the evaluated total photoneutron reaction cross-section (γ,xn) for a large number of nuclei. Data obtained for 16O and 208Pb are presented. (author). 36 refs, 6 figs, 4 tabs

  9. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada)

    Magali Troin

    2015-11-01

    An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SMs; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HMs) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and the reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluation of the uncertainties associated with the structure of the SM-HM ensemble, to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating the daily discharges and SWEs, again with low model biases for streamflow. The spring snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results for specific hydrologic indicators show that the uncertainty related to the choice of the HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a low degree of uncertainty to the total uncertainty in hydrological modeling for snow hydrology studies.

  10. Uncertainties and novel prospects in the study of the soil carbon dynamics

    Yang Wang; Yuch-Ping Hsieh

    2002-01-01

    Establishment of the Kyoto Protocol has resulted in an effort to look towards living biomass and soils for carbon sequestration. In order for carbon credits to be meaningful, sustained carbon sequestration for decades or longer is required. It has been speculated that improved land management could result in sequestration of a substantial amount of carbon in soils within several decades, and could therefore be an important option in reducing atmospheric CO2 concentration. However, evaluation of soil carbon sources and sinks is difficult because the dynamics of soil carbon storage and release are complex and still not well understood. There has been rapid development of quantitative techniques over the past two decades for measuring the component fluxes of the global carbon cycle and for studying the soil carbon cycle. The most significant development in the study of the soil carbon cycle is the application of accelerator mass spectrometry (AMS) to radiocarbon measurements. This has made it possible to unravel rates of carbon cycling in soils by studying natural levels of radiocarbon in soil organic matter and soil CO2. Despite the advances in the study of the soil carbon cycle in recent decades, tremendous uncertainties exist in the sizes and turnover times of soil carbon pools. The uncertainties result from a lack of standard methods and an incomplete understanding of soil organic carbon dynamics, compounded by natural variability in soil carbon and carbon isotopic content even within the same ecosystem. Many fundamental questions concerning the dynamics of the soil carbon cycle have yet to be answered. This paper reviews and synthesizes the isotopic approaches to the study of the soil carbon cycle. We focus on the uncertainties and limitations associated with these approaches and point out areas where more research is needed to improve our understanding of this important component of the global carbon cycle. (author)

  11. Evaluation of the theoretical uncertainties in the W → lν cross sections at the LHC

    Adam, Nadia E.; Halyo, Valerie; Zhu Wenhan; Yost, Scott A.

    2008-01-01

    We study the sources of systematic errors in the measurement of the W → lν cross-sections at the LHC. We consider the systematic errors in both the total cross-section and acceptance for anticipated experimental cuts. We include the best available analysis of QCD effects at NNLO in assessing the effect of higher order corrections and PDF and scale uncertainties on the theoretical acceptance. In addition, we evaluate the error due to missing NLO electroweak corrections and propose which MC generators and computational schemes should be implemented to best simulate the events.

  12. Coupling Uncertainties with Accuracy Assessment in Object-Based Slum Detections, Case Study: Jakarta, Indonesia

    Pratomo, J.; Kuffer, M.; Martinez, Javier; Kohli, D.

    2017-01-01

    Object-Based Image Analysis (OBIA) has been successfully used to map slums. In general, the occurrence of uncertainties in producing geographic data is inevitable. However, most studies concentrate solely on assessing classification accuracy, neglecting the inherent uncertainties. Our

  13. A Comparative Study of Uncertainty Reduction Theory in High- and Low-Context Cultures.

    Kim, Myoung-Hye; Yoon, Tae-Jin

    To test the cross-cultural validity of uncertainty reduction theory, a study was conducted using students from South Korea and the United States who were chosen to represent high- and low-context cultures respectively. Uncertainty reduction theory is based upon the assumption that the primary concern of strangers upon meeting is one of uncertainty…

  14. An Evaluation of Test and Physical Uncertainty of Measuring Vibration in Wooden Junctions

    Dickow, Kristoffer Ahrens; Kirkegaard, Poul Henning; Andersen, Lars Vabbersgaard

    2012-01-01

    In the present paper a study of test and material uncertainty in modal analysis of certain wooden junctions is presented. The main structure considered here is a T-junction made from a particleboard plate connected to a spruce beam of rectangular cross section. The size of the plate is 1.2 m by 0.......6 m. The T-junctions represent cut-outs of actual full size floor assemblies. The aim of the experiments is to investigate the underlying uncertainties of both the test method as well as variation in material and craftmanship. For this purpose, ten nominally identical junctions are tested and compared...... to each other in terms of modal parameters such as natural frequencies, modeshapes and damping. Considerations regarding the measurement procedure and test setup are discussed. The results indicate a large variation of the response at modes where the coupling of torsion in the beam to bending of the plate...

  15. Measurement uncertainty evaluation of cellular spheroids surface tension in compressing tests using Young-Laplace equation

    Beatrici, Anderson; Santos Baptista, Leandra; Mauro Granjeiro, José

    2018-03-01

    Regenerative medicine combines biotechnology, tissue engineering and biometrology for stem cell therapy. Starting from stem cells extracted from the patient (an autologous implant), the cells are cultured and differentiated into other tissues, for example articular cartilage. These cells are reorganized into microspheres (cell spheroids). Such tissue units are recombined into functional tissue constructs that can be implanted in the injured region for regeneration. Biomechanical characterization of these constructs is necessary to determine whether their properties are similar to those of native tissue. In this study, the calculation of the uncertainty of the surface tension of cellular spheroids was modeled using the Young-Laplace equation. We obtained relative uncertainties of about 10%.
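
    For a spherical spheroid the Young-Laplace relation reduces to γ = ΔP·R/2, so the GUM combined relative uncertainty follows from the relative uncertainties of the pressure difference and the radius. A minimal sketch, with illustrative inputs chosen so the result lands near the roughly 10% reported above:

```python
# Minimal sketch of GUM-style propagation through the Young-Laplace relation
# gamma = dP * R / 2 for a sphere; inputs are illustrative, not study data.
import numpy as np

dP, u_dP = 120.0, 8.0     # Pa, pressure difference across the surface
R,  u_R  = 150e-6, 10e-6  # m, spheroid radius from image analysis

gamma = dP * R / 2.0

# First-order (GUM) combined relative uncertainty for a pure product model
rel_u = np.sqrt((u_dP / dP) ** 2 + (u_R / R) ** 2)
print(f"gamma = {gamma*1e3:.3f} mN/m, relative u_c = {rel_u:.1%}")
```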

  16. Contribution of the mathematical modelling of knowledge to the evaluation of uncertainty margins of a LBLOCA transient (LOFT-L2-5)

    Baccou, J.; Chojnacki, E.

    2007-01-01

    This work is devoted to some recent developments in the uncertainty analysis of the computer code responses used for accident management procedures in the nuclear industry. The classical probabilistic approach to evaluating uncertainties is recalled. In this case, the statistical treatment of the code responses is based on the use of order statistics, which provides direct estimates of the relevant statistical measures for safety studies. However, the lack of knowledge about uncertainty sources can deteriorate the decision-making. To respect the real state of knowledge, a second model, based on the Dempster-Shafer theory, is introduced. It allows the probabilistic approach to be combined with possibility theory, which is more appropriate when little information is available. An application of both methodologies to the uncertainty analysis of a LBLOCA transient (LOFT-L2-5) is given
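
    The order-statistics treatment referred to above is usually traced to Wilks' formula; a minimal sketch computing the classic one-sided 95%/95% sample size (the function is a generic illustration, not the paper's code):

```python
# Minimal sketch of the first-order, one-sided Wilks sample size: the
# smallest n such that the maximum of n code runs bounds the beta-quantile
# of the output with confidence gamma.
def wilks_n(beta: float = 0.95, gamma: float = 0.95) -> int:
    """Smallest n with 1 - beta**n >= gamma (one-sided, first order)."""
    n = 1
    while 1.0 - beta ** n < gamma:
        n += 1
    return n

print(wilks_n())  # 59: the classic 95%/95% one-sided sample size
```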

  17. Local scale multiple quantitative risk assessment and uncertainty evaluation in a densely urbanised area (Brescia, Italy)

    S. Lari

    2012-11-01

    The study of the interactions between natural and anthropogenic risks is necessary for quantitative risk assessment in areas affected by active natural processes, high population density and strong economic activities.

    We present a multiple quantitative risk assessment for a 420 km² high-risk area (Brescia and surroundings, Lombardy, Northern Italy) covering flood, seismic and industrial accident scenarios. Expected annual economic losses are quantified for each scenario and annual exceedance probability-loss curves are calculated. Uncertainty on the input variables is propagated by means of three different methodologies: Monte Carlo simulation, First Order Second Moment, and point estimate.

    Expected losses calculated by means of the three approaches show similar values for the whole study area: about 64 000 000 € for earthquakes, about 10 000 000 € for floods, and about 3000 € for industrial accidents. Locally, expected losses assume quite different values if calculated with the three different approaches, with differences of up to 19%.

    The uncertainties on the expected losses and their propagation, performed with the three methods, are compared and discussed in the paper. In some cases, the uncertainty reaches significant values (up to almost 50% of the expected loss). This underlines the necessity of including uncertainty in quantitative risk assessment, especially when it is used as a support for territorial planning and decision making. The method is developed with a view to possible application at a regional-national scale, on the basis of data available in Italy over the national territory.

  18. The Findings from the OECD/NEA/CSNI UMS (Uncertainty Method Study)

    D'Auria, F.; Glaeser, H.

    2013-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis, to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from the CSNI (Committee on the Safety of Nuclear Installations) of the OECD/NEA (Organisation for Economic Co-operation and Development / Nuclear Energy Agency), has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limits of the uncertainty ranges. A 'bifurcation' analysis was also performed by the same research group, providing another way of interpreting the high temperature peak calculated by two of the participants. (authors)

  19. Application of Sensitivity and Uncertainty Analysis Methods to a Validation Study for Weapons-Grade Mixed-Oxide Fuel

    Dunn, M.E.

    2001-01-01

    At the Oak Ridge National Laboratory (ORNL), sensitivity and uncertainty (S/U) analysis methods and a Generalized Linear Least-Squares Methodology (GLLSM) have been developed to quantitatively determine the similarity, or lack thereof, between critical benchmark experiments and an application of interest. The S/U and GLLSM methods provide a mathematical approach, less judgment-based than traditional validation procedures, to assess system similarity and estimate the calculational bias and uncertainty for an application of interest. The objective of this paper is to gain experience with the S/U and GLLSM methods by revisiting a criticality safety evaluation and the associated traditional validation for the shipment of weapons-grade (WG) MOX fuel in the MO-1 transportation package. In the original validation, critical experiments were selected based on a qualitative assessment of the MO-1 and MOX contents relative to the available experiments. Subsequently, traditional trending analyses were used to estimate the Δk bias and associated uncertainty. In this paper, the S/U and GLLSM procedures are used to re-evaluate the suite of critical experiments associated with the original MO-1 evaluation. Using the S/U procedures developed at ORNL, critical experiments that are similar to the undamaged and damaged MO-1 package are identified based on sensitivity and uncertainty analyses of the critical experiments and the MO-1 package configurations. Based on the trending analyses developed for the S/U and GLLSM procedures, the Δk bias and uncertainty for the most reactive MO-1 package configurations are estimated and used to calculate an upper subcritical limit (USL) for the MO-1 evaluation. The calculated bias and uncertainty from the S/U and GLLSM analyses lead to a calculational USL that supports the original validation study for the MO-1.

  20. Integration of renewable generation uncertainties into stochastic unit commitment considering reserve and risk: A comparative study

    Quan, Hao; Srinivasan, Dipti; Khosravi, Abbas

    2016-01-01

    The uncertainties of renewable energy have brought great challenges to power system unit commitment, dispatch and reserve requirements. This paper presents a comparative study on the integration of renewable generation uncertainties into SCUC (stochastic security-constrained unit commitment) considering reserve and risk. Renewable forecast uncertainties are captured by a list of PIs (prediction intervals). A new scenario generation method is proposed to generate scenarios from these PIs. Different system uncertainties are considered as scenarios in the stochastic SCUC problem formulation. Two comparative simulations with a single source of uncertainty (E1: wind only) and multiple sources of uncertainty (E2: load, wind, solar and generation outages) are investigated. Five deterministic and four stochastic case studies are performed. Different generation costs, reserve strategies and associated risks are compared under various scenarios. The results indicate that the overall costs of E2 are lower than those of E1 due to the penetration of solar power, while the associated risk in the deterministic cases of E2 is higher than in E1, implying a superimposed effect when multiple uncertainties are integrated. The results also demonstrate that power systems run a higher level of risk during peak load hours, and that stochastic models are more robust than deterministic ones. - Highlights: • An extensive comparative study for renewable integration is presented. • A novel scenario generation method is proposed. • Wind and solar uncertainties are represented by a list of prediction intervals. • Unit commitment and dispatch costs are discussed considering reserve and risk.
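
    The paper's own scenario generation method is not reproduced here, but a plausible minimal sketch of the underlying idea is to read the nested prediction intervals as quantiles of the forecast distribution and sample scenarios by inverse-transform interpolation. The interval levels and bounds below are invented.

      import numpy as np

      # Hypothetical sketch: turn nested central prediction intervals (PIs) for
      # one lead time into scenarios via the implied forecast CDF. A PI at
      # level c spans the quantiles (1-c)/2 and (1+c)/2.
      pis = {0.50: (180.0, 220.0),   # assumed wind-power PIs for one hour, MW
             0.80: (160.0, 245.0),
             0.95: (140.0, 275.0)}

      probs, values = [], []
      for c, (lo, hi) in sorted(pis.items()):
          probs += [(1 - c) / 2, (1 + c) / 2]
          values += [lo, hi]
      order = np.argsort(probs)
      probs, values = np.array(probs)[order], np.array(values)[order]

      rng = np.random.default_rng(42)
      u = rng.uniform(probs[0], probs[-1], size=1000)   # stay inside known quantiles
      scenarios = np.interp(u, probs, values)           # inverse-CDF sampling
      print(scenarios[:5])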

  1. Uncertainty evaluation by gamma transmission measurements and CFD model comparison in a FCC cold pilot unit

    Dantas C.C.

    2013-01-01

    The solid flow of an air-catalyst mixture in a circulating fluidized bed was simulated with a CFD model to obtain axial and radial distributions. Project parameters were thereby confirmed and the steady-state operating condition was improved. Simulated axial and radial solid hold-up profiles are in good agreement with gamma transmission measurements. The transmission signal from an 241Am radioactive source was evaluated with a NaI(Tl) detector coupled to a multichannel analyzer. This non-intrusive measuring set-up is installed at the riser of a cold pilot unit to determine parameters of FCC catalyst flow at several concentrations. The mass flow rate, calculated by combining solid hold-up and solid-phase velocity measurements, was compared with the catalyst inlet flow measured at the down-comer. Evaluation of each measured parameter shows a relative combined uncertainty of 6% at the 95% confidence level. The uncertainty analysis took into account a significant correlation in the riser-scan transmission measurements. An Eulerian CFD model incorporating the kinetic theory of granular flow was adopted to describe the gas-solid two-phase flow in a multizone circulating reactor. Instantaneous and local gas-particle velocity, void fraction and turbulence parameters were obtained, and results are shown in 2D and 3D graphics.
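
    The 6% relative combined uncertainty quoted above follows from GUM-style propagation; when input quantities such as the repeated transmission measurements are correlated, the covariance term must be retained. In generic GUM notation (not the authors' specific budget):

      u_c^2(y) = \sum_i \left( \frac{\partial f}{\partial x_i} \right)^{2} u^2(x_i)
                 + 2 \sum_{i<j} \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j}\, u(x_i)\, u(x_j)\, r(x_i, x_j)

    with r(x_i, x_j) the correlation coefficient between inputs and U = k u_c(y) the expanded uncertainty, k = 2 for approximately 95% coverage.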

  2. Ecosystem Services Mapping Uncertainty Assessment: A Case Study in the Fitzroy Basin Mining Region

    Zhenyu Wang

    2018-01-01

    Ecosystem services mapping is becoming increasingly popular through the use of various readily available mapping tools; however, uncertainties in assessment outputs are commonly ignored. Uncertainties from different sources have the potential to lower the accuracy of mapping outputs and reduce their reliability for decision-making. Using a case study in an Australian mining region, this paper assessed the impact of uncertainties on the modelling of the hydrological ecosystem service, water provision. Three types of uncertainty were modelled using multiple uncertainty scenarios: (1) spatial data sources; (2) modelling scales (temporal and spatial); and (3) parameterization and model selection. We found that the mapping scales can induce significant changes to the spatial pattern of outputs and to annual totals of water provision. In addition, differences in parameterization using differing sources from the literature also led to obvious differences in base flow. However, the impact of the uncertainty associated with differences in spatial data sources was not as great. The results of this study demonstrate the importance of uncertainty assessment and highlight that any conclusions drawn from ecosystem services mapping, such as the impacts of mining, are likely to also be a property of the uncertainty in ecosystem services mapping methods.

  3. Theoretical evaluation of measurement uncertainties of two-color pyrometry applied to optical diagnostics

    Fu Tairan; Cheng Xiaofang; Yang Zangjian

    2008-01-01

    We present a theoretical analysis of two-color pyrometry applied to optical diagnostics. A two-color pyrometer built with a single CCD is advantageous due to the simple system design. We classify the possible measurement approaches of this two-color system and evaluate the possibility and degree of ill-conditioning of each on the basis of measurement uncertainties. A corresponding ill-conditioning criterion is established: the greater the criterion value, the more ill-conditioned the solution. The optimum measurement approach for the two-color system is therefore obtained by intercomparison of the criterion values. Numerical examples are given to illustrate this point. The theoretical analysis not only provides an effective way of evaluating different measurement approaches, but also may help us to better understand the influences that determine the choices between wavelength/waveband measurements and calibration/noncalibration modes for temperature and soot distribution
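
    For orientation, the standard ratio relation underlying two-color pyrometry, in the Wien approximation and under a graybody assumption (equal emissivities at the two wavelengths), reads

      \frac{I_1}{I_2} = \left(\frac{\lambda_2}{\lambda_1}\right)^{5}
      \exp\left[-\frac{C_2}{T}\left(\frac{1}{\lambda_1}-\frac{1}{\lambda_2}\right)\right]
      \quad\Longrightarrow\quad
      T = \frac{C_2\left(1/\lambda_1 - 1/\lambda_2\right)}{5\ln(\lambda_2/\lambda_1) - \ln(I_1/I_2)}

    where C_2 is the second radiation constant. As λ_1 approaches λ_2 the denominator becomes a small difference of noisy quantities, which is precisely the kind of ill-conditioning the authors' criterion is designed to quantify. The exact formulation used in the paper may differ.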

  4. Evaluation of SR 97 regarding treatment of uncertainties in chemical systems

    Ekberg, C.

    2000-01-01

    The aim of this review is to evaluate the SKB safety report SR 97 with respect to the handling of uncertainties related to chemical modelling, together with a glance at the handling of the general chemistry. The fact that it is impossible to show that all variables, processes and connections in a safety assessment have been taken into account is elementary and does not need to be further addressed. However, it must be up to SKB to prove that their decisions and judgements are within reason. Conceptual uncertainties are discussed in a satisfactory, but somewhat too brief, way in the main report. Unfortunately, there is no description of what SKB is planning to do about these uncertainties. It is not enough only to discuss the reliability of the models. One should also discuss the comparison between the different conceptual models that are available for the specific problem. In addition, one should also try to invalidate the models, i.e. show where they are definitely wrong. All models have a validity range and it is important to identify and describe this range. The simplest method is probably to solve the problem with different models, then discuss the differences and finally draw conclusions based on those discussions. The discussion of conditioned and unconditioned probabilities is reasonable. My only problem is how to put a figure on 'consequence'. This has not been described in SR 97 and thus I assume that the figures are based on subjective judgements. The use of probabilistic methods in SR 97 is exemplified by the determination of the solubilities of the different radionuclides. There it is stated that 'It has been extremely difficult to give a reliable distribution of possible values for the solubilities'. Although this is completely true, the preferred method in such a case should be to use a uniform distribution over a wide interval and thus increase the probability that 'points' in the tails of the distributions are included, see the discussion

  5. A study on the propagation of measurement uncertainties into the result on a turbine performance test

    Cho, Soo Yong; Park, Chan Woo

    2004-01-01

    Uncertainties in the individual measured variables influence the uncertainty of the experimental result through a data reduction equation. In this study, a performance test of a single-stage axial turbine is conducted, and total-to-total efficiencies are measured at various off-design points in the low-pressure, cold state. Based on the experimental apparatus, a data reduction equation for turbine efficiency is formulated and six measured variables are selected. Codes are written to calculate the efficiency, the uncertainty of the efficiency, and the contribution of each measured quantity to that uncertainty. The influence of each measured variable on the experimental result is thereby quantified. Results show that the largest Uncertainty Magnification Factor (UMF) value is obtained for the inlet total pressure among the six measured variables, and its value is always greater than one. The UMF values of the inlet total temperature, the torque, and the RPM are always one. The Uncertainty Percentage Contribution (UPC) of the RPM shows the lowest influence on the uncertainty of the turbine efficiency, but the UPC of the torque has the largest influence on the result among the measured variables. These results can be applied to find the correct direction for meeting an uncertainty requirement on the experimental result in the planning or development phase of an experiment, and also offer ideas for preparing a measurement system in the planning phase
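
    The UMF and UPC referred to above are commonly defined (in the style of Coleman and Steele) as

      \mathrm{UMF}_i = \left|\frac{X_i}{r}\,\frac{\partial r}{\partial X_i}\right|,
      \qquad
      \mathrm{UPC}_i = \frac{\left(\partial r/\partial X_i\right)^2 u^2(X_i)}{u_c^2(r)} \times 100\%

    where r is the result of the data reduction equation, X_i are the measured variables, and u_c(r) is the combined uncertainty of the result; UMF_i > 1 means the relative uncertainty of X_i is magnified on its way into r.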

  6. Evaluation and uncertainty analysis of regional-scale CLM4.5 net carbon flux estimates

    Post, Hanna; Hendricks Franssen, Harrie-Jan; Han, Xujun; Baatz, Roland; Montzka, Carsten; Schmidt, Marius; Vereecken, Harry

    2018-01-01

    Modeling net ecosystem exchange (NEE) at the regional scale with land surface models (LSMs) is relevant for the estimation of regional carbon balances, but studies on it are very limited. Furthermore, it is essential to better understand and quantify the uncertainty of LSMs in order to improve them. An important key variable in this respect is the prognostic leaf area index (LAI), which is very sensitive to forcing data and strongly affects the modeled NEE. We applied the Community Land Model (CLM4.5-BGC) to the Rur catchment in western Germany and compared estimated and default ecological key parameters for modeling carbon fluxes and LAI. The parameter estimates had previously been obtained with the Markov chain Monte Carlo (MCMC) approach DREAM(zs) for four of the most widespread plant functional types in the catchment. It was found that the catchment-scale annual NEE was strongly positive with default parameter values but negative (and closer to observations) with the estimated values. Thus, the estimation of CLM parameters with local NEE observations can be highly relevant when determining regional carbon balances. To obtain a more comprehensive picture of model uncertainty, CLM ensembles were set up with perturbed meteorological input and uncertain initial states in addition to uncertain parameters. C3 grass and C3 crops were particularly sensitive to the perturbed meteorological input, which resulted in a strong increase in the standard deviation of the annual NEE sum (σ_ΣNEE) across the ensemble members, from ~2-3 g C m-2 yr-1 (with uncertain parameters only) to ~45 g C m-2 yr-1 (C3 grass) and ~75 g C m-2 yr-1 (C3 crops) with perturbed forcings. This increase in uncertainty is related to the impact of the meteorological forcings on leaf onset and senescence, and to enhanced/reduced drought stress related to the perturbation of precipitation. The NEE uncertainty for the forest plant functional type (PFT) was considerably lower (σ_ΣNEE ~ 4.0-13.5 g C

  7. The influence of uncertainties of measurements in laboratory performance evaluation using an intercomparison program of radionuclide assays in environmental samples

    Tauhata, Luiz; Elizabeth Couto Machado Vianna, Maria; Eduardo de Oliveira, Antonio; Cristina de Melo Ferreira, Ana; Julia Camara da Silva Braganca, Maura; Faria Clain, Almir

    2006-01-01

    To show the influence of measurement uncertainties on the performance evaluation of laboratories, data from 42 intercomparison runs were evaluated using two statistical criteria: the normalized standard deviation D, used by the US EPA, which mainly takes accuracy into account, and the normalized deviation E, which includes the individual laboratory uncertainty and is used for performance evaluation in the key comparisons of the BIPM. The results show that the two criteria can give significantly different assessments of laboratory performance for each radionuclide assay when a large quantity of data is analysed
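
    The two criteria are typically defined as follows (the precise EPA form may differ in detail; σ_prog denotes a target standard deviation set by the intercomparison programme):

      D = \frac{x_{\mathrm{lab}} - x_{\mathrm{ref}}}{\sigma_{\mathrm{prog}}},
      \qquad
      E = \frac{x_{\mathrm{lab}} - x_{\mathrm{ref}}}{\sqrt{U_{\mathrm{lab}}^2 + U_{\mathrm{ref}}^2}},
      \qquad |E| \le 1 \ \text{satisfactory}

    so a laboratory with a large but honestly reported uncertainty can pass the E criterion while failing D, which is one mechanism behind the performance deviations reported above.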

  8. Analysis and evaluation of regulatory uncertainties in 10 CFR 60 subparts B and E

    Weiner, R.F.; Patrick, W.C.

    1990-01-01

    This paper presents an attribute analysis scheme for prioritizing the resolution of regulatory uncertainties. Attributes are presented which assist in identifying how timely and how durable the resolution of a given uncertainty needs to be

  9. Uncertainty evaluation for three-dimensional scanning electron microscope reconstructions based on the stereo-pair technique

    Carli, L; Cantatore, A; De Chiffre, L; Genta, G; Barbato, G; Levi, R

    2011-01-01

    3D-SEM is a method, based on the stereophotogrammetry technique, which obtains three-dimensional topographic reconstructions starting typically from two SEM images, called the stereo-pair. In this work, a theoretical uncertainty evaluation of the stereo-pair technique, according to the GUM (Guide to the Expression of Uncertainty in Measurement), was carried out, considering 3D-SEM reconstructions of a wire gauge with a reference diameter of 250 µm. In addition to the more commonly used tilting strategy, a strategy based on rotating the item inside the SEM chamber was also adopted; the latter enables multiple-view reconstructions of the cylindrical item under consideration. The uncertainty evaluation started from a modified version of the Piazzesi equation, enabling calculation of the z-coordinate from a given stereo-pair. The metrological characteristics of each input variable were taken into account and a SEM stage calibration was performed. Uncertainty tables for the cases of tilt and rotation were then produced, leading to the calculation of the expanded uncertainty. For the case of rotation, the largest uncertainty contribution was found to be the rotational angle; for the case of tilt, it was the pixel size. A relative expanded uncertainty equal to 5% and 4% was obtained for the cases of rotation and tilt, respectively

  10. Uncertainty Analysis for the Evaluation of a Passive Runway Arresting System

    Deloach, Richard; Marlowe, Jill M.; Yager, Thomas J.

    2009-01-01

    This paper considers the stopping distance of an aircraft involved in a runway overrun incident when the runway has been provided with an extension comprised of a material engineered to induce high levels of rolling friction and drag. A formula for stopping distance is derived that is shown to be the product of a known formula for the case of friction without drag and a dimensionless constant between 0 and 1 that quantifies the further reduction in stopping distance when drag is introduced. This additional quantity, identified as the Drag Reduction Factor, D, is shown to depend on the ratio of drag force to friction force experienced by the aircraft as it enters the overrun area. The specific functional form of D is shown to depend on how drag varies with speed. A detailed uncertainty analysis is presented which reveals how the uncertainty in estimates of stopping distance is influenced by experimental error in the force measurements that are acquired in a typical evaluation experiment conducted to assess candidate overrun materials.
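
    A short derivation consistent with the abstract, assuming a constant rolling-friction force F_f and a drag force kv^2, illustrates the structure of the result (the paper also treats other functional forms of drag):

      m v \frac{dv}{ds} = -\left(F_f + k v^2\right)
      \quad\Longrightarrow\quad
      s = \frac{m}{2k}\ln(1+R) = \frac{m v_0^2}{2 F_f} \cdot \frac{\ln(1+R)}{R},
      \qquad R = \frac{k v_0^2}{F_f}

    so the friction-only stopping distance m v_0^2 / (2 F_f) is multiplied by D = ln(1+R)/R, which lies between 0 and 1 and depends only on R, the drag-to-friction force ratio at the entry speed v_0.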

  11. Transforming Medical Assessment: Integrating Uncertainty Into the Evaluation of Clinical Reasoning in Medical Education.

    Cooke, Suzette; Lemay, Jean-Francois

    2017-06-01

    In an age where practicing physicians have access to an overwhelming volume of clinical information and are faced with increasingly complex medical decisions, the ability to execute sound clinical reasoning is essential to optimal patient care. The authors propose two concepts that are philosophically paramount to the future assessment of clinical reasoning in medicine: assessment in the context of "uncertainty" (when, despite all of the information that is available, there is still significant doubt as to the best diagnosis, investigation, or treatment), and acknowledging that it is entirely possible (and reasonable) to have more than "one correct answer." The purpose of this article is to highlight key elements related to these two core concepts and discuss genuine barriers that currently exist on the pathway to creating such assessments. These include acknowledging situations of uncertainty, creating clear frameworks that define progressive levels of clinical reasoning skills, providing validity evidence to increase the defensibility of such assessments, considering the comparative feasibility with other forms of assessment, and developing strategies to evaluate the impact of these assessment methods on future learning and practice. The authors recommend that concerted efforts be directed toward these key areas to help advance the field of clinical reasoning assessment, improve the clinical care decisions made by current and future physicians, and have positive outcomes for patients. It is anticipated that these and subsequent efforts will aid in reaching the goal of making future assessment in medical education more representative of current-day clinical reasoning and decision making.

  12. Incorporating reliability evaluation into the uncertainty analysis of electricity market price

    Kang, Chongqing; Bai, Lichao; Xia, Qing; Jiang, Jianjian; Zhao, Jing

    2005-01-01

    A novel model and algorithm for analyzing the uncertainties in an electricity market is proposed in this paper. In this model, the bidding decision is formulated as a probabilistic model that takes into account the decision-maker's willingness to bid, risk preferences, fuel-price fluctuations, etc. At the same time, each generating unit's uncertain output is modelled through its forced outage rate (FOR). Based on the model, the uncertainty of the market price is then analyzed. From the analytical results, not only can the reliability of the power system be analyzed conventionally, but the possible distribution of market prices can also be easily obtained. The probability distribution of market prices can further be used to calculate the expected output and sales income of each generating unit in the market, and on this basis the risk borne by generating units can be evaluated. A simple system with four generating units is used to illustrate the proposed algorithm. The proposed algorithm and modeling technique are expected to be helpful to market participants in making their economic decisions
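
    As a hypothetical illustration of the mechanism (not the authors' algorithm), the sketch below samples unit availability from forced outage rates and bid uncertainty, clears a fixed demand by merit order, and collects both a reliability measure and the distribution of the marginal price. All figures are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      cap = np.array([300.0, 250.0, 200.0, 150.0])    # unit capacities, MW
      bid_mean = np.array([20.0, 28.0, 35.0, 50.0])   # mean bids, $/MWh
      FOR = np.array([0.05, 0.08, 0.10, 0.12])        # forced outage rates
      demand = 600.0                                  # MW

      prices = []
      for _ in range(10000):
          avail = rng.random(4) > FOR                   # unit on/off this draw
          bids = rng.normal(bid_mean, 0.1 * bid_mean)   # bid uncertainty
          served, price = 0.0, np.nan                   # nan marks unserved demand
          for i in np.argsort(bids):                    # merit-order dispatch
              if not avail[i]:
                  continue
              served += cap[i]
              if served >= demand:
                  price = bids[i]                       # marginal unit sets the price
                  break
          prices.append(price)

      prices = np.array(prices)
      ok = ~np.isnan(prices)
      print(f"P(load not served) = {1 - ok.mean():.3f}")    # reliability view
      print(f"mean price = {prices[ok].mean():.1f} $/MWh")  # price distribution view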

  13. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code and may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology, and the application process and main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM

  14. Measurement uncertainty and gauge capability of surface roughness measurements in the automotive industry: a case study

    Drégelyi-Kiss, Ágota; Czifra, Árpád

    2014-01-01

    The methods for calculating the capability of measurement processes in the automotive industry differ from each other. There are three main calculation methods: MSA, VDA 5 and the international standard ISO 22514-7. The aim of this research was to compare these capability calculation methods in a case study. Two types of automotive parts (ten pieces of each) were chosen to examine the behaviour of the manufacturing process and to measure the required characteristics of the measurement process being evaluated. The measurement uncertainty of the measuring process is calculated according to VDA 5, ISO 22514-7 and the MSA guidelines. In this study the conformance of a measurement process in an automotive manufacturing process is determined, and the similarities and differences between the methods used are shown. (paper)

  15. On the evaluation of a fuel assembly design by means of uncertainty and sensitivity measures

    Jaeger, Wadim; Sanchez Espinoza, Victor Hugo [Karlsruhe Institute of Technology (KIT), Eggenstein-Leopoldshafen (Germany). Inst. for Neutron Physics and Reactor Technology

    2012-11-15

    This paper provides the results of an uncertainty and sensitivity study carried out to calculate parameters of safety-related importance, such as the fuel centerline temperature, the cladding temperature and the fuel assembly pressure drop, for a lead-alloy-cooled fast system. Applying best practice guidelines, a list of uncertain parameters has been identified. The considered parameter variations are based on the experience gained during fabrication and operation of former and existing liquid metal cooled fast systems, as well as on experimental results and engineering judgment. (orig.)

  17. Uncertainty quantification and sensitivity analysis of an arterial wall mechanics model for evaluation of vascular drug therapies.

    Heusinkveld, Maarten H G; Quicken, Sjeng; Holtackers, Robert J; Huberts, Wouter; Reesink, Koen D; Delhaas, Tammo; Spronck, Bart

    2018-02-01

    Quantification of the uncertainty in constitutive model predictions describing arterial wall mechanics is vital towards non-invasive assessment of vascular drug therapies. Therefore, we perform uncertainty quantification to determine the uncertainty in mechanical characteristics describing the vessel wall response upon loading. Furthermore, a global variance-based sensitivity analysis is performed to pinpoint measurements that are most rewarding to be measured more precisely. We used previously published carotid diameter-pressure and intima-media thickness (IMT) data (measured in triplicate), and Holzapfel-Gasser-Ogden models. A virtual data set containing 5000 diastolic and systolic diameter-pressure points, and IMT values, was generated by adding measurement error to the average of the measured data. The model was fitted to single-exponential curves calculated from the data, yielding distributions of constitutive parameters and of constituent load-bearing parameters. Additionally, we (1) simulated vascular drug treatment to assess the relevance of model uncertainty and (2) evaluated how increasing the number of measurement repetitions influences model uncertainty. We found substantial uncertainty in the constitutive parameters. Simulating vascular drug treatment predicted a 6 percentage point reduction in the collagen load-bearing fraction, approximately 50% of its uncertainty. Sensitivity analysis indicated that the uncertainty in the collagen load-bearing fraction was primarily caused by noise in the distension and IMT measurements. Its spread could be decreased by 50% when increasing the number of measurement repetitions from 3 to 10. Model uncertainty, notably that in collagen load bearing, could conceal effects of vascular drug therapy. However, this uncertainty could be reduced by increasing the number of repetitions of the distension and wall thickness measurements used for model parameterisation.

  18. Uncertainties in Early Stage Capital Cost Estimation of Process Design – A case study on biorefinery design

    Gurkan eSin

    2015-02-01

    Capital investment, next to product demand, sales and production costs, is one of the key metrics commonly used for project evaluation and feasibility assessment. Estimating the investment costs of a new product/process alternative during early-stage design is a challenging task. This is especially important in biorefinery research, where available information on and experience with new technologies is limited. A systematic methodology for uncertainty analysis of cost data is proposed that employs (a) bootstrapping as a regression method when cost data are available and (b) the Monte Carlo technique as an error propagation method based on expert input when cost data are not available. Four well-known models for early-stage cost estimation are reviewed and analyzed using the methodology. The significance of cost-data uncertainties for early-stage process design is highlighted using the synthesis and design of a biorefinery as a case study. The impact of uncertainties in cost estimation on the identification of optimal processing paths is found to be profound. To tackle this challenge, a comprehensive techno-economic risk analysis framework is presented to enable robust decision making under uncertainty. One result, using an order-of-magnitude estimate, shows that the production of diethyl ether and of 1,3-butadiene are the most promising paths, with economic risks due to uncertainties in the cost estimations of 0.24 MM$/a and 4.6 MM$/a, respectively.
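
    A minimal sketch of ingredient (a), assuming an invented power-law capacity-cost correlation C = a·S^b fitted in log-log space, is given below; the pairwise bootstrap yields a confidence band on the cost predicted for a new plant size.

      import numpy as np

      # Invented data: historical plant sizes (kt/a) and capital costs (MM$).
      size = np.array([10, 20, 40, 80, 160, 320, 500, 700], dtype=float)
      cost = np.array([13, 22, 32, 50, 85, 125, 175, 210], dtype=float)

      rng = np.random.default_rng(1)
      n, boots, s_new = size.size, 5000, 150.0        # predict cost at 150 kt/a
      preds = np.empty(boots)
      for i in range(boots):
          idx = rng.integers(0, n, n)                 # resample (size, cost) pairs
          b, log_a = np.polyfit(np.log(size[idx]), np.log(cost[idx]), 1)
          preds[i] = np.exp(log_a) * s_new ** b       # back-transform the prediction

      lo, mid, hi = np.percentile(preds, [5, 50, 95])
      print(f"cost at {s_new:.0f} kt/a: {mid:.0f} MM$ (90% band {lo:.0f}-{hi:.0f} MM$)")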

  19. Determination of total arsenic in fish by hydride-generation atomic absorption spectrometry: method validation, traceability and uncertainty evaluation

    Nugraha, W. C.; Elishian, C.; Ketrin, R.

    2017-03-01

    Fish containing arsenic compounds are one of the important indicators of arsenic contamination in water monitoring. High arsenic levels in fish result from absorption through the food chain and accumulation in the habitat. Hydride generation (HG) coupled with atomic absorption spectrometric (AAS) detection is one of the most popular techniques employed for arsenic determination in a variety of matrices, including fish. This study aimed to develop a method for the determination of total arsenic in fish by HG-AAS. For sample preparation, Association of Official Analytical Chemists (AOAC) Method 999.10-2005 was adopted for acid digestion using a microwave digestion system, and AOAC Method 986.15-2005 for dry ashing. The method was developed and validated using the Certified Reference Material DORM-3 Fish Protein for trace metals, ensuring the accuracy and traceability of the results. The sources of uncertainty of the method were also evaluated. Using the method, the total arsenic concentration in the fish was found to be 45.6 ± 1.22 mg.Kg-1 with a coverage factor equal to 2 at the 95% confidence level. The uncertainty evaluation was highly influenced by the calibration curve. The result was also traceable to the International System of Units through the analysis of Certified Reference Material DORM-3, with a recovery of 97.5%. In summary, this shows that the preparation methods and the HG-AAS technique for total arsenic determination in fish are valid and reliable.

  20. Mechanical property test of natural rubber bearing for the evaluation of uncertainty value of seismic isolation devices

    Kim, Min Kyu; Kim, Jung Han; Choi, In Kil

    2012-01-01

    The seismic safety of nuclear power plants (NPPs) has been one of the most important issues in the nuclear field since the Great East Japan Earthquake of 2011. Seismic isolation is the most straightforward way of increasing the seismic safety of a nuclear power plant. However, applying seismic isolation devices does not always reduce the seismic risk of an NPP: rubber bearings carry many uncertainties in their material properties and must absorb large displacements when isolation devices are applied. In this study, to evaluate the uncertainty of the material properties of rubber bearings, material tests on the rubber and mechanical property tests on natural rubber bearings were performed. To evaluate the effect of rubber hardness, four rubber hardnesses were considered in the material property tests and two in the mechanical property tests. As a result, the variation in the material properties was found to be higher than the variation in the mechanical properties of the natural rubber bearings

  1. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  2. Use of 2D/3D data for peak cladding temperature uncertainty studies

    Boyack, B.E.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems. The revised rule allows emergency core cooling system analysis based on best-estimate methods, provided uncertainties in the prediction of prescribed acceptance limits are quantified and reported. To support the revised rule, the NRC developed the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. Data from the 2D/3D program have been used in a demonstration of the CSAU methodology in two ways. First, the data were used to identify and quantify biases that are related to the implementation of selected correlations and models in the thermal-hydraulic systems code TRAC-PF1/MOD1 as it is used to calculate the demonstration transient, a large-break loss-of-coolant accident. Second, the data were used in a supportive role to provide insight into the accuracy of code calculations and to confirm conclusions that are drawn regarding specific CSAU studies. Examples are provided illustrating each of these two uses of 2D/3D data. 9 refs., 7 figs

  3. Comparison of different approaches to evaluate the uncertainty of gas chromatography for natural gas

    Elcio Cruz de Oliveira

    2009-01-01

    The evaluation of the uncertainty associated with an analytical result is an essential part of the measurement process. Recently, several approaches to evaluating uncertainty in measurement have been developed. Here, the uncertainty of the gas chromatography assay for natural gas is evaluated and compared using several of these approaches: the Guide to the Expression of Uncertainty in Measurement (GUM) approach, the top-down approach (a reproducibility estimate from an inter-laboratory study), the Barwick & Ellison approach (data from validation), the study of variability, and the fuzzy approach. The comparison shows that the GUM, Barwick & Ellison and fuzzy approaches lead to comparable uncertainty evaluations, which is not the case for the top-down approach and the study of variability, owing to the absence of data normality.

  4. Using expanded real options analysis to evaluate capacity expansion decisions under uncertainty in the construction material industry

    Momani, Amer Mohammad

    2016-08-01

    Capacity expansion generally requires large capital expenditure on illiquid assets. Therefore, decisions to enlarge capacity must support the organisation's strategic objectives and provide valuable input for the budgeting process. This paper applies an expanded form of Real Options Analysis (ROA) to generate and evaluate capacity expansion strategies under uncertainty in the construction material industry. ROA is applied to different expansion strategies associated with different demand scenarios. Evaluating a wider variety of strategies can reduce risk and support decisions that maximise the firm's value. The case study shows that the execution of a lead expansion strategy with 10-year intervals under a 50 per cent demand satisfaction scenario produces superior results.

  5. Evaluation of the uncertainties in the TLD radiosurgery postal dose system

    Campos, L. T.; Leite, S. P.; de Almeida, C. E. V.; Magalhães, L. A. G.

    2018-03-01

    Stereotactic radiosurgery is a single-fraction radiation therapy procedure for treating intracranial lesions using a stereotactic apparatus and multiple narrow beams delivered through noncoplanar isocentric arcs. To guarantee a high quality standard, a comprehensive quality assurance programme is extremely important to ensure that the delivered dose is consistent with the tolerances and to improve treatment quality. The Radiological Science Laboratory operates a postal audit programme in SRT and SRS. The purpose of the programme is to verify the target localization accuracy in a known geometry and the dosimetric performance of the TPS. The programme works in such a way that thermoluminescent dosimeters (TLDs), consisting of LiF chips, are sent to the centre, where they are irradiated to a prescribed dose. The TLDs are then returned and evaluated, and the absorbed dose is obtained from the TLD readings. The aim of the present work is to estimate the uncertainties in this dose determination process, using experimental data.

  6. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

    An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the method with respect to the reproducibility of the recovery rate and the minimum detectable concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detectable concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. The results show that this method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC.

  7. A Generalized Kruskal-Wallis Test Incorporating Group Uncertainty with Application to Genetic Association Studies

    Acar, Elif F.; Sun, Lei

    2012-01-01

    Motivated by genetic association studies of SNPs with genotype uncertainty, we propose a generalization of the Kruskal-Wallis test that incorporates group uncertainty when comparing k samples. The extended test statistic is based on probability-weighted rank-sums and follows an asymptotic chi-square distribution with k-1 degrees of freedom under the null hypothesis. Simulation studies confirm the validity and robustness of the proposed test in finite samples. Application to a genome-wide asso...
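
    A sketch of the idea (not necessarily the authors' exact estimator) is to let each subject contribute its rank to group j with weight p_ij, the probability of membership in group j, and then form the usual Kruskal-Wallis statistic from the probability-weighted rank sums and effective group sizes:

      import numpy as np
      from scipy.stats import rankdata, chi2

      def weighted_kruskal_wallis(y, probs):
          """y: (n,) outcomes; probs: (n, k) group membership probabilities."""
          n, k = probs.shape
          r = rankdata(y)                    # mid-ranks, ties averaged
          nj = probs.sum(axis=0)             # effective group sizes
          rj = probs.T @ r                   # probability-weighted rank sums
          h = 12.0 / (n * (n + 1)) * np.sum(rj ** 2 / nj) - 3.0 * (n + 1)
          return h, chi2.sf(h, k - 1)        # asymptotic chi-square, k-1 df

      rng = np.random.default_rng(7)
      n = 300
      g = rng.integers(0, 3, n)              # true (latent) genotype groups
      y = rng.normal(0.3 * g, 1.0)           # outcome shifts with group
      p = np.full((n, 3), 0.05)              # uncertain genotype calls:
      p[np.arange(n), g] = 0.90              # 90% mass on the true group
      print(weighted_kruskal_wallis(y, p))

    With hard 0/1 membership probabilities this reduces to the classical Kruskal-Wallis statistic, which is a useful sanity check on any implementation.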

  8. Evaluation of Uncertainty of IMRT QA Using 2 Dimensional Array Detector for Head and Neck Patients

    Ban, Tae Joon; Lee, Woo Suk; Kim, Dae Sup; Baek, Geum Mun; Kwak, Jung Won

    2011-01-01

    Clinically, IMRT QA using a 2-dimensional array detector is carried out on a discrete dose distribution, which can affect the uncertainty of the evaluation with the gamma method. We analyze the variation of the gamma index according to grid size and suggest a valid range of grid sizes for IMRT QA in the hospital. We performed QA using the OmniPro I'mRT system software version 1.7b on 10 head-and-neck IMRT patients. The reference dose plane (grid size 0.1 cm; location [0, 0, 0]) from the RTP was compared with dose planes of different grid sizes (0.1 cm, 0.5 cm, 1.0 cm, 2.0 cm, 4.0 cm) and different locations (along the Y-axis: 0 cm, 0.2 cm, 0.5 cm, 1.0 cm). The gamma index variation was evaluated by observing the changes in gamma pass rate, average signal and standard deviation for each case. The average signal for each grid size showed differences of 0%, -0.19%, -0.04%, -0.46% and -8.32%, and the standard deviation showed differences of 0%, -0.30%, 1.24%, -0.70% and -7.99%. The gamma pass rate for each grid size showed differences of 0%, 0.27%, -1.43%, 5.32% and 5.60%. The gamma evaluation results according to distance for grid sizes in the range 0.1 cm to 1.0 cm agreed with the reference condition (grid size 0.1 cm) within 1.5%, whereas differences exceeded 5% when the grid size was 2.0 cm or greater. Because the grid size used in the gamma evaluation can introduce errors into IMRT QA, its uncertainty has to be considered, and grid sizes smaller than 2 cm should be applied clinically to reduce error and increase accuracy.
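
    The gamma method referred to above is commonly defined (following Low et al.) as

      \gamma(\mathbf{r}_m) = \min_{\mathbf{r}_c}
      \sqrt{ \frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^2}{\Delta d^2}
           + \frac{\left[ D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m) \right]^2}{\Delta D^2} }

    with typical tolerances Δd = 3 mm and ΔD = 3%; a point passes if γ ≤ 1. Because the minimum is searched over the calculated dose grid, a coarse grid biases the search, which is consistent with the grid-size sensitivity reported here.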

  9. Treatment of uncertainty through the interval smart/swing weighting method: a case study

    Luiz Flávio Autran Monteiro Gomes

    2011-12-01

    An increasingly competitive market means that many decisions must be taken quickly and with precision in complex, high-risk scenarios. This combination of factors makes it necessary to use decision-aiding methods which provide a means of dealing with uncertainty in the judgement of the alternatives. This work presents the use of the MAUT method combined with the INTERVAL SMART/SWING WEIGHTING method. Although multicriteria decision aiding was not conceived specifically for tackling uncertainty, the combined use of MAUT and the INTERVAL SMART/SWING WEIGHTING method allows decision problems under uncertainty to be approached. The main concepts involved in these two methods are described, and their joint application to a case study concerning the selection of a printing service supplier is presented. The case study makes use of the WINPRE software as a support tool for the calculation of dominance. It is concluded that the proposed approach can be applied to decision-making problems under uncertainty.

  10. Demand and generation cost uncertainty modelling in power system optimization studies

    Gomes, Bruno Andre; Saraiva, Joao Tome [INESC Porto and Departamento de Engenharia Electrotecnica e Computadores, Faculdade de Engenharia da Universidade do Porto, FEUP, Campus da FEUP Rua Roberto Frias 378, 4200 465 Porto (Portugal)

    2009-06-15

    This paper describes the formulations and the solution algorithms developed to include uncertainties in the generation cost function and in the demand on DC OPF studies. The uncertainties are modelled by trapezoidal fuzzy numbers and the solution algorithms are based on multiparametric linear programming techniques. These models are a development of an initial formulation detailed in several publications co-authored by the second author of this paper. Now, we developed a more complete model and a more accurate solution algorithm in the sense that it is now possible to capture the widest possible range of values of the output variables reflecting both demand and generation cost uncertainties. On the other hand, when modelling simultaneously demand and generation cost uncertainties, we are representing in a more realistic way the volatility that is currently inherent to power systems. Finally, the paper includes a case study to illustrate the application of these models based on the IEEE 24 bus test system. (author)

  11. Regional climate change trends and uncertainty analysis using extreme indices: A case study of Hamilton, Canada

    Razavi, Tara; Switzman, Harris; Arain, Altaf; Coulibaly, Paulin

    2016-01-01

    This study aims to provide a deeper understanding of the level of uncertainty associated with the development of extreme weather frequency and intensity indices at the local scale. Several different global climate models, downscaling methods, and emission scenarios were used to develop extreme temperature and precipitation indices at the local scale in the Hamilton region, Ontario, Canada. Uncertainty associated with historical and future trends in extreme indices and future climate projectio...

  12. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more

  13. Methods to estimate the between‐study variance and its uncertainty in meta‐analysis†

    Jackson, Dan; Viechtbauer, Wolfgang; Bender, Ralf; Bowden, Jack; Knapp, Guido; Kuss, Oliver; Higgins, Julian PT; Langan, Dean; Salanti, Georgia

    2015-01-01

    Meta‐analyses are typically used to estimate the overall mean of an outcome of interest. However, inference about between‐study variability, which is typically modelled using a between‐study variance parameter, is usually an additional aim. The DerSimonian and Laird method, currently widely used by default to estimate the between‐study variance, has been long challenged. Our aim is to identify known methods for estimation of the between‐study variance and its corresponding uncertainty, and to summarise the simulation and empirical evidence that compares them. We identified 16 estimators for the between‐study variance, seven methods to calculate confidence intervals, and several comparative studies. Simulation studies suggest that for both dichotomous and continuous data the estimator proposed by Paule and Mandel and for continuous data the restricted maximum likelihood estimator are better alternatives to estimate the between‐study variance. Based on the scenarios and results presented in the published studies, we recommend the Q‐profile method and the alternative approach based on a 'generalised Cochran between‐study variance statistic' to compute corresponding confidence intervals around the resulting estimates. Our recommendations are based on a qualitative evaluation of the existing literature and expert consensus. Evidence‐based recommendations require an extensive simulation study where all methods would be compared under the same scenarios. © 2015 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd. PMID:26332144

  14. Evaluation of purity with its uncertainty value in high purity lead stick by conventional and electro-gravimetric methods.

    Singh, Nahar; Singh, Niranjan; Tripathy, S Swarupa; Soni, Daya; Singh, Khem; Gupta, Prabhat K

    2013-06-26

    A conventional gravimetry and electro-gravimetry study has been carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte and provide a reference value to determine unknown concentrations or to calibrate analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of the Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for the determination of purity by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry, and the percentage purity of the metallic Pb was calculated accordingly. On the basis of the experimental observations, the purity of the Pb was found to be 99.98 ± 0.24 g/100 g by conventional gravimetry and 99.97 ± 0.27 g/100 g by electro-gravimetry, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). The uncertainty evaluation in the Pb determination was carried out following the EURACHEM/GUM guidelines. The final analytical results, with quantified uncertainty, fulfil this requirement and give a measure of the confidence level of the laboratory concerned. Gravimetry is the most reliable technique in comparison to titrimetry and instrumental methods, and the results of gravimetry are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results.

  15. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is associated not only with the limitations of that model and the shortcomings of the data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climatic conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections and of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of the different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability can be very demanding in computer time. As a way forward we present the application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the

  16. Forensic Entomology: Evaluating Uncertainty Associated With Postmortem Interval (PMI) Estimates With Ecological Models.

    Faris, A M; Wang, H-H; Tarone, A M; Grant, W E

    2016-05-31

    Estimates of insect age can be informative in death investigations and, when certain assumptions are met, can be useful for estimating the postmortem interval (PMI). Currently, the accuracy and precision of PMI estimates are unknown, as error can arise from sources of variation such as measurement error, environmental variation, or genetic variation. Ecological models are abstract, mathematical representations of an ecological system that can make predictions about the dynamics of the real system. To quantify the variation associated with the pre-appearance interval (PAI), we developed an ecological model that simulates the colonization of vertebrate remains by Cochliomyia macellaria (Fabricius) (Diptera: Calliphoridae), a primary colonizer in the southern United States. The model is based on a development data set derived from a local population and represents the uncertainty in local temperature variability to address PMI estimates at local sites. After a PMI estimate is calculated for each individual, the model calculates the maximum, minimum, and mean PMI, as well as the range and standard deviation, for the stadia collected. The model framework presented here is one manner by which errors in PMI estimates can be addressed in court when no empirical data are available for the parameter of interest. We show that PAI is a potentially important source of error and that an ecological model is one way to evaluate its impact. Such models can be re-parameterized with any development data set, PAI function, temperature regime, assumption of interest, etc., to estimate PMI and quantify the uncertainty that arises from specific prediction systems.
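
    As a minimal illustration of how temperature uncertainty propagates into a PMI range, the sketch below uses a generic thermal-summation (accumulated-degree-day) development model; all parameters are invented and are not the paper's C. macellaria data set.

      import numpy as np

      # Invented parameters -- not the paper's development data set.
      ADD_req = 120.0     # accumulated degree-days needed to reach the collected stage
      T_base = 10.0       # developmental base temperature, deg C
      hours = np.arange(2400)
      scene_T = 24.0 + 4.0 * np.sin(2.0 * np.pi * hours / 24.0)   # diurnal cycle

      rng = np.random.default_rng(3)
      pmis = []
      for _ in range(5000):
          T = scene_T + rng.normal(0.0, 1.5, scene_T.size)        # local temperature error
          dd = np.maximum(T - T_base, 0.0) / 24.0                 # degree-days per hour
          pmis.append(np.searchsorted(np.cumsum(dd), ADD_req) / 24.0)  # days to reach ADD_req

      pmis = np.array(pmis)
      print(f"PMI: mean {pmis.mean():.2f} d, range {pmis.min():.2f}-{pmis.max():.2f} d, "
            f"sd {pmis.std():.2f} d")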

  17. Quantifying the uncertainty of wave energy conversion device cost for policy appraisal: An Irish case study

    Farrell, Niall; Donoghue, Cathal O’; Morrissey, Karyn

    2015-01-01

    Wave Energy Conversion (WEC) devices are at a pre-commercial stage of development with feasibility studies sensitive to uncertainties surrounding assumed input costs. This may affect decision making. This paper analyses the impact these uncertainties may have on investor, developer and policymaker decisions using an Irish case study. Calibrated to data present in the literature, a probabilistic methodology is shown to be an effective means to carry this out. Value at Risk (VaR) and Conditional Value at Risk (CVaR) metrics are used to quantify the certainty of achieving a given cost or return on investment. We analyse the certainty of financial return provided by the proposed Irish Feed-in Tariff (FiT) policy. The influence of cost reduction through bulk discount is also discussed, with cost reduction targets for developers identified. Uncertainty is found to have a greater impact on the profitability of smaller installations and those subject to lower rates of cost reduction. This paper emphasises that a premium is required to account for cost uncertainty when setting FiT rates. By quantifying uncertainty, a means to specify an efficient premium is presented. - Highlights: • Probabilistic model quantifies uncertainty for wave energy feasibility analyses. • Methodology presented and applied to an Irish case study. • A feed-in tariff premium of 3–4 c/kWh required to account for cost uncertainty. • Sensitivity of uncertainty and cost to rates of technological change analysed. • Use of probabilistic model for investors and developers also demonstrated
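
    A minimal sketch of the VaR/CVaR calculation on a simulated cost distribution is given below; the cost model and all figures are invented and far simpler than the paper's.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 100000
      capex = rng.lognormal(mean=np.log(4e6), sigma=0.25, size=n)   # EUR per device
      opex = rng.lognormal(mean=np.log(2e5), sigma=0.20, size=n)    # EUR per year
      energy = rng.normal(1.8e6, 2.5e5, size=n).clip(min=1e5)       # kWh per year

      # Crude levelised cost over an assumed 20-year life, no discounting.
      lcoe = (capex / 20 + opex) / energy                           # EUR/kWh

      alpha = 0.95
      var = np.quantile(lcoe, alpha)       # cost not exceeded with 95% certainty
      cvar = lcoe[lcoe >= var].mean()      # expected cost in the worst 5% tail
      print(f"VaR95 = {var:.3f} EUR/kWh, CVaR95 = {cvar:.3f} EUR/kWh")

    Comparing such a VaR against the revenue implied by a candidate FiT rate is one way of quantifying the certainty of financial return the abstract refers to.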

  18. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  19. [The metrology of uncertainty: a study of vital statistics from Chile and Brazil].

    Carvajal, Yuri; Kottow, Miguel

    2012-11-01

    This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.

  20. Evaluation of the uncertainty associated with sample holders in NAA measurements in LAN/IPEN

    Zahn, Guilherme S.; Ticianelli, Regina B.; Saiki, Mitiko; Genezini, Frederico A., E-mail: ticianelli@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In IPEN's Neutron Activation Laboratory (LAN/IPEN), thin stainless steel sample holders are used for gamma spectrometry in NAA measurements. This material is very practical, but its chemical composition may be troublesome, as it presents large amounts of elements with intermediate atomic number, with attenuation factors for low-energy gamma-rays that must not be neglected. In this study, count rates obtained using different sample holders were compared. To accomplish that, an Am-241 source, with its 59-keV gamma emission, was used so that differences in low-energy gamma attenuation could be determined. Moreover, in order to study the energy dependence of these differences, a Ho-166m source was also used. From these results, it was possible to analyze the experimental error associated with the variations between sample holders, with the aim of introducing an additive term into the uncertainty analysis of comparative Neutron Activation Analysis results. (author)
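
    One plausible way to turn such between-holder count-rate differences into an additive uncertainty component, sketched here with invented count rates (not the IPEN measurements), is to take the relative spread across holders and fold it into the budget in quadrature:

        import numpy as np

        # Hypothetical 59-keV count rates (counts/s) of the same Am-241 source
        # measured in different sample holders (illustrative values only).
        rates = np.array([101.2, 99.8, 100.6, 98.9, 100.1])

        # Relative standard deviation across holders, taken as an additional
        # uncertainty component for comparative NAA results.
        u_holder = rates.std(ddof=1) / rates.mean()

        # Combined in quadrature with an assumed pre-existing relative budget of 1.5%.
        u_other = 0.015
        u_total = np.hypot(u_other, u_holder)
        print(f"u_holder = {100 * u_holder:.2f} %, u_total = {100 * u_total:.2f} %")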

  1. Reprocessing decision: a study in policymaking under uncertainty

    Heising, C.D.

    1978-01-01

    The U.S. reprocessing decision is examined in this thesis. Decision analysis is applied to develop a rational framework for the assessment of policy alternatives. Benefits and costs for each alternative are evaluated and compared in dollar terms to determine the optimal decision. A fuel cycle simulation model is constructed to assess the economic value of reprocessing light water reactor (LWR) spent fuel and recycling plutonium. In addition, a dynamic fuel substitution model is used to estimate the economic effects of the reprocessing decision's influence on the introduction date of the liquid metal fast breeder reactor (LMFBR). Risks estimated in dollar terms for comparison with the economic values include those related to health, the environment and safety, nuclear theft and sabotage, and nuclear proliferation

  2. Evaluation and correction of uncertainty due to Gaussian approximation in radar - rain gauge merging using kriging with external drift

    Cecinati, F.; Wani, O.; Rico-Ramirez, M. A.

    2016-12-01

    It is widely recognised that merging radar rainfall estimates (RRE) with rain gauge data can improve the RRE and provide areal and temporal coverage that rain gauges cannot offer. Many methods to merge radar and rain gauge data are based on kriging and require an assumption of Gaussianity on the variable of interest. In particular, this work looks at kriging with external drift (KED), because it is an efficient, widely used, and well performing merging method. Rainfall, especially at finer temporal scales, does not have a normal distribution and presents a bi-modal, skewed distribution. In some applications a Gaussianity assumption is made without any correction. In other cases, variables are transformed in order to obtain a distribution closer to Gaussian. This work has two objectives: 1) compare different transformation methods in merging applications; 2) evaluate the uncertainty arising when untransformed rainfall data are used in KED. The comparison of transformation methods is addressed from two points of view. On the one hand, the ability to reproduce the original probability distribution after back-transformation of merged products is evaluated with qq-plots; on the other hand, the rainfall estimates are compared with an independent set of rain gauge measurements. The tested methods are 1) no transformation, 2) Box-Cox transformation with parameter λ=0.5 (square root), 3) λ=0.25 (square root - square root), 4) λ=0.1 (almost logarithmic), 5) normal quantile transformation, and 6) singularity analysis. The uncertainty associated with the use of non-transformed data in KED is evaluated in comparison with the best performing product. The methods are tested on a case study in Northern England, using hourly data from 211 tipping bucket rain gauges from the Environment Agency and radar rainfall data at 1 km/5-min resolutions from the UK Met Office. In addition, 25 independent rain gauges from the UK Met Office were used to assess the merged products.
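
    The Box-Cox family tested in this record is compact enough to state directly. Below is a minimal sketch of the transform and its back-transform, using the three λ values from the abstract and invented rainfall totals:

        import numpy as np

        def boxcox(x, lam):
            """Box-Cox transform; lam = 0 reduces to the logarithm."""
            x = np.asarray(x, dtype=float)
            return np.log(x) if lam == 0 else (x**lam - 1.0) / lam

        def inv_boxcox(y, lam):
            """Back-transform from Box-Cox space to rainfall space."""
            return np.exp(y) if lam == 0 else (lam * np.asarray(y) + 1.0) ** (1.0 / lam)

        rain = np.array([0.2, 1.4, 3.8, 0.6, 7.1])   # hypothetical hourly totals (mm)
        for lam in (0.5, 0.25, 0.1):                 # the exponents tested in the study
            z = boxcox(rain, lam)
            assert np.allclose(inv_boxcox(z, lam), rain)
            print(lam, np.round(z, 3))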

  3. OECD/CSNI Workshop on Best Estimate Methods and Uncertainty Evaluations - Workshop Proceedings

    2013-01-01

    Best-Estimate Methods plus Uncertainty Evaluation are gaining increased interest in the licensing process. On the other hand, lessons learnt from the BEMUSE (NEA/CSNI/R(2011)3) and SM2A (NEA/CSNI/R(2011)3) benchmarks, progress of the UAM benchmark, and answers to the WGAMA questionnaire on the Use of Best-Estimate Methodologies show that improvements of the present methods are necessary and new applications appear. The objective of this workshop was to provide a forum for a wide range of experts to exchange information in the area of best estimate analysis and uncertainty evaluation methods, and to address issues drawn from the BEMUSE, UAM and SM2A activities. Both improvement of existing methods and recent new developments were included. As a result of the workshop, a set of recommendations, including lines for future activities, was proposed. The organisation of the workshop was divided into three parts: an opening session including keynotes from OECD and IAEA representatives, technical sessions, and a wrap-up session. All sessions included a debate with participation from an audience of 71 attendees. The workshop consisted of four technical sessions: a) Development achievements of BEPU methods and state of the art: the objective of this session was to present the different approaches to deal with best-estimate codes and uncertainty evaluations. A total of six papers were presented. One initial paper summarized the existing methods; the following papers focused on specific methods, stressing their bases, peculiarities and advantages. As a result of the session a picture of the current state of the art was obtained. b) International comparative activities: this session reviewed the set of international activities around the subject of BEPU methods benchmarking and development. For each of the activities a description of the objectives, development, main results, conclusions and recommendations (where finalized) was presented. This

  4. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    Dittes, Beatrice; Kaiser, Maria; Špačková, Olga; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2018-05-01

    Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.

  5. Risk-based flood protection planning under climate change and modeling uncertainty: a pre-alpine case study

    B. Dittes

    2018-05-01

    Full Text Available Planning authorities are faced with a range of questions when planning flood protection measures: is the existing protection adequate for current and future demands or should it be extended? How will flood patterns change in the future? How should the uncertainty pertaining to this influence the planning decision, e.g., for delaying planning or including a safety margin? Is it sufficient to follow a protection criterion (e.g., to protect from the 100-year flood) or should the planning be conducted in a risk-based way? How important is it for flood protection planning to accurately estimate flood frequency (changes), costs and damage? These are questions that we address for a medium-sized pre-alpine catchment in southern Germany, using a sequential Bayesian decision making framework that quantitatively addresses the full spectrum of uncertainty. We evaluate different flood protection systems considered by local agencies in a test study catchment. Despite large uncertainties in damage, cost and climate, the recommendation is robust for the most conservative approach. This demonstrates the feasibility of making robust decisions under large uncertainty. Furthermore, by comparison to a previous study, it highlights the benefits of risk-based planning over the planning of flood protection to a prescribed return period.
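
    The decision logic in these two records can be caricatured as an expected-cost comparison under an uncertain climate factor. The sketch below is far simpler than the sequential Bayesian framework of the study, and every number in it (alternatives, costs, damage, prior) is invented:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical alternatives: (design return period in years, investment in M EUR).
        alternatives = {"status quo": (50, 0.0),
                        "raise dikes": (100, 8.0),
                        "retention basin": (200, 15.0)}

        # Uncertain multiplier on the exceedance rate due to climate change,
        # sampled from an assumed lognormal prior.
        climate = rng.lognormal(mean=0.0, sigma=0.3, size=20_000)

        damage, horizon = 40.0, 50   # assumed damage per exceedance (M EUR) and years
        for name, (T, invest) in alternatives.items():
            expected_events = (1.0 / T) * climate * horizon
            total = invest + damage * expected_events.mean()
            print(f"{name:16s} expected total cost = {total:5.1f} M EUR")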

  6. [Evaluation of uncertainty for determination of tin and its compounds in air of workplace by flame atomic absorption spectrometry].

    Wei, Qiuning; Wei, Yuan; Liu, Fangfang; Ding, Yalei

    2015-10-01

    To investigate a method for evaluating the uncertainty in the determination of tin and its compounds in the air of the workplace by flame atomic absorption spectrometry. The national occupational health standards GBZ/T160.28-2004 and JJF1059-1999 were used to build a mathematical model for the determination of tin and its compounds in the air of the workplace and to calculate the components of uncertainty. In the determination of tin and its compounds in the air of the workplace using flame atomic absorption spectrometry, the uncertainty for the concentration of the standard solution, the atomic absorption spectrophotometer, sample digestion, parallel determination, least-squares fitting of the calibration curve, and sample collection was 0.436%, 0.13%, 1.07%, 1.65%, 3.05%, and 2.89%, respectively. The combined uncertainty was 9.3%. The concentration of tin in the test sample was 0.132 mg/m³, and the expanded uncertainty for the measurement was 0.012 mg/m³ (k=2). The dominant uncertainty for the determination of tin and its compounds in the air of the workplace comes from least-squares fitting of the calibration curve and sample collection. Quality control should be improved in the process of calibration curve fitting and sample collection.
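
    The combination step behind such budgets follows the GUM: independent relative components are added in quadrature and multiplied by a coverage factor. A generic sketch with hypothetical component values (the study's full budget contains detail not reproduced here):

        import numpy as np

        # Hypothetical relative uncertainty components (%), assumed independent.
        components = {
            "standard solution": 0.5,
            "instrument": 0.2,
            "sample digestion": 1.0,
            "parallel determination": 1.6,
            "calibration-curve fit": 3.0,
            "sample collection": 2.9,
        }

        # GUM-style combination in quadrature, then expansion with k = 2.
        u_c = np.sqrt(sum(v**2 for v in components.values()))
        U = 2.0 * u_c
        print(f"combined = {u_c:.2f} %, expanded (k=2) = {U:.2f} %")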

  7. Monte-Carlo-based uncertainty propagation with hierarchical models—a case study in dynamic torque

    Klaus, Leonard; Eichstädt, Sascha

    2018-04-01

    For a dynamic calibration, a torque transducer is described by a mechanical model, and the corresponding model parameters are to be identified from measurement data. A measuring device for the primary calibration of dynamic torque, and a corresponding model-based calibration approach, have recently been developed at PTB. The complete mechanical model of the calibration set-up is very complex, and involves several calibration steps—making a straightforward implementation of a Monte Carlo uncertainty evaluation tedious. With this in mind, we here propose to separate the complete model into sub-models, with each sub-model being treated with individual experiments and analysis. The uncertainty evaluation for the overall model then has to combine the information from the sub-models in line with Supplement 2 of the Guide to the Expression of Uncertainty in Measurement. In this contribution, we demonstrate how to carry this out using the Monte Carlo method. The uncertainty evaluation involves various input quantities of different origin and the solution of a numerical optimisation problem.
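
    The split-and-recombine idea can be illustrated with a toy two-stage Monte Carlo in the spirit of GUM Supplement 2; the sub-models, the resonance-frequency formula and all parameter values below are assumptions for illustration, not PTB's calibration model:

        import numpy as np

        rng = np.random.default_rng(7)
        M = 100_000

        # Sub-model A: torsional stiffness from its own experiment, summarised
        # here as a normal distribution (hypothetical values).
        k = rng.normal(1.5e3, 20.0, M)        # N*m/rad

        # Sub-model B: mass moment of inertia from a separate analysis.
        J = rng.normal(2.0e-3, 5.0e-5, M)     # kg*m^2

        # Combined model: undamped resonance frequency f = sqrt(k/J) / (2*pi).
        f = np.sqrt(k / J) / (2.0 * np.pi)

        lo, hi = np.percentile(f, [2.5, 97.5])
        print(f"f = {f.mean():.1f} Hz, u(f) = {f.std(ddof=1):.1f} Hz, "
              f"95% interval [{lo:.1f}, {hi:.1f}] Hz")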

  8. Evaluation of uncertainties in MUF for a LWR fuel fabrication plant. Pt.2 - Pt.4

    Mennerdahl, D.

    1984-09-01

    MUF (Material Unaccounted For) is a parameter defined as the estimated loss of materials during a certain period of time. A suitable method for uncertainty and bias estimation has been developed. The method was specifically adjusted for a facility like the ASEA-ATOM fuel fabrication plant. Operations that are expected to contribute to the uncertainties have been compiled. The information required for the application of the developed method is described. Proposals for simplifying the required information without loss of accuracy are suggested. ASEA-ATOM had earlier determined uncertainty data for the scales that are used for nuclear materials. The statistical uncertainties included random errors as well as short-term and long-term systematic errors. Information for the determination of biases (constants and formulas) was also determined. The method proposed by ASEA-ATOM for the determination of uncertainties due to the scales is compatible with the method proposed in this report. For operations other than weighing, the information from ASEA-ATOM is limited. Such operations completely dominate the total uncertainty in MUF. Examples of calculations of uncertainties and bias are given for uranium oxide powders in large containers. The examples emphasize the differences between various statistical errors (random and systematic errors) and biases (known errors). The importance of correlations between different items in the inventories is explained. A specific correlation of great importance is the use of nominal factors (uranium concentration). A portable personal computer can be used to determine uncertainties in MUF. (author)
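
    For orientation, the basic bookkeeping behind MUF and its uncertainty (ignoring the correlations the report warns about) looks as follows; all quantities are invented:

        import numpy as np

        # MUF = (beginning inventory + receipts) - (shipments + ending inventory).
        BI, R, S, EI = 1250.0, 4800.0, 4790.0, 1252.0      # kg U, hypothetical
        muf = (BI + R) - (S + EI)

        # Standard uncertainties of the four terms (kg U); quadrature is valid
        # only if the terms are uncorrelated.
        u = np.array([2.0, 5.0, 5.0, 2.0])
        u_muf = np.sqrt(np.sum(u**2))
        print(f"MUF = {muf:.1f} kg U, u(MUF) = {u_muf:.1f} kg U")
        # Correlations between items (e.g. shared nominal uranium-concentration
        # factors) add covariance terms and can dominate the budget.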

  9. Estimating the uncertainty of the impact of climate change on alluvial aquifers. Case study in central Italy

    Romano, Emanuele; Camici, Stefania; Brocca, Luca; Moramarco, Tommaso; Pica, Federico; Preziosi, Elisabetta

    2014-05-01

    There is evidence that the precipitation pattern in Europe is trending towards more humid conditions in the northern region and drier conditions in the southern and central-eastern regions. However, a great deal of uncertainty concerns how the changes in precipitation will impact water resources, particularly groundwater, and this uncertainty should be evaluated on the basis of that coming from 1) future climate scenarios of Global Circulation Models (GCMs) and 2) modeling chains including the downscaling technique, the infiltration model and the calibration/validation procedure used to develop the groundwater flow model. With the aim of quantifying the uncertainty of these components, the Valle Umbra porous aquifer (Central Italy) has been considered as a case study. This aquifer, which is exploited for human consumption and irrigation, is mainly fed by effective infiltration from the ground surface and partly by the inflow from the carbonate aquifers bordering the valley. A numerical groundwater flow model has been developed with the finite difference MODFLOW2005 code and it has been calibrated and validated considering the recharge regime computed through a Thornthwaite-Mather infiltration model under the climate conditions observed in the period 1956-2012. Future scenarios (2010-2070) of temperature and precipitation have been obtained from three different GCMs: ECHAM-5 (Max Planck Institute, Germany), PCM and CCSM3 (both National Center for Atmospheric Research, USA). Each scenario has been downscaled (DSC) to the data of temperature and precipitation collected in the baseline period 1960-1990 at the stations located in the study area through two different statistical techniques (linear rescaling and quantile mapping). Then, stochastic rainfall and temperature time series are generated through the Neyman-Scott Rectangular Pulses model (NSRP) for precipitation and the Fractionally Differenced ARIMA model (FARIMA

  10. A generalized Kruskal-Wallis test incorporating group uncertainty with application to genetic association studies.

    Acar, Elif F; Sun, Lei

    2013-06-01

    Motivated by genetic association studies of SNPs with genotype uncertainty, we propose a generalization of the Kruskal-Wallis test that incorporates group uncertainty when comparing k samples. The extended test statistic is based on probability-weighted rank-sums and follows an asymptotic chi-square distribution with k - 1 degrees of freedom under the null hypothesis. Simulation studies confirm the validity and robustness of the proposed test in finite samples. Application to a genome-wide association study of type 1 diabetic complications further demonstrates the utility of this generalized Kruskal-Wallis test for studies with group uncertainty. The method has been implemented as an open-resource R program, GKW. © 2013, The International Biometric Society.
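
    A minimal sketch of the probability-weighted idea (not necessarily the authors' exact statistic, which may include tie and variance corrections) replaces hard group labels by membership probabilities in the usual Kruskal-Wallis formula:

        import numpy as np
        from scipy import stats

        def weighted_kruskal_wallis(y, P):
            """Kruskal-Wallis-type statistic with soft group membership.

            y : (n,) responses; P : (n, k) membership probabilities, rows sum to 1.
            Returns the statistic and an asymptotic chi-square p-value (k-1 df).
            """
            y, P = np.asarray(y, float), np.asarray(P, float)
            n, k = P.shape
            r = stats.rankdata(y)              # mid-ranks of the pooled sample
            n_eff = P.sum(axis=0)              # expected group sizes
            W = P.T @ r                        # probability-weighted rank sums
            H = 12.0 / (n * (n + 1)) * np.sum(W**2 / n_eff) - 3.0 * (n + 1)
            return H, stats.chi2.sf(H, k - 1)

        rng = np.random.default_rng(0)
        y = rng.normal(size=30)
        P = rng.dirichlet(np.ones(3), size=30)  # hypothetical genotype probabilities
        print(weighted_kruskal_wallis(y, P))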

  11. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    Eichstädt, S; Wilkens, V

    2016-01-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work. (paper)
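
    GUM2DFT uses closed formulas, but what it computes can be mimicked by brute force. Below is a Monte Carlo sketch of uncertainty propagation through the DFT, with an invented signal and noise level:

        import numpy as np

        rng = np.random.default_rng(3)
        N, M = 64, 20_000

        # Hypothetical time-domain signal with i.i.d. Gaussian measurement noise.
        t = np.arange(N) / N
        x = np.sin(2.0 * np.pi * 5.0 * t)
        u_x = 0.05                    # standard uncertainty per time-domain sample

        # Propagate M noisy realisations through the (real-input) DFT.
        X = np.fft.rfft(x + u_x * rng.standard_normal((M, N)), axis=1)

        # Empirical uncertainty of the magnitude at the signal bin (k = 5).
        print("u(|X_5|) ~", np.abs(X[:, 5]).std(ddof=1))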

  12. Evaluation of uncertainties in X radiation metrologic chain in the Secondary Standard Dosimetry Laboratory/IRD-Brazilian CNEN

    Fonseca Coelho, B.C. da.

    1987-01-01

    The equipment used to measure ionizing radiation in medicine needs appropriate technical qualifications to comply with its purposes and regular calibrations to assure the correct evaluation of the associated quantities. By legal requirement, the annual calibration of users' dosemeters is to be done in a Secondary Standard Dosimetry Laboratory (SSDL), and the SSDL's standard dosemeters are referred to a Primary Standard Dosimetry Laboratory (PSDL), establishing a rigorous metrological network. The SSDL needs to maintain, on a regular basis, a quality control program for the short- and long-term stability of its standard dosemeters. The purpose of this work was to determine the uncertainties associated with the technical procedures of X-ray calibration at the SSDL/IRD. To evaluate the influence of the nine main parameters that can give origin to uncertainties, specific procedures and methods were established according to international requirements and recommendations. The methods are based on the comparison of the behaviour of the users' dosemeters with a standard dosemeter in the many measuring conditions set up for the secondary standard used as a reference. The total uncertainty obtained was 1.81%, using a conservative procedure to protect the users and patients. When the calibration factor and its uncertainty needed to be transferred, the procedure used was to determine the uncertainty under the worst possible operating conditions of the equipment, to obtain an overestimated value. This represents an excellent result for an SSDL of the IAEA Network. (author) [pt

  13. Uncertainties in Early-Stage Capital Cost Estimation of Process Design – A Case Study on Biorefinery Design

    Cheali, Peam; Gernaey, Krist V.; Sin, Gürkan

    2015-01-01

    Capital investment, next to the product demand, sales, and production costs, is one of the key metrics commonly used for project evaluation and feasibility assessment. Estimating the investment costs of a new product/process alternative during early-stage design is a challenging task, which is especially relevant in biorefinery research where information about new technologies and experience with new technologies is limited. A systematic methodology for uncertainty analysis of cost data is proposed that employs: (a) bootstrapping as a regression method when cost data are available; and, (b) the Monte Carlo technique as an error propagation method based on expert input when cost data are not available. Four well-known models for early-stage cost estimation are reviewed and analyzed using the methodology. The significance of uncertainties of cost data for early-stage process design is highlighted using the synthesis and design of a biorefinery as a case study. The impact of uncertainties in cost estimation on the identification of optimal processing paths is indeed found to be profound. To tackle this challenge, a comprehensive techno-economic risk analysis framework is presented to enable robust decision-making under uncertainties. One of the results using order-of-magnitude estimates shows that the production of diethyl ether and 1,3-butadiene are the most promising with the lowest economic risks (among the alternatives considered) of 0.24 MM$/a and 4.6 MM$/a, respectively.

  14. Uncertainties in Early-Stage Capital Cost Estimation of Process Design – A Case Study on Biorefinery Design

    Cheali, Peam; Gernaey, Krist V.; Sin, Gürkan, E-mail: gsi@kt.dtu.dk [Department of Chemical and Biochemical Engineering, Technical University of Denmark, Lyngby (Denmark)

    2015-02-06

    Capital investment, next to the product demand, sales, and production costs, is one of the key metrics commonly used for project evaluation and feasibility assessment. Estimating the investment costs of a new product/process alternative during early-stage design is a challenging task, which is especially relevant in biorefinery research where information about new technologies and experience with new technologies is limited. A systematic methodology for uncertainty analysis of cost data is proposed that employs: (a) bootstrapping as a regression method when cost data are available; and, (b) the Monte Carlo technique as an error propagation method based on expert input when cost data are not available. Four well-known models for early-stage cost estimation are reviewed and analyzed using the methodology. The significance of uncertainties of cost data for early-stage process design is highlighted using the synthesis and design of a biorefinery as a case study. The impact of uncertainties in cost estimation on the identification of optimal processing paths is indeed found to be profound. To tackle this challenge, a comprehensive techno-economic risk analysis framework is presented to enable robust decision-making under uncertainties. One of the results using order-of-magnitude estimates shows that the production of diethyl ether and 1,3-butadiene are the most promising with the lowest economic risks (among the alternatives considered) of 0.24 MM$/a and 4.6 MM$/a, respectively.
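
    Part (a) of the methodology, bootstrapping a cost regression, can be sketched as follows; the power-law cost model and the capacity/cost pairs are invented, not the biorefinery data:

        import numpy as np

        rng = np.random.default_rng(11)

        # Hypothetical (capacity, installed cost) pairs for one process unit.
        S = np.array([10.0, 25.0, 40.0, 60.0, 100.0])    # t/d
        C = np.array([2.1, 3.9, 5.2, 6.8, 9.7])          # MM$

        def fit(S, C):
            """Fit the power-law cost model C = a * S**b in log space."""
            b, log_a = np.polyfit(np.log(S), np.log(C), 1)
            return np.exp(log_a), b

        B, S_new = 2000, 80.0
        preds = np.empty(B)
        for i in range(B):
            idx = rng.integers(0, S.size, S.size)        # resample data pairs
            while np.unique(S[idx]).size < 2:            # avoid degenerate fits
                idx = rng.integers(0, S.size, S.size)
            a, b = fit(S[idx], C[idx])
            preds[i] = a * S_new**b

        lo, hi = np.percentile(preds, [5, 95])
        print(f"C(80 t/d): 90% bootstrap interval [{lo:.1f}, {hi:.1f}] MM$")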

  15. Application of a new methodology to evaluate Dnb limits based on statistical propagation of uncertainties

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The current methodology in use at Eletronuclear to determine the DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate the DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best estimate values. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)

  16. Hurdling barriers through market uncertainty: Case studies in innovative technology adoption

    Payne, Christopher T.; Radspieler Jr., Anthony; Payne, Jack

    2002-08-18

    The crisis atmosphere surrounding electricity availability in California during the summer of 2001 produced two distinct phenomena in commercial energy consumption decision-making: desires to guarantee energy availability while blackouts were still widely anticipated, and desires to avoid or mitigate significant price increases when higher commercial electricity tariffs took effect. The climate of increased consideration of these factors seems to have led, in some cases, to greater willingness on the part of business decision-makers to consider highly innovative technologies. This paper examines three case studies of innovative technology adoption: retrofit of time-and-temperature signs on an office building; installation of fuel cells to supply power, heating, and cooling to the same building; and installation of a gas-fired heat pump at a microbrewery. We examine the decision process that led to adoption of these technologies. In each case, specific constraints had made more conventional energy-efficient technologies inapplicable. We examine how these barriers to technology adoption developed over time, how the California energy decision-making climate combined with the characteristics of these innovative technologies to overcome the barriers, and what the implications of hurdling these barriers are for future energy decisions within the firms.

  17. Estimation of uncertainties in predictions of environmental transfer models: evaluation of methods and application to CHERPAC

    Koch, J.; Peterson, S-R.

    1995-10-01

    Models used to simulate the environmental transfer of radionuclides typically include many parameters, the values of which are uncertain. An estimation of the uncertainty associated with the predictions is therefore essential. Different methods to quantify the uncertainty in the predictions arising from parameter uncertainties are reviewed. A statistical approach using random sampling techniques is recommended for complex models with many uncertain parameters. In this approach, the probability density function of the model output is obtained from multiple realizations of the model according to a multivariate random sample of the different input parameters. Sampling efficiency can be improved by using a stratified scheme (Latin Hypercube Sampling). Sample size can also be restricted when statistical tolerance limits need to be estimated. Methods to rank parameters according to their contribution to uncertainty in the model prediction are also reviewed. Recommended are measures of sensitivity, correlation and regression coefficients that can be calculated on values of input and output variables generated during the propagation of uncertainties through the model. A parameter uncertainty analysis is performed for the CHERPAC food chain model which estimates subjective confidence limits and intervals on the predictions at a 95% confidence level. A sensitivity analysis is also carried out using partial rank correlation coefficients. This identifies and ranks the parameters which are the main contributors to uncertainty in the predictions, thereby guiding further research efforts. (author). 44 refs., 2 tabs., 4 figs

  18. Estimation of uncertainties in predictions of environmental transfer models: evaluation of methods and application to CHERPAC

    Koch, J. [Israel Atomic Energy Commission, Yavne (Israel). Soreq Nuclear Research Center; Peterson, S-R.

    1995-10-01

    Models used to simulate the environmental transfer of radionuclides typically include many parameters, the values of which are uncertain. An estimation of the uncertainty associated with the predictions is therefore essential. Different methods to quantify the uncertainty in the predictions arising from parameter uncertainties are reviewed. A statistical approach using random sampling techniques is recommended for complex models with many uncertain parameters. In this approach, the probability density function of the model output is obtained from multiple realizations of the model according to a multivariate random sample of the different input parameters. Sampling efficiency can be improved by using a stratified scheme (Latin Hypercube Sampling). Sample size can also be restricted when statistical tolerance limits need to be estimated. Methods to rank parameters according to their contribution to uncertainty in the model prediction are also reviewed. Recommended are measures of sensitivity, correlation and regression coefficients that can be calculated on values of input and output variables generated during the propagation of uncertainties through the model. A parameter uncertainty analysis is performed for the CHERPAC food chain model which estimates subjective confidence limits and intervals on the predictions at a 95% confidence level. A sensitivity analysis is also carried out using partial rank correlation coefficients. This identifies and ranks the parameters which are the main contributors to uncertainty in the predictions, thereby guiding further research efforts. (author). 44 refs., 2 tabs., 4 figs.
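
    The recommended Latin Hypercube approach is easy to sketch with a toy transfer model; the two parameters, their distributions and the steady-state response below are assumptions for illustration, not the CHERPAC parameterisation:

        import numpy as np
        from scipy.stats import norm, qmc, spearmanr

        # Latin hypercube sample of two uncertain parameters.
        u = qmc.LatinHypercube(d=2, seed=5).random(n=1000)
        k_transfer = np.exp(norm.ppf(u[:, 0], loc=np.log(0.02), scale=0.4))  # lognormal
        loss_rate = np.clip(norm.ppf(u[:, 1], loc=0.1, scale=0.02), 1e-6, None)

        # Toy food-chain response: steady-state concentration ratio.
        response = k_transfer / loss_rate

        lo, hi = np.percentile(response, [2.5, 97.5])
        print(f"95% band on the prediction: [{lo:.2f}, {hi:.2f}]")

        # Rank correlations as a simple stand-in for the partial rank
        # correlation coefficients used in the study.
        print("rho(k_transfer) =", round(spearmanr(k_transfer, response).correlation, 2))
        print("rho(loss_rate)  =", round(spearmanr(loss_rate, response).correlation, 2))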

  19. Radiofrequency Electromagnetic Radiation and Memory Performance: Sources of Uncertainty in Epidemiological Cohort Studies.

    Brzozek, Christopher; Benke, Kurt K; Zeleke, Berihun M; Abramson, Michael J; Benke, Geza

    2018-03-26

    Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship.

  20. Radiofrequency Electromagnetic Radiation and Memory Performance: Sources of Uncertainty in Epidemiological Cohort Studies

    Christopher Brzozek

    2018-03-01

    Full Text Available Uncertainty in experimental studies of exposure to radiation from mobile phones has in the past only been framed within the context of statistical variability. It is now becoming more apparent to researchers that epistemic or reducible uncertainties can also affect the total error in results. These uncertainties are derived from a wide range of sources including human error, such as data transcription, model structure, measurement and linguistic errors in communication. The issue of epistemic uncertainty is reviewed and interpreted in the context of the MoRPhEUS, ExPOSURE and HERMES cohort studies which investigate the effect of radiofrequency electromagnetic radiation from mobile phones on memory performance. Research into this field has found inconsistent results due to limitations from a range of epistemic sources. Potential analytic approaches are suggested based on quantification of epistemic error using Monte Carlo simulation. It is recommended that future studies investigating the relationship between radiofrequency electromagnetic radiation and memory performance pay more attention to treatment of epistemic uncertainties as well as further research into improving exposure assessment. Use of directed acyclic graphs is also encouraged to display the assumed covariate relationship.

  1. Multi-attribute evaluation and choice of alternatives for surplus weapons-usable plutonium disposition at uncertainty

    Kosterev, V.V.; Bolyatko, V.V.; Khajretdinov, S.I.; Averkin, A.N.

    2014-01-01

    The problem of surplus weapons-usable plutonium disposition is formalized as a multi-attribute problem of choosing among a set of possible alternatives under fuzzy conditions. Evaluation and ordering of the alternatives for surplus weapons-usable plutonium disposition and a sensitivity analysis are carried out under uncertainty [ru

  2. Uncertainty evaluation for three-dimensional scanning electron microscope reconstructions based on the stereo-pair technique

    Carli, Lorenzo; Genta, G; Cantatore, Angela

    2011-01-01

    3D-SEM is a method, based on the stereophotogrammetry technique, which obtains three-dimensional topographic reconstructions typically starting from two SEM images, called the stereo-pair. In this work, a theoretical uncertainty evaluation of the stereo-pair technique, according to GUM (Guide to ...

  3. Constraining the uncertainty in emissions over India with a regional air quality model evaluation

    Karambelas, Alexandra; Holloway, Tracey; Kiesewetter, Gregor; Heyes, Chris

    2018-02-01

    To evaluate uncertainty in the spatial distribution of air emissions over India, we compare satellite and surface observations with simulations from the U.S. Environmental Protection Agency (EPA) Community Multi-Scale Air Quality (CMAQ) model. Seasonally representative simulations were completed for January, April, July, and October 2010 at 36 km × 36 km using anthropogenic emissions from the Greenhouse Gas-Air Pollution Interaction and Synergies (GAINS) model following version 5a of the Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants project (ECLIPSE v5a). We use both tropospheric columns from the Ozone Monitoring Instrument (OMI) and surface observations from the Central Pollution Control Board (CPCB) to closely examine modeled nitrogen dioxide (NO2) biases in urban and rural regions across India. Spatially averaged evaluation against the satellite retrievals indicates a low bias in the modeled tropospheric column (-63.3%), which reflects broad low biases in non-urban regions (-70.1% in rural areas) across the sub-continent and somewhat smaller low biases in semi-urban areas (-44.7%), with the threshold between semi-urban and rural defined as 400 people per km2. In contrast, modeled surface NO2 concentrations exhibit a slight high bias of +15.6% when compared to surface CPCB observations predominantly located in urban areas. Conversely, in examining extremely population-dense urban regions with more than 5000 people per km2 (dense-urban), we find model overestimates in both the column (+57.8%) and at the surface (+131.2%) compared to observations. Based on these results, we find that existing emission fields for India may overestimate urban emissions in densely populated regions and underestimate rural emissions. However, if we rely on model evaluation with predominantly urban surface observations from the CPCB, comparisons reflect model high biases, contradictory to the knowledge gained using satellite observations. Satellites thus

  4. Seismic velocity uncertainties and their effect on geothermal predictions: A case study

    Rabbel, Wolfgang; Köhn, Daniel; Bahadur Motra, Hem; Niederau, Jan; Thorwart, Martin; Wuttke, Frank; Descramble Working Group

    2017-04-01

    Geothermal exploration relies in large part on geophysical subsurface models derived from seismic reflection profiling. These models are the framework of hydro-geothermal modeling, which further requires estimating thermal and hydraulic parameters to be attributed to the seismic strata. All petrophysical and structural properties involved in this process can be determined only with limited accuracy and thus impose uncertainties onto the resulting model predictions of temperature-depth profiles and hydraulic flow, too. In the present study we analyze sources and effects of uncertainties in the seismic velocity field, which translate directly into depth uncertainties of the hydraulically and thermally relevant horizons. Geological sources of these uncertainties are subsurface heterogeneity and seismic anisotropy; methodological sources are limitations in spread length and physical resolution. We demonstrate these effects using data of the EU Horizon 2020 project DESCRAMBLE, which investigates a shallow super-critical geothermal reservoir in the Larderello area. The study is based on 2D and 3D seismic reflection data and laboratory measurements on representative rock samples under simulated in-situ conditions. The rock samples consistently show P-wave anisotropy of the order of 10-20%. However, the uncertainty in layer depths induced by anisotropy is likely to be lower, depending on the accuracy with which the spatial orientation of bedding planes can be determined from the seismic reflection images.

  5. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation on typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuations and regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates and mechanisms of the seismic sources and local site conditions and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From an analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and integrating thousands of loss distribution curves with different degrees of correlation. In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground

  6. Correlation and uncertainties evaluation in backscattering of entrance surface air kerma measurements

    Teixeira, G.J.; Sousa, C.H.S.; Peixoto, J.G.P., E-mail: gt@ird.gov.br [Instituto de Radioproteção e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    The air kerma measurement is important to verify the doses applied in diagnostic radiology. The literature describes several methods to measure the entrance surface air kerma or entrance surface dose, but some of these methods may inflate the measurement through backscattering. Several measurement setups were arranged to establish correlations between them. The expanded uncertainty exceeded 5% for measurements with backscattering, reaching 8.36%, while in situations where backscattering was avoided, the uncertainty was 3.43%. (author)

  7. A new fast algorithm for the evaluation of regions of interest and statistical uncertainty in computed tomography

    Huesman, R.H.

    1984-01-01

    A new algorithm for region of interest evaluation in computed tomography is described. Region of interest evaluation is a technique used to improve quantitation of the tomographic imaging process by summing (or averaging) the reconstructed quantity throughout a volume of particular significance. An important application of this procedure arises in the analysis of dynamic emission computed tomographic data, in which the uptake and clearance of radiotracers are used to determine the blood flow and/or physiological function of tissue within the significant volume. The new algorithm replaces the conventional technique of repeated image reconstructions with one in which projected regions are convolved and then used to form multiple vector inner products with the raw tomographic data sets. Quantitation of regions of interest is made without the need for reconstruction of tomographic images. The computational advantage of the new algorithm over conventional methods is between factors of 20 and 500 for typical applications encountered in medical science studies. The greatest benefit is the ease with which the statistical uncertainty of the result is computed. The entire covariance matrix for the evaluation of regions of interest can be calculated with relatively few operations. (author)
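
    The inner-product trick can be shown in miniature with an explicit linear system matrix (the abstract's convolution formulation is the corresponding operation for filtered backprojection); everything below is a toy example:

        import numpy as np

        rng = np.random.default_rng(2)
        n_pix, n_proj = 16, 32
        A = rng.random((n_proj, n_pix))        # hypothetical projection matrix
        A_pinv = np.linalg.pinv(A)             # linear reconstruction operator

        # ROI sum m^T x with x = A_pinv @ p equals (A_pinv.T @ m) @ p: a single
        # inner product with the raw data p, no image reconstruction needed.
        m = np.zeros(n_pix); m[5:9] = 1.0      # ROI indicator over four pixels
        w = A_pinv.T @ m                       # precomputed projected ROI weights

        p = A @ rng.random(n_pix)              # one raw tomographic data set
        roi_value = w @ p

        # Statistical uncertainty from an assumed diagonal (Poisson-like) data
        # covariance: var = w^T Sigma w.
        roi_var = w @ np.diag(p) @ w
        print(roi_value, np.sqrt(roi_var))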

  8. Studying the effect of clinical uncertainty on physicians' decision-making using ILIAD.

    Anderson, J D; Jay, S J; Weng, H C; Anderson, M M

    1995-01-01

    The influence of uncertainty on physicians' practice behavior is not well understood. In this research, ILIAD, a diagnostic expert system, has been used to study physicians' responses to uncertainty and how their responses affected clinical performance. The simulation mode of ILIAD was used to standardize the presentation and scoring of two cases to 46 residents in emergency medicine, internal medicine, family practice and transitional medicine at Methodist Hospital of Indiana. A questionnaire was used to collect additional data on how physicians respond to clinical uncertainty. A structural equation model was developed, estimated, and tested. The results indicate that stress that physicians experience in dealing with clinical uncertainty has a negative effect on their clinical performance. Moreover, the way that physicians respond to uncertainty has positive and negative effects on their performance. Open discussions with patients about clinical decisions and the use of practice guidelines improves performance. However, when the physician's clinical decisions are influenced by patient demands or their peers, their performance scores decline.

  9. Validation and evaluation of epistemic uncertainty in rainfall thresholds for regional scale landslide forecasting

    Gariano, Stefano Luigi; Brunetti, Maria Teresa; Iovine, Giulio; Melillo, Massimo; Peruccacci, Silvia; Terranova, Oreste Giuseppe; Vennari, Carmela; Guzzetti, Fausto

    2015-04-01

    Prediction of rainfall-induced landslides can rely on empirical rainfall thresholds. These are obtained from the analysis of past rainfall events that have (or have not) resulted in slope failures. Accurate prediction requires reliable thresholds, which need to be validated before their use in operational landslide warning systems. Despite the clear relevance of validation, only a few studies have addressed the problem, and have proposed and tested robust validation procedures. We propose a validation procedure that allows for the definition of optimal thresholds for early warning purposes. The validation is based on contingency tables, skill scores, and receiver operating characteristic (ROC) analysis. To establish the optimal threshold, which maximizes the correct landslide predictions and minimizes the incorrect predictions, we propose an index that results from the linear combination of three weighted skill scores. Selection of the optimal threshold depends on the scope and the operational characteristics of the early warning system. The choice is made by selecting the weights appropriately, and by searching for the optimal (maximum) value of the index. We discuss weaknesses in the validation procedure caused by the inherent lack of information (epistemic uncertainty) on landslide occurrence typical of large study areas. When working at the regional scale, landslides may have occurred and may have not been reported. This results in biases and variations in the contingencies and the skill scores. We introduce two parameters to represent the unknown proportion of rainfall events (above and below the threshold) for which landslides occurred and went unreported. We show that even a very small underestimation in the number of landslides can result in a significant decrease in the performance of a threshold measured by the skill scores. We show that the variations in the skill scores are different for different uncertainty of events above or below the threshold. This
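
    Skill scores of this kind are computed from a two-by-two contingency table. A sketch with invented counts and illustrative (not the authors') weights for the combined index:

        import numpy as np

        # Hypothetical contingency table for one candidate rainfall threshold.
        TP, FN = 18, 7        # landslide events above / below the threshold
        FP, TN = 35, 940      # no-landslide rainfall events above / below it

        POD = TP / (TP + FN)            # probability of detection
        POFD = FP / (FP + TN)           # probability of false detection
        PPV = TP / (TP + FP)            # positive predictive value
        HK = POD - POFD                 # Hanssen-Kuipers skill statistic

        # Linear combination of weighted skill scores, in the spirit of the
        # proposed index; the weights encode the warning system's priorities.
        w1, w2, w3 = 0.5, 0.3, 0.2
        index = w1 * POD + w2 * (1.0 - POFD) + w3 * PPV
        print(f"POD={POD:.2f} POFD={POFD:.3f} HK={HK:.2f} index={index:.2f}")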

  10. Evaluating fishery rehabilitation under uncertainty: A bioeconomic analysis of quota management for the Green Bay yellow perch fishery

    Johnson, B.L.; Milliman, S.R.; Bishop, R.C.; Kitchell, J.F.

    1992-01-01

    The fishery for yellow perch Perca flavescens in Green Bay, Lake Michigan, is currently operating under a rehabilitation plan based on a commercial harvest quota. We developed a bioeconomic computer model that included links between population density and growth, recruitment, and fishing effort for this fishery. Random variability was included in the stock-recruitment relation and in a simulated population assessment. We used the model in an adaptive management framework to evaluate the effects of the rehabilitation plan on both commercial and sport fisheries and to search for ways to improve the plan. Results indicate that the current quota policy is a member of a set of policies that would meet most management goals and increase total value of the fishery. Sensitivity analyses indicate that this conclusion is robust over a wide range of biological conditions. We predict that commercial fishers will lose money relative to the baseline condition, but they may receive other benefits from the elimination of the common-property nature of the fishery. The prospect exists for managing variability in harvest and stock size and for maximizing economic returns in the fishery, but more information is required, primarily on sportfishing effort dynamics and angler preferences. Stock-recruitment relations, density dependence of growth, and dynamics of sportfishing effort are the primary sources of uncertainty limiting the precision of our predictions. The current quota policy is about as good as other policies at reducing this uncertainty and appears, overall, to be one of the best choices for this fishery. The analytical techniques used in this study were primarily simple, heuristic approaches that could be easily transferred to other studies.

  11. Comparison and uncertainty evaluation of different calibration protocols and ionization chambers for low-energy surface brachytherapy dosimetry

    Candela-Juan, C., E-mail: ccanjuan@gmail.com [Radiation Oncology Department, La Fe University and Polytechnic Hospital, Valencia 46026 (Spain); Vijande, J. [Department of Atomic, Molecular, and Nuclear Physics, University of Valencia, Burjassot 46100, Spain and Instituto de Física Corpuscular (UV-CSIC), Paterna 46980 (Spain); García-Martínez, T. [Radiation Oncology Department, Hospital La Ribera, Alzira 46600 (Spain); Niatsetski, Y.; Nauta, G.; Schuurman, J. [Elekta Brachytherapy, Veenendaal 3905 TH (Netherlands); Ouhib, Z. [Radiation Oncology Department, Lynn Regional Cancer Center, Boca Raton Community Hospital, Boca Raton, Florida 33486 (United States); Ballester, F. [Department of Atomic, Molecular, and Nuclear Physics, University of Valencia, Burjassot 46100 (Spain); Perez-Calatayud, J. [Radiation Oncology Department, La Fe University and Polytechnic Hospital, Valencia 46026, Spain and Department of Radiotherapy, Clínica Benidorm, Benidorm 03501 (Spain)

    2015-08-15

    Purpose: A surface electronic brachytherapy (EBT) device is in fact an x-ray source collimated with specific applicators. Low-energy (<100 kVp) x-ray beam dosimetry faces several challenges that need to be addressed. A number of calibration protocols have been published for x-ray beam dosimetry. The media in which measurements are performed are the fundamental difference between them. The aim of this study was to evaluate the surface dose rate of a low-energy x-ray source with small field applicators using different calibration standards and different small-volume ionization chambers, comparing the values and uncertainties of each methodology. Methods: The surface dose rate of the EBT unit Esteya (Elekta Brachytherapy, The Netherlands), a 69.5 kVp x-ray source with applicators of 10, 15, 20, 25, and 30 mm diameter, was evaluated using the AAPM TG-61 (based on air kerma) and International Atomic Energy Agency (IAEA) TRS-398 (based on absorbed dose to water) dosimetry protocols for low-energy photon beams. A plane parallel T34013 ionization chamber (PTW Freiburg, Germany) calibrated in terms of both absorbed dose to water and air kerma was used to compare the two dosimetry protocols. Another PTW chamber of the same model was used to evaluate the reproducibility between these chambers. Measurements were also performed with two different Exradin A20 (Standard Imaging, Inc., Middleton, WI) chambers calibrated in terms of air kerma. Results: Differences between surface dose rates measured in air and in water using the T34013 chamber range from 1.6% to 3.3%. No field size dependence has been observed. Differences are below 3.7% when measurements with the A20 and the T34013 chambers calibrated in air are compared. Estimated uncertainty (with coverage factor k = 1) for the T34013 chamber calibrated in water is 2.2%–2.4%, whereas it increases to 2.5% and 2.7% for the A20 and T34013 chambers calibrated in air, respectively. The output factors, measured with the PTW chambers

  12. An enhanced decision support technique under uncertainty to power system design evaluation

    Eskandar, H.; Asgharpoor, M.J.

    2001-10-01

    Multiple attribute decision making (MADM) methods have been widely used in power system decision problems. This paper presents an enhanced MADM method to help decision makers (DMs) study the influencing factors in the design of power systems. In many MADM problems, however, the information available to the DM is often imprecise due to inaccurate measurements and inconsistent priority judgments. The proposed MADM methodology is based on the analytic hierarchy process (AHP), incorporated into the construction procedure of linear additive utility models to quantify the various divergences of opinions, practices and events that lead to confusion and uncertainties in planning. Such practice can help the DM gain insight into how the imprecise data may affect the choice of the best solution and how a set of acceptable alternatives may be identified with certain confidence. A sample case study in the design of a hybrid solar-wind power system is provided to illustrate the concepts introduced in this paper. Factors in planning the design of a hybrid solar-wind power system relate mainly to political and social conditions, and to technical advances and economics
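
    The AHP step at the core of such a methodology reduces to extracting a priority vector from a pairwise-comparison matrix. A minimal sketch with an invented three-criterion matrix on Saaty's 1-9 scale:

        import numpy as np

        # Hypothetical pairwise comparisons of three design criteria
        # (cost, reliability, environmental impact).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # Priorities from the principal eigenvector.
        vals, vecs = np.linalg.eig(A)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()

        # Consistency check (random index RI = 0.58 for a 3x3 matrix).
        CI = (vals.real[k] - 3.0) / (3.0 - 1.0)
        print("weights:", np.round(w, 3), " CR:", round(CI / 0.58, 3))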

  13. Decision making under uncertainty in viticulture: a case study of Port wine

    Ana Paula Lopes

    2013-06-01

    Full Text Available In decision making under uncertainty, individual decision makers (winegrowers) must choose one of a set of decision alternatives with ample information about their outcomes but, most of the time, without enough knowledge or data about the probabilities of the several states of nature. This paper focuses on the classical Maximax, Maximin, Minimax Regret and Realism criteria. The different approaches are analyzed and compared in a case study of Port wine production and selling. The computational complexity and efficacy of the criteria are also presented. The paper finishes with the results of all the considered criteria and alternatives under uncertainty.
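
    The four classical criteria named above operate on a payoff matrix of alternatives versus states of nature. A compact sketch with invented profits (not the Port wine figures):

        import numpy as np

        # Rows: production alternatives; columns: states of nature (demand levels).
        payoff = np.array([[ 80.0, 120.0, 160.0],
                           [ 40.0, 130.0, 210.0],
                           [-20.0, 110.0, 260.0]])   # hypothetical profits

        maximax = payoff.max(axis=1).argmax()         # optimistic criterion
        maximin = payoff.min(axis=1).argmax()         # pessimistic (Wald) criterion

        regret = payoff.max(axis=0) - payoff          # opportunity-loss table
        minimax_regret = regret.max(axis=1).argmin()  # Savage criterion

        alpha = 0.6                                   # coefficient of realism
        hurwicz = (alpha * payoff.max(axis=1) +
                   (1.0 - alpha) * payoff.min(axis=1)).argmax()  # realism criterion

        print(maximax, maximin, minimax_regret, hurwicz)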

  14. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Baeverstam, U.; Davis, P.; Garcia-Olivares, A.; Henrich, E.; Koch, J.

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  15. Guidelines for uncertainty analysis developed for the participants in the BIOMOVS II study

    Baeverstam, U; Davis, P; Garcia-Olivares, A; Henrich, E; Koch, J

    1993-07-01

    This report has been produced to provide guidelines for uncertainty analysis for use by participants in the BIOMOVS II study. It is hoped that others with an interest in modelling contamination in the biosphere will also find it useful. The report has been prepared by members of the Uncertainty and Validation Working Group and has been reviewed by other BIOMOVS II participants. The opinions expressed are those of the authors and should not be taken to represent the views of the BIOMOVS II sponsors or other BIOMOVS II participating organisations.

  16. Evaluating the capabilities and uncertainties of droplet measurements for the fog droplet spectrometer (FM-100

    J. K. Spiegel

    2012-09-01

    Full Text Available Droplet size spectra measurements are crucial to obtain a quantitative microphysical description of clouds and fog. However, cloud droplet size measurements are subject to various uncertainties. This work focuses on the error analysis of two key measurement uncertainties arising during cloud droplet size measurements with a conventional droplet size spectrometer (FM-100): first, we addressed the precision with which droplets can be sized with the FM-100 on the basis of Mie theory. We deduced error assumptions and proposed a new method to correct measured size distributions for these errors by redistributing the measured droplet size distribution using a stochastic approach. Second, based on a literature study, we summarized corrections for particle losses during sampling with the FM-100. We applied both corrections to cloud droplet size spectra measured at the high alpine site Jungfraujoch for a temperature range from 0 °C to 11 °C. We showed that Mie scattering led to spikes in the droplet size distributions using the default sizing procedure, while the new stochastic approach reproduced the ambient size distribution adequately. A detailed analysis of the FM-100 sampling efficiency revealed that particle losses were typically below 10% for droplet diameters up to 10 μm. For larger droplets, particle losses can increase up to 90% for the largest droplets of 50 μm at ambient wind speeds below 4.4 m s−1 and even to >90% for larger angles between the instrument orientation and the wind vector (sampling angle) at higher wind speeds. Comparisons of the FM-100 to other reference instruments revealed that the total liquid water content (LWC) measured by the FM-100 was more sensitive to particle losses than to re-sizing based on Mie scattering, while the total number concentration was only marginally influenced by particle losses. Consequently, for further LWC measurements with the FM-100 we strongly recommend to consider (1 the

  17. Uncertainty study of the PWR pressure vessel fluence. Adjustment of the nuclear data base

    Kodeli, I.A.

    1994-01-01

    A code system devoted to the calculation of the sensitivity and uncertainty of the neutron flux and reaction rates calculated by transport codes has been developed. Adjustment of the basic data to experimental results can be performed as well. Various sources of uncertainty can be taken into account, such as those due to uncertainties in the cross-sections, response functions, fission spectrum and space distribution of the neutron source, as well as geometry and material composition uncertainties. One- as well as two-dimensional analyses can be performed. Linear perturbation theory is applied. The code system is sufficiently general to be used for various analyses in the fields of fission and fusion. The principal objective of our studies concerns the capsule dosimetry study realized in the framework of the 900 MWe PWR pressure vessel surveillance program. The analysis indicates that the present calculations, performed by the code TRIPOLI-2 using the ENDF/B-IV based, non-perturbed neutron cross-section library in 315 energy groups, allow the neutron flux and the reaction rates to be estimated in the surveillance capsules and in the most exposed pressure vessel location; the adjustment based on calculated and measured reaction rates permits these uncertainties to be reduced. The results obtained with the adjusted iron cross-sections, response functions and fission spectrum show that the agreement between calculation and experiment was improved to within approximately 10%. The neutron flux deduced from the experiment is then extrapolated from the capsule to the most exposed pressure vessel location using the calculated lead factor. The uncertainty in this factor was estimated to be about 7%. (author). 39 refs., 52 figs., 30 tabs

  18. Evaluating Variability and Uncertainty of Geological Strength Index at a Specific Site

    Wang, Yu; Aladejare, Adeyemi Emman

    2016-09-01

    Geological Strength Index (GSI) is an important parameter for estimating rock mass properties. GSI can be estimated from the quantitative GSI chart, as an alternative to the direct observational method, which requires vast geological experience with rock. The GSI chart was developed from past observations and engineering experience, with either empiricism or some theoretical simplification, and it therefore contains model uncertainty arising from its development. The presence of such model uncertainty affects the GSI estimated from the chart at a specific site; it is, therefore, imperative to quantify and incorporate the model uncertainty when estimating GSI from the GSI chart. A major challenge in quantifying the GSI chart model uncertainty is the lack of the original datasets used to develop the chart, since it was developed from past experience without reference to specific datasets. This paper tackles this problem by developing a Bayesian approach for quantifying the model uncertainty in the GSI chart when using it to estimate GSI at a specific site. The model uncertainty in the GSI chart and the inherent spatial variability in GSI are modeled explicitly in the Bayesian approach. The Bayesian approach generates equivalent samples of GSI from the integrated knowledge of the GSI chart, prior knowledge and observation data available from site investigation. Equations are derived for the Bayesian approach, and the proposed approach is illustrated using data from a drill-and-blast tunnel project. The proposed approach effectively tackles the problem of how to quantify, in a transparent manner, the model uncertainty that arises from using the GSI chart for characterization of site-specific GSI.
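
    The Bayesian machinery described above can be sketched in a few lines. The sketch below is a deliberately simplified, conjugate normal-normal version with invented numbers (chart-based GSI readings, standard deviations for chart model error and spatial variability, and the prior); the paper's actual formulation is richer, but the flow, prior plus chart observations in, equivalent GSI samples out, is the same.

```python
import numpy as np

rng = np.random.default_rng(8)

# Invented chart-based GSI observations at a site
obs = np.array([55.0, 60.0, 52.0, 58.0, 61.0])

# Assumed standard deviations: chart model uncertainty and inherent
# spatial variability (both treated as zero-mean normal here)
sigma_model, sigma_spatial = 5.0, 4.0
sigma_tot2 = sigma_model**2 + sigma_spatial**2

# Conjugate normal prior on the site mean GSI (assumed prior knowledge)
mu0, sigma0 = 50.0, 10.0

# Closed-form normal posterior for the site mean GSI
prec = 1.0 / sigma0**2 + len(obs) / sigma_tot2
mu_post = (mu0 / sigma0**2 + obs.sum() / sigma_tot2) / prec
sd_post = prec**-0.5

# "Equivalent samples" of site-specific GSI: posterior-mean uncertainty
# recombined with the inherent spatial variability
gsi = rng.normal(mu_post, sd_post, 10_000) + rng.normal(0.0, sigma_spatial, 10_000)
print(mu_post, sd_post, np.percentile(gsi, [5, 50, 95]))
```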

  19. Sensitivity and uncertainty evaluation applied to the failure process of nuclear fuel

    Gomes, Daniel S.

    2017-01-01

    Nuclear power plants must operate with minimal risk. The licensing process for nuclear power plants is based on a paired model, combining probabilistic and deterministic approaches to improve fuel rod performance during both steady-state and transient events. In this study, fuel performance codes were used to simulate the test rod IFA-650-4, with a burnup of 92 GWd/MTU, in the Halden reactor. In a loss-of-coolant test, the cladding failed within 336 s after reaching a temperature of 800 °C. Nuclear systems work with many imprecise values that must be quantified and propagated. The uncertainty sources were separated into physical models and boundary conditions describing fuel thermal conductivity, fission gas release, and creep rates; these factors change the output responses. Manufacturing tolerances introduce dimensional variations in the fuel rods, and the boundary conditions of the system are characterized by small ranges that can propagate through the system. To identify the input parameters that produce output effects, we used Pearson coefficients between inputs and outputs. The input uncertainties were represented using a stochastic technique that can define the effect of input parameters on the establishment of realistic safety limits. Random sampling provided the set of runs for the independent variables, with the sample size given by Wilks' formulation: the number of samples required to bound the 95th percentile with 95% confidence depends on the confidence statement to be verified for each output. The FRAPTRAN code utilized a module to reproduce the plastic response, defining the failure limit of the fuel rod. (author)
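
    The Wilks sample sizes invoked above are easy to reproduce. A minimal sketch: for a one-sided 95%/95% statement, the first-order formula requires 59 code runs (the sample maximum then bounds the 95th percentile with at least 95% confidence), and the second-order formula gives 93.

```python
import math

def wilks_first_order(coverage=0.95, confidence=0.95):
    # Smallest n with P(sample max >= coverage quantile) = 1 - coverage**n >= confidence
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

def wilks_second_order(coverage=0.95, confidence=0.95):
    # Smallest n such that the second-largest of n samples still bounds the quantile
    n = 2
    while 1.0 - coverage**n - n * (1.0 - coverage) * coverage**(n - 1) < confidence:
        n += 1
    return n

print(wilks_first_order())   # 59 runs for a one-sided 95/95 statement
print(wilks_second_order())  # 93 runs when the second-highest value is used
```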

  1. Reader reaction on the generalized Kruskal-Wallis test for genetic association studies incorporating group uncertainty.

    Wu, Baolin; Guan, Weihua

    2015-06-01

    Acar and Sun (2013, Biometrics 69, 427-435) presented a generalized Kruskal-Wallis (GKW) test for genetic association studies that incorporated the genotype uncertainty and showed its robust and competitive performance compared to existing methods. We present another interesting way to derive the GKW test via a rank linear model. © 2014, The International Biometric Society.
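
    The rank-linear-model view of the Kruskal-Wallis statistic can be checked numerically in the classical, no-uncertainty case: with no ties, the KW statistic equals (N - 1)·R², where R² comes from regressing the pooled ranks on group indicators. The sketch below illustrates only this classical equivalence, not Acar and Sun's genotype-uncertainty extension.

```python
import numpy as np
from scipy.stats import kruskal, rankdata

rng = np.random.default_rng(0)
groups = [rng.normal(loc=m, size=n) for m, n in [(0.0, 30), (0.3, 25), (0.6, 35)]]

# Classical Kruskal-Wallis statistic from scipy
H, _ = kruskal(*groups)

# The same statistic via a linear model on the ranks: H = (N - 1) * R^2
y = rankdata(np.concatenate(groups))                  # pooled ranks
labels = np.repeat(np.arange(len(groups)), [len(g) for g in groups])
X = np.column_stack([(labels == k).astype(float) for k in range(len(groups))])
yhat = X @ np.linalg.lstsq(X, y, rcond=None)[0]       # fitted group-mean ranks
r2 = 1.0 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
print(H, (len(y) - 1) * r2)                           # equal (absent ties)
```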

  3. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications, such as the development of new fabrics and construction materials, the semiconductor industry, packaging, and vacuum techniques. The demand for this kind of measurement is growing considerably, and many different methods for measuring the WVPR have been developed and standardized within numerous national and international standards. However, comparison of existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for the development of a corresponding reference measurement standard. The paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. It also presents a physical model that tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC Guide to the Expression of Uncertainty in Measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0 % for a WVPR of 6.71 mg·h⁻¹ (corresponding to a permeance of 30.4 mg·m⁻²·day⁻¹·hPa⁻¹).
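
    The GUM combination behind such a statement is a quadrature sum of standard uncertainties, expanded with k = 2. The component budget below is hypothetical (the paper's actual budget is in the original); the values are merely chosen to reproduce the quoted 3.0 % total.

```python
import math

# Hypothetical relative standard uncertainties (%) of the WVPR measurement;
# the real budget is in the paper -- these values merely reproduce its total.
components = {
    "dew-point temperature": 1.0,
    "carrier-gas flow rate": 0.8,
    "specimen temperature": 0.6,
    "pressure": 0.5,
}

u_c = math.sqrt(sum(u**2 for u in components.values()))  # combined standard, %
U = 2.0 * u_c                                            # expanded, k = 2, %
wvpr = 6.71                                              # mg/h, from the paper
print(f"U = {U:.1f} %  ->  {wvpr} ± {wvpr * U / 100:.2f} mg/h")
```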

  4. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled "Emergency Core Cooling System; Revisions to Acceptance Criteria." The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  5. Uncertainty analysis of daily potable water demand on the performance evaluation of rainwater harvesting systems in residential buildings.

    Silva, Arthur Santos; Ghisi, Enedir

    2016-09-15

    The objective of this paper is to perform a sensitivity analysis of design variables and an uncertainty analysis of daily potable water demand to evaluate the performance of rainwater harvesting systems in residential buildings. Eight cities in Brazil with different rainfall patterns were analysed. A numerical experiment was performed by means of computer simulation of rainwater harvesting. A sensitivity analysis using variance-based indices was performed to identify the most important design parameters for rainwater harvesting systems when assessing the potential for potable water savings and sizing the underground tank capacity. The uncertainty analysis was performed for different scenarios of potable water demand, with stochastic variations following a normal distribution with different coefficients of variation throughout the simulated period. The results show that several design variables, such as potable water demand, number of occupants, rainwater demand, and roof area, are important for obtaining the ideal underground tank capacity and estimating the potential for potable water savings. The stochastic variations in potable water demand caused amplitudes of up to 4.8% in the potential for potable water savings and 9.4% in the ideal underground tank capacity. Average amplitudes were quite low for all cities. However, some combinations of parameters resulted in large uncertainty amplitudes and differences from the uniform distribution for tank capacities and potential for potable water savings. Stochastic potable water demand generated low uncertainties in the performance evaluation of rainwater harvesting systems; therefore, a uniform distribution could be used in computer simulation. Copyright © 2016 Elsevier Ltd. All rights reserved.
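
    The daily simulation underlying such results is typically a behavioural tank balance. The sketch below is a generic yield-after-spillage scheme with invented sizes and a toy rainfall series, not the specific algorithm of the paper; the demand line shows the kind of normally distributed variation (here a 10 % coefficient of variation) that the uncertainty analysis perturbs.

```python
import numpy as np

def potable_savings(rain_mm, demand_l, roof_m2=100.0, tank_l=3000.0, c=0.8):
    """Fraction of potable demand met by rainwater (yield-after-spillage)."""
    storage, supplied = 0.0, 0.0
    for r, d in zip(rain_mm, demand_l):
        storage = min(storage + r * roof_m2 * c, tank_l)  # harvest, cap at tank
        use = min(storage, d)                             # meet today's demand
        storage -= use
        supplied += use
    return supplied / np.sum(demand_l)

rng = np.random.default_rng(9)
rain = rng.gamma(0.3, 12.0, 365)                 # toy daily rainfall, mm
demand = rng.normal(400.0, 40.0, 365).clip(0.0)  # stochastic demand, CV = 10 %
print(potable_savings(rain, demand))
```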

  6. A novel method for the evaluation of uncertainty in dose-volume histogram computation.

    Henríquez, Francisco Cutanda; Castrillón, Silvia Vargas

    2008-03-15

    Dose-volume histograms (DVHs) are a useful tool in state-of-the-art radiotherapy treatment planning, and it is essential to recognize their limitations. Even after a specific dose-calculation model is optimized, dose distributions computed by treatment-planning systems are affected by several sources of uncertainty, such as algorithm limitations, measurement uncertainty in the data used to model the beam, and residual differences between measured and computed dose. This report presents a novel method to take these into account. To account for the effect of the associated uncertainties, a probabilistic approach using a new kind of histogram, the dose-expected-volume histogram, is introduced. The expected value of the volume in the region of interest receiving an absorbed dose equal to or greater than a certain value is found by using the probability distribution of the dose at each point. A rectangular probability distribution is assumed for this point dose, and a formulation that accounts for the uncertainties associated with the point dose is presented for practical computations. The method is applied to a set of DVHs for different regions of interest, including 6 brain, 8 lung, 8 pelvis, and 6 prostate patients planned for intensity-modulated radiation therapy. Results show a greater effect on planning target volume coverage than on organs at risk. In cases of steep DVH gradients, such as planning target volumes, the new method shows the largest differences from the corresponding DVH; thus, the effect of the uncertainty is larger.
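
    The rectangular-distribution formulation lends itself to a compact implementation: for each point, the probability that its dose exceeds a threshold under a uniform distribution of half-width delta is a clipped linear ramp, and the expected volume is the probability-weighted sum over points. The sketch below uses invented doses and an assumed delta.

```python
import numpy as np

def devh(point_doses, thresholds, delta, voxel_volume=1.0):
    """Expected volume receiving at least each threshold dose, assuming each
    point dose is uniform on [d - delta, d + delta] (rectangular model)."""
    d = np.asarray(point_doses)[:, None]
    t = np.asarray(thresholds)[None, :]
    p_ge = np.clip((d + delta - t) / (2.0 * delta), 0.0, 1.0)  # P(dose >= t)
    return voxel_volume * p_ge.sum(axis=0)

doses = np.random.default_rng(1).normal(60.0, 2.0, 10_000)  # toy point doses, Gy
thresholds = np.linspace(50.0, 70.0, 81)
expected_volume = devh(doses, thresholds, delta=1.5)  # delta: assumed half-width
print(expected_volume[::20])
```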

  7. A new evaluation of the uncertainty associated with CDIAC estimates of fossil fuel carbon dioxide emission

    Robert J. Andres

    2014-07-01

    Three uncertainty assessments associated with the global total of carbon dioxide emitted from fossil fuel use and cement production are presented. Each assessment has its own strengths and weaknesses and none gives a full uncertainty assessment of the emission estimates. This approach grew out of the lack of independent measurements at the spatial and temporal scales of interest. Issues of dependent and independent data are considered, as well as the temporal and spatial relationships of the data. The result is a multifaceted examination of the uncertainty associated with fossil fuel carbon dioxide emission estimates. The three assessments collectively give a range that spans from 1.0 to 13% (2σ). Greatly simplifying, the assessments give a global fossil fuel carbon dioxide uncertainty value of 8.4% (2σ). In the largest context presented, the determination of fossil fuel emission uncertainty is important for a better understanding of the global carbon cycle and its implications for the physical, economic and political world.

  8. Developing a methodology for the evaluation of uncertainties in the results of CFD codes

    Munoz-Cobo, J. L.; Chiva, S.; Pena, C.; Vela, E.

    2014-07-01

    In this work, a methodology is developed to evaluate the uncertainty in the results of CFD codes that is compatible with the V&V 20 standard (Standard for Verification and Validation in Computational Fluid Dynamics and Heat Transfer) developed by the American Society of Mechanical Engineers (ASME). The existing alternatives for obtaining the uncertainty in the results are also compared, to see which is the best choice from the point of view of implementation and computing time. We have developed two methods for calculating the uncertainty of the results of a CFD code. The first method is based on the use of Monte Carlo techniques for the propagation of uncertainty; for this method we think it is preferable to use order statistics to determine the number of cases to execute, because in this way we can always determine the confidence interval at the desired level for the output quantities. The second type of method we have developed is based on non-intrusive polynomial chaos. (Author)

  9. Uncertainty, the Overbearing Lived Experience of the Elderly People Undergoing Hemodialysis: A Qualitative Study.

    Sahaf, Robab; Sadat Ilali, Ehteram; Peyrovi, Hamid; Akbari Kamrani, Ahmad Ali; Spahbodi, Fatemeh

    2017-01-01

    Chronic kidney disease is a major health concern. The number of elderly people with chronic renal failure has increased across the world. Dialysis is an appropriate therapy for the elderly, but it involves certain challenges. The present paper reports uncertainty as part of the elderly's experience of living with hemodialysis. This qualitative study applied Max van Manen's interpretative phenomenological approach to explain and explore the experiences of the elderly undergoing hemodialysis. Given the study inclusion criteria, data were collected using in-depth unstructured interviews with nine elderly people undergoing hemodialysis and then analyzed according to van Manen's six-stage methodological approach. One of the most important findings emerging from the study was "uncertainty", which is important and noteworthy given the other aspects of elderly life (loneliness, despair, comorbidity of diseases, disability, and mental and psychosocial problems). Uncertainty about the future is among the most important psychological concerns of people undergoing hemodialysis. The results indicate the importance of paying attention to a major aspect of the life of the elderly undergoing hemodialysis: uncertainty. A positive outlook can be created in the elderly through education and increased knowledge about the disease, its treatment and its complications.

  10. Impact of correlations between core configurations for the evaluation of nuclear data uncertainty propagation for reactivity

    Frosio, T.; Bonaccorsi, T.; Blaise, P.

    2017-01-01

    The precise estimation of Pearson correlations, also called 'representativity' coefficients, between core configurations is fundamental for properly assessing the propagation of nuclear data (ND) uncertainties to integral parameters such as k-eff, power distributions, or reactivity coefficients. In this paper, a traditional adjoint method is used to propagate ND uncertainty to reactivity and reactivity coefficients and to estimate correlations between different states of the core. We show that neglecting those correlations induces a loss of information in the final uncertainty. We also show that using approximate values of the Pearson coefficients does not lead to an important error in the model. This calculation is made for reactivity at the beginning of life and can be extended to other parameters during depletion calculations. (authors)
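
    The quantities involved follow the standard first-order "sandwich" formulas: the propagated ND variance of an integral parameter with sensitivity vector S and ND covariance matrix M is SᵀMS, and the representativity between two configurations is the normalized cross term S₁ᵀMS₂ / √((S₁ᵀMS₁)(S₂ᵀMS₂)). A toy numerical sketch with invented matrices:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50

# Toy nuclear-data covariance matrix M and reactivity sensitivity vectors
# S1, S2 of two core configurations -- every value here is invented.
A = rng.normal(size=(n, n))
M = A @ A.T / n                            # symmetric positive semi-definite
S1 = rng.normal(size=n)
S2 = 0.8 * S1 + 0.2 * rng.normal(size=n)   # a partly similar configuration

var1 = S1 @ M @ S1                         # sandwich rule: propagated variance
var2 = S2 @ M @ S2
r12 = (S1 @ M @ S2) / np.sqrt(var1 * var2) # representativity coefficient
print(np.sqrt(var1), np.sqrt(var2), r12)
```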

  11. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial-condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  12. Determination of the uncertainties in key neutron parameters of plate-fuel pressurized water reactors and applications to conformity studies

    Bernard, D

    2002-07-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. The uncertainty sources have many origins: technological origins for the fabrication parameters and physical origins for the nuclear data. First, each contribution to the uncertainty is calculated; finally, an uncertainty factor is associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by generalized perturbation theory at step 0 and by direct calculations for irradiation problems. One of the neutronic conformity applications concerned the adjustment of fabrication and nuclear data target precisions. Statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab neutron parameters were thereby reduced, and the nuclear performances optimised. (author)

  14. Use of the Monte Carlo uncertainty combination method in nuclear reactor setpoint evaluation

    Berte, Frank J.

    2004-01-01

    This paper provides an overview of an alternate method for performing instrument uncertainty calculations and instrument setpoint determination when a setpoint analysis requires techniques beyond those provided by the widely used root-sum-squares (RSS) approach. The paper addresses when the application of the Monte Carlo (MC) method should be considered, the application of the MC method when independent and/or dependent uncertainties are involved, and finally the interpretation of the results obtained. Both single-module and instrument-string sample applications are presented. (author)
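
    The trade-off the paper discusses can be seen in a toy instrument string: RSS treats every term as independent and roughly normal, while Monte Carlo handles dependent and non-normal terms directly. All uncertainty terms below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Invented instrument-string uncertainty terms (% of span)
ref_accuracy = rng.normal(0.0, 0.5, N)                 # normal term
drift = rng.uniform(-1.0, 1.0, N)                      # uniform, not normal
temp_effect = 0.5 * drift + rng.normal(0.0, 0.3, N)    # dependent on drift

total = ref_accuracy + drift + temp_effect
mc_95 = np.percentile(total, 95)                       # Monte Carlo result

# RSS at 1.645 sigma, wrongly treating all terms as independent and normal
rss_95 = 1.645 * np.sqrt(0.5**2 + (2.0 / 12**0.5)**2 + 0.3**2)
print(mc_95, rss_95)   # the two diverge once dependence enters
```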

  15. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.

    2012-12-01

    Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed

  16. Evaluation and Uncertainty of a New Method to Detect Suspected Nuclear and WMD Activity: Project Report

    Kurzeja, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Werth, D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Buckley, R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-29

    The Atmospheric Technology Group at SRNL developed a new method to detect signals from Weapons of Mass Destruction (WMD) activities in a time series of chemical measurements at a downwind location. The method was tested with radioxenon measured in Russia and Japan after the 2013 underground test in North Korea. This LDRD calculated the uncertainty in the method with the measured data and also for a case with the signal reduced to 1/10 of its measured value. The research showed that the uncertainty in the calculated probability of origin from the NK test site was small enough to confirm the test. The method was also well behaved for small signal strengths.

  17. Accelerated uncertainty propagation in two-level probabilistic studies under monotony

    Limbourg, Philipp; Rocquigny, Etienne de; Andrianov, Guennadi

    2010-01-01

    Double-level probabilistic uncertainty models that separate aleatory and epistemic components enjoy significant interest in risk assessment. But the expensive computational costs associated with calculations of rare failure probabilities are still a large obstacle in practice. Accurately computing a risk lower than 10⁻³ with 95% epistemic confidence usually requires 10⁷-10⁸ runs in a brute-force double Monte Carlo. For single-level probabilistic studies, FORM (First-Order Reliability Method) is a classical recipe allowing fast approximation of failure probabilities, while MRM (Monotonous Reliability Method) recently proved an attractive robust alternative under monotony. This paper extends these methods to double-level probabilistic models through two novel algorithms designed to compute a set of failure probabilities or an aleatory risk level with an epistemic confidence quantile. The first, L2-FORM (level-2 FORM), allows a rapid approximation of the failure probabilities through a combination of FORM with new ideas to exploit similarity between computations. L2-MRM (level-2 MRM), a quadrature approach, provides 100%-guaranteed error bounds on the results. Experiments on three flood prediction problems showed that both algorithms approximate a set of 500 failure probabilities of 10⁻³-10⁻² or derived 95% epistemic quantiles with a total of only 500-1000 function evaluations, outperforming importance sampling, iterative FORM and regression-spline metamodels.
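
    For orientation, this is the brute-force two-level Monte Carlo that L2-FORM and L2-MRM are designed to shortcut: an outer epistemic loop, an inner aleatory loop estimating a failure probability, and an epistemic quantile at the end. The limit state below is a toy chosen so the failure probabilities land in the 10⁻³-10⁻² range the paper targets.

```python
import numpy as np

rng = np.random.default_rng(4)
N_EPI, N_ALE = 200, 20_000      # brute force: 4e6 runs for one toy problem

p_fail = np.empty(N_EPI)
for i in range(N_EPI):
    mu = rng.normal(5.0, 0.3)                 # epistemic: uncertain capacity
    load = rng.gumbel(3.0, 0.4, N_ALE)        # aleatory: random load
    p_fail[i] = np.mean(load > mu)            # inner failure probability

print(np.percentile(p_fail, 95))  # aleatory risk at 95% epistemic confidence
```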

  18. A Study on the Uncertainty of Flow-Induced Vibration in a Cross Flow over Staggered Tubes

    Kim, Ji-Su; Park, Jong-Woon [Dongguk univ, Gyeong Ju (Korea, Republic of); Choi, Hyeon-Kyeong [HanNam University, Daejeon (Korea, Republic of)

    2015-05-15

    Cross-flow through the many support columns of the very high temperature reactor (VHTR) lower plenum may raise FIV issues under the high-speed flow jetting from the core. For a group of circular cylinders subjected to a cross-flow, three types of potential vibration mechanisms may exist: (1) vortex-induced vibration (VIV), (2) fluid-elastic vibration (FEV) and (3) turbulence-induced vibration (TIV). Kevalahan studied the free vibration of circular cylinders in a tightly packed periodic square inline array of cylinders. Pandey et al. studied the flue gas flow distribution in the Low Temperature Super Heater (LTSH) tube bundles situated in the second pass of a utility boiler and the phenomenon of flow-induced vibration. Nakamura et al. studied the flow instability of cylinder arrays resembling U-bend tubes in steam generators. The FIV evaluation is usually performed with computational fluid dynamics (CFD) analysis to obtain the unknown oscillation frequencies of the multiple objects under turbulent flow, and thus the uncertainty residing in the turbulence model used should be quantified. In this paper, the potential FIV uncertainty arising from turbulence phenomena is evaluated for a typical cross-flow through staggered tube bundles resembling the VHTR lower plenum support columns. Flow-induced vibration (FIV) is one of the important mechanical and fatigue issues in nuclear systems; in particular, cross-flow in the many support structures of the VHTR lower plenum would have FIV issues under the highly turbulent jet flows from the core. The results show that the effect of turbulence parameters on FIV is not negligible and the associated uncertainty is 5 to 10%. The present method can be applied to future FIV evaluations of nuclear systems. More extensive studies of flow-induced vibration at plant scale using more rigorous computational methods are under way.

  19. Artificial neural network surrogate development of equivalence models for nuclear data uncertainty propagation in scenario studies

    Krivtchik Guillaume

    2017-01-01

    Scenario studies simulate the whole fuel cycle over a period of time, from the extraction of natural resources to geological storage. Through the comparison of different reactor fleet evolutions and fuel management options, they constitute a decision-making support. Consequently, uncertainty propagation studies, which are necessary to assess the robustness of the studies, are strategic. Among the numerous types of physical models in scenario computation that generate uncertainty, the equivalence models, built for calculating fresh fuel enrichment (for instance, the plutonium content in PWR MOX) so as to be representative of nominal fuel behavior, are very important. The equivalence condition is generally formulated in terms of end-of-cycle mean core reactivity. As this results from a physical computation, it is associated with an uncertainty. A state of the art of equivalence models is exposed and discussed. It is shown that the existing equivalence models implemented in scenario codes, such as COSI6, are not suited to uncertainty propagation computation, for the following reasons: (i) existing analytical models neglect irradiation, which has a strong impact on the result and its uncertainty; (ii) current black-box models are not suited to cross-section perturbation management; and (iii) models based on transport and depletion codes are too time-consuming for stochastic uncertainty propagation. A new type of equivalence model based on Artificial Neural Networks (ANN) has been developed, constructed with data calculated with neutron transport and depletion codes. The model inputs are the fresh fuel isotopy, the irradiation parameters (burnup, core fractionation, etc.), cross-section perturbations and the equivalence criterion (for instance, the core target reactivity in pcm at the end of the irradiation cycle). The model output is the fresh fuel content such that the target reactivity is reached at the end of the irradiation cycle. Those models are built and
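
    A minimal stand-in for such a surrogate, using scikit-learn rather than the authors' tooling: a small feed-forward network is fitted to a synthetic relation between (burnup, target reactivity, a cross-section perturbation) and the fresh plutonium content. The relation and every number here are invented; in the real workflow the training set comes from transport/depletion calculations.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# Synthetic training set: inputs are (burnup GWd/t, target end-of-cycle
# reactivity pcm, a cross-section perturbation); output is the Pu content (%)
# reaching the target. The linear relation below is entirely invented.
X = rng.uniform([30.0, -500.0, -0.05], [60.0, 500.0, 0.05], size=(2000, 3))
y = 4.0 + 0.05 * X[:, 0] + 1e-3 * X[:, 1] + 8.0 * X[:, 2]
y += rng.normal(0.0, 0.02, len(y))           # residual noise of the code

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0),
).fit(X, y)
print(surrogate.predict([[45.0, 0.0, 0.0]]))  # ~6.25 % Pu in this toy relation
```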

  20. APROBA-Plus: A probabilistic tool to evaluate and express uncertainty in hazard characterization and exposure assessment of substances.

    Bokkers, Bas G H; Mengelers, Marcel J; Bakker, Martine I; Chiu, Weihsueh A; Slob, Wout

    2017-12-01

    To facilitate the application of probabilistic risk assessment, the WHO released the APROBA tool. This tool applies lognormal uncertainty distributions to the different aspects of the hazard characterization, resulting in a probabilistic health-based guidance value. The current paper describes an extension, APROBA-Plus, which combines the output from the probabilistic hazard characterization with the probabilistic exposure to rapidly characterize risk and its uncertainty. The uncertainty in exposure is graphically compared with the uncertainty in the target human dose, i.e. the dose that complies with the specified protection goals. APROBA-Plus is applied to several case studies, resulting in distinct outcomes and illustrating that APROBA-Plus could serve as a standard extension of routine risk assessments. By visualizing the uncertainties, APROBA-Plus provides a more transparent and informative outcome than the more usual deterministic approaches, so that risk managers can make better informed decisions. For example, APROBA-Plus can help in deciding whether risk-reducing measures are warranted or that a refined risk assessment would first be needed. If the latter, the tool can be used to prioritize possible refinements. APROBA-Plus may also be used to rank substances into different risk categories, based on potential health risks without being compromised by different levels of conservatism that may be associated with point estimates of risk. Copyright © 2017 Elsevier Ltd. All rights reserved.
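
    The core comparison the tool makes can be sketched as two lognormal distributions and an exceedance probability. The medians and 95th/50th ratios below are hypothetical, and this is not the WHO tool itself, just the shape of the computation it performs.

```python
import numpy as np

rng = np.random.default_rng(6)
N = 1_000_000

# Hypothetical lognormal characterizations: (median, P95/P50 ratio)
sigma = lambda ratio: np.log(ratio) / 1.645           # lognormal sigma
exposure = rng.lognormal(np.log(0.8), sigma(3.0), N)  # mg/kg-day
target = rng.lognormal(np.log(2.0), sigma(4.0), N)    # target human dose

print(np.mean(exposure > target))  # probability exposure exceeds the target
```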

  1. Accounting for age uncertainty in growth modeling, the case study of yellowfin tuna (Thunnus albacares) of the Indian Ocean.

    Emmanuelle Dortel

    Age estimates, typically determined by counting periodic growth increments in calcified structures of vertebrates, are the basis of the population dynamics models used for managing exploited or threatened species. In fisheries research, the use of otolith growth rings as an indicator of fish age has increased considerably in recent decades. However, otolith readings include various sources of uncertainty. Current ageing methods, which convert an average count of rings into age, only provide periodic age estimates in which the range of uncertainty is fully ignored. In this study, we describe a hierarchical model for estimating individual ages from repeated otolith readings. The model was developed within a Bayesian framework to explicitly represent the sources of uncertainty associated with age estimation, to allow for individual variations and to include knowledge on parameters from expertise. The performance of the proposed model was examined through simulations, and then it was coupled to a two-stanza somatic growth model to evaluate the impact of the age estimation method on the age composition of commercial fisheries catches. We illustrate our approach using the sagittal otoliths of yellowfin tuna of the Indian Ocean collected through large-scale mark-recapture experiments. The simulation performance suggested that the ageing error model was able to estimate the ageing biases and provide accurate age estimates, regardless of the age of the fish. Coupled with the growth model, this approach appeared suitable for modeling the growth of Indian Ocean yellowfin and is consistent with the findings of previous studies. The simulations showed that the choice of ageing method can strongly affect growth estimates, with subsequent implications for the age-structured data used as inputs for population models. Finally, our modeling approach proved particularly useful to reflect uncertainty around age estimates in the process of growth estimation and it can

  2. Needs of the CSAU uncertainty method

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best-estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods have been proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address those needs. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach applied to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  3. Evaluation of Fatigue Crack Propagation of Gears Considering Uncertainties in Loading and Material Properties

    Haileyesus B. Endeshaw

    2017-11-01

    Failure prediction of wind turbine gearboxes (WTGs) is especially important since the maintenance of these components is not only costly but also causes the longest downtime. One of the most common causes of premature failure of WTGs is attributed to the fatigue fracture of gear teeth due to the fluctuating and cyclic torque, resulting from stochastic wind loading, transmitted to the gearbox. Moreover, the fluctuation of the torque, as well as the inherent uncertainties of the material properties, results in uncertain life predictions for WTGs. It is therefore essential to quantify these uncertainties in the life estimation of gears. In this paper, a framework, constituted by a dynamic model of a one-stage gearbox, a finite element method, and a degradation model for the estimation of fatigue crack propagation in the gear, is presented. Torque time-history data of a wind turbine rotor were scaled and used to simulate the stochastic characteristics of the loading, and uncertainties in the material constants of the degradation model were also quantified. It was demonstrated that uncertainty quantification of the load and material constants provides a reasonable estimation of the distribution of the crack length in the gear tooth at any time step.
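
    A common way to realize such a degradation model is a Paris-type crack-growth law with uncertain constants, driven by a stochastic stress range. The sketch below is that generic scheme with invented values, not the authors' calibrated model.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1000                              # Monte Carlo samples

# Invented Paris-law uncertainty: da/dN = C * dK**m, with dK in MPa*sqrt(m),
# a in m, and C in m/cycle per (MPa*sqrt(m))**m
C = rng.lognormal(np.log(5e-12), 0.2, N)
m = rng.normal(3.0, 0.1, N)
a = np.full(N, 1e-3)                  # initial crack length, m
Y = 1.12                              # geometry factor, assumed constant

for _ in range(1000):                 # 1000 blocks of 100 load cycles
    dS = rng.normal(120.0, 15.0, N)   # stochastic stress range, MPa
    dK = Y * dS * np.sqrt(np.pi * a)  # stress intensity factor range
    a += 100.0 * C * dK**m            # Paris increment over the block

print(np.percentile(a, [5, 50, 95]))  # crack-length distribution, m
```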

  4. Evaluation and Quantification of Uncertainty in the Modeling of Contaminant Transport and Exposure Assessment at a Radioactive Waste Disposal Site

    Tauxe, J.; Black, P.; Carilli, J.; Catlett, K.; Crowe, B.; Hooten, M.; Rawlinson, S.; Schuh, A.; Stockton, T.; Yucel, V.

    2002-12-01

    The disposal of low-level radioactive waste (LLW) in the United States (U.S.) is a highly regulated undertaking. The U.S. Department of Energy (DOE), itself a large generator of such wastes, requires a substantial amount of analysis and assessment before permitting disposal of LLW at its facilities. One of the requirements that must be met in assessing the performance of a disposal site and technology is that a Performance Assessment (PA) demonstrate "reasonable expectation" that certain performance objectives, such as the dose to a hypothetical future receptor, not be exceeded. The phrase "reasonable expectation" implies recognition of uncertainty in the assessment process. In order for this uncertainty to be quantified and communicated to decision makers, the PA computer model must accept probabilistic (uncertain) input (parameter values) and produce results which reflect that uncertainty as it is propagated through the model calculations. The GoldSim modeling software was selected for the task due to its unique facility with both probabilistic analysis and radioactive contaminant transport. Probabilistic model parameters range from the water content and other physical properties of the alluvium, to the activity of the radionuclides disposed of, to the amount of time a future resident might be expected to spend tending a garden. Although these parameters govern processes which are defined in isolation as rather simple differential equations, the complex interaction of coupled processes makes for a highly nonlinear system with often unanticipated results. The decision maker has the difficult job of evaluating the uncertainty of modeling results in the context of granting permission for LLW disposal. This job also involves the evaluation of alternatives, such as the selection of disposal technologies. Various scenarios can be evaluated in the model, so that the effects of, for example, using a thicker soil cap over the waste cell can be assessed. This ability to evaluate mitigation

  5. Deep Uncertainty Surrounding Coastal Flood Risk Projections: A Case Study for New Orleans

    Wong, Tony E.; Keller, Klaus

    2017-10-01

    Future sea-level rise drives severe risks for many coastal communities. Strategies to manage these risks hinge on a sound characterization of the uncertainties. For example, recent studies suggest that large fractions of the Antarctic ice sheet (AIS) may rapidly disintegrate in response to rising global temperatures, leading to potentially several meters of sea-level rise during the next few centuries. It is deeply uncertain, for example, whether such an AIS disintegration will be triggered, how much this would increase sea-level rise, whether extreme storm surges intensify in a warming climate, or which emissions pathway future societies will choose. Here, we assess the impacts of these deep uncertainties on projected flooding probabilities for a levee ring in New Orleans, LA. We use 18 scenarios, presenting probabilistic projections within each one, to sample key deeply uncertain future projections of sea-level rise, radiative forcing pathways, storm surge characterization, and contributions from rapid AIS mass loss. The implications of these deep uncertainties for projected flood risk are thus characterized by a set of 18 probability distribution functions. We use a global sensitivity analysis to assess which mechanisms contribute to uncertainty in projected flood risk over the course of a 50-year design life. In line with previous work, we find that the uncertain storm surge drives the most substantial risk, followed by general AIS dynamics, in our simple model for future flood risk for New Orleans.

  6. Model uncertainty of various settlement estimation methods in shallow tunnels excavation; case study: Qom subway tunnel

    Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb

    2017-10-01

    In addition to the numerous planning and execution challenges, underground excavation in urban areas is always accompanied by certain destructive effects, especially on the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values of the three models were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched reality, and the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.

  7. Evaluating Uncertainties in Sap Flux Scaled Estimates of Forest Transpiration, Canopy Conductance and Photosynthesis

    Ward, E. J.; Bell, D. M.; Clark, J. S.; Kim, H.; Oren, R.

    2009-12-01

    Thermal dissipation probes (TDPs) are a common method for estimating forest transpiration and canopy conductance from sap flux rates in trees, but their implementation is plagued by uncertainties arising from missing data and from variability in the diameter and canopy position of trees, as well as in sapwood conductivity within individual trees. Uncertainties in estimates of canopy conductance also translate into uncertainties in carbon assimilation in models such as the Canopy Conductance Constrained Carbon Assimilation (4CA) model, which combine physiological and environmental data to estimate photosynthetic rates. We developed a method to propagate these uncertainties in the scaling and imputation of TDP data to estimates of canopy transpiration and conductance, using a state-space Jarvis-type conductance model in a hierarchical Bayesian framework. This presentation will focus on the impact of these uncertainties on estimates of water and carbon fluxes using 4CA and data from the Duke Free Air Carbon Enrichment (FACE) project, which incorporates both elevated carbon dioxide and soil nitrogen treatments. We will also address the response of canopy conductance to vapor pressure deficit, incident radiation and soil moisture, as well as the effect of treatment-related stand structure differences on scaling TDP measurements. Preliminary results indicate that in 2006, a year of normal precipitation (1127 mm), canopy transpiration increased by ~8% under elevated carbon dioxide on a ground-area basis. In 2007, a year with a pronounced drought (800 mm precipitation), this increase was only present in the combined carbon dioxide and fertilization treatment. The seasonal dynamics of water and carbon fluxes will be discussed in detail.
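
    A Jarvis-type conductance model of the kind referred to above multiplies a reference conductance by environmental response functions. The functional forms and parameter values below are common choices, assumed here purely for illustration; the study's exact response functions may differ.

```python
import numpy as np

def jarvis_conductance(vpd, par, theta,
                       g_max=120.0,      # reference conductance, assumed
                       m=0.6,            # VPD sensitivity, assumed
                       k_par=200.0,      # light half-saturation, assumed
                       theta_crit=0.15, theta_sat=0.35):
    """Multiplicative Jarvis-type canopy conductance: a reference value
    reduced by VPD, light and soil-moisture response functions."""
    f_vpd = np.clip(1.0 - m * np.log(vpd), 0.0, 1.0)   # declines with ln(VPD)
    f_par = par / (par + k_par)                        # saturating light
    f_sm = np.clip((theta - theta_crit) / (theta_sat - theta_crit), 0.0, 1.0)
    return g_max * f_vpd * f_par * f_sm

print(jarvis_conductance(vpd=1.2, par=800.0, theta=0.25))  # ~43 in toy units
```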

  8. Risk in technical and scientific studies: general introduction to uncertainty management and the concept of risk

    Apostolakis, G.E.

    2004-01-01

    George Apostolakis (MIT) presented an introduction to the concept of risk and uncertainty management and their use in technical and scientific studies. He noted that Quantitative Risk Assessment (QRA) provides support to the overall treatment of a system as an integrated socio-technical system. Specifically, QRA aims to answer the questions: - What can go wrong (e.g., accident sequences or scenarios)? - How likely are these sequences or scenarios? - What are the consequences of these sequences or scenarios? The Quantitative Risk Assessment deals with two major types of uncertainty. An assessment requires a 'model of the world', and this preferably would be a deterministic model based on underlying processes. In practice, there are uncertainties in this model of the world relating to variability or randomness that cannot be accounted for directly in a deterministic model and that may require a probabilistic or aleatory model. Both deterministic and aleatory models of the world have assumptions and parameters, and there are 'state-of-knowledge' or epistemic uncertainties associated with these. Sensitivity studies or eliciting expert opinion can be used to address the uncertainties in assumptions, and the level of confidence in parameter values can be characterised using probability distributions (pdfs). Overall, the distinction between aleatory and epistemic uncertainties is not always clear, and both can be treated mathematically in the same way. Lessons on safety assessments that can be learnt from experience at nuclear power plants are that beliefs about what is important can be wrong if a risk assessment is not performed. Also, precautionary approaches are not always conservative if failure modes are not identified. Nevertheless, it is important to recognize that uncertainties will remain despite a quantitative risk assessment: e.g., is the scenario list complete, are the models accepted as reasonable, and are parameter probability distributions representative of

  9. Incorporating uncertainties into risk assessment with an application to the exploratory studies facilities at Yucca Mountain

    Fathauer, P.M.

    1995-08-01

    A methodology that incorporates variability and reducible sources of uncertainty into the probabilistic and consequence components of risk was developed. The method was applied to the north tunnel of the Exploratory Studies Facility at Yucca Mountain in Nevada. In this assessment, variability and reducible sources of uncertainty were characterized and propagated through the risk assessment models using a Monte Carlo based software package. The results were then manipulated into risk curves at the 5% and 95% confidence levels for both the variability and overall uncertainty analyses, thus distinguishing between variability and reducible sources of uncertainty. In the Yucca Mountain application, the designation of the north tunnel as an item important to public safety, as defined by 10 CFR 60, was determined. Specifically, the annual frequency of a rock fall breaching a waste package and causing an off-site dose of 500 mrem (5×10⁻³ Sv) was calculated. The annual frequency, taking variability into account, ranged from 1.9×10⁻⁹ per year at the 5% confidence level to 2.5×10⁻⁹ per year at the 95% confidence level. The frequency range after including all uncertainty was 9.5×10⁻¹⁰ to 1.8×10⁻⁸ per year. The maximum observable frequency, at the 100% confidence level, was 4.9×10⁻⁸ per year. This is below the 10⁻⁶ per year frequency criterion of 10 CFR 60. Therefore, based on this work, the north tunnel does not fall under the items-important-to-public-safety designation for the event studied.

  10. An evaluation of setup uncertainties for patients treated to pelvic sites

    Hunt, Margie A.; Schultheiss, Timothy E.; Desobry, Gregory E.; Hakki, Morgan; Hanks, Gerald E.

    1995-01-01

    Purpose: Successful delivery of conformal fields requires stringent immobilization and treatment verification, as well as knowledge of the setup reproducibility. The purpose of this study was to compare the three-dimensional distribution of setup variations for patients treated to pelvic sites with electronic portal imaging devices (EPID) and portal film. Methods and Materials: Nine patients with genitourinary and gynecological cancers immobilized with custom casts and treated with a four-field whole-pelvis technique were imaged daily using an EPID and filmed once every five to seven treatments. The three-dimensional translational and rotational setup errors were determined using a technique that relies on anatomical landmarks identified on simulation and treatment images. The distributions of the translational and rotational variations in each dimension, as well as the total displacement of the treatment isocenter from the simulation isocenter, were determined. Results: Grouped analysis of all patients revealed average unidirectional translational deviations of less than 2 mm and a standard deviation of 5.3 mm. The average total undirected distance between the treatment and simulated isocenters was 8.3 mm with a standard deviation of 5 mm. Individual patient analysis revealed that eight of nine patients had statistically significant nonzero mean translational variations (p < 0.05). Translational variations measured with film were an average of 1.4 mm less than those measured with EPID, but this difference was not statistically significant. Conclusion: Translational variations measured in this study are in general agreement with previous studies. The use of the EPID in this study was less intrusive and may have resulted in less additional attention being given to each imaging setup. This may explain the slightly larger average translational variations observed with EPID vs. film, and suggests that the use of EPIDs is a superior method for assessing the true extent of setup

  11. Evaluation of simulated corn yields and associated uncertainty in different climate zones of China using Daycent Model

    Fu, A.; Xue, Y.

    2017-12-01

    Corn is one of the most important agricultural products in China. Research on the simulation of corn yields and on the impacts of climate change and agricultural management practices on corn yields is important for maintaining stable corn production. After climatic data (daily temperature, precipitation, solar radiation, relative humidity, and wind speed from 1948 to 2010), soil properties, observed corn yields, and farmland management information were collected, corn yields in a humid and hot environment (Sichuan province) and in a cold and dry environment (Hebei province) in China over the past 63 years were simulated by Daycent, and the results were evaluated against published yield records. The relationships between regional climate change, global warming and corn yield were analyzed, and the uncertainties of the simulation arising from agricultural management practices were reported by changing fertilization levels, land fertility maintenance and tillage methods. The results showed that: (1) the Daycent model is capable of simulating corn yields under the different climatic backgrounds in China; (2) when studying the relationship between regional climate change and corn yields, it was found that observed and simulated corn yields increased along with total regional climate change; (3) when studying the relationship between global warming and corn yields, it was found that corn yields re-simulated after removing the global warming trend from the original temperature data were lower than before.

  12. Climate change adaptation under uncertainty in the developing world: A case study of sea level rise in Kiribati

    Donner, S. D.; Webber, S.

    2011-12-01

    Climate change is expected to have the greatest impact in parts of the developing world. At the 2010 meeting of U.N. Framework Convention on Climate Change in Cancun, industrialized countries agreed in principle to provide US$100 billion per year by 2020 to assist the developing world respond to climate change. This "Green Climate Fund" is a critical step towards addressing the challenge of climate change. However, the policy and discourse on supporting adaptation in the developing world remains highly idealized. For example, the efficacy of "no regrets" adaptation efforts or "mainstreaming" adaptation into decision-making are rarely evaluated in the real world. In this presentation, I will discuss the gap between adaptation theory and practice using a multi-year case study of the cultural, social and scientific obstacles to adapting to sea level rise in the Pacific atoll nation of Kiribati. Our field research reveals how scientific and institutional uncertainty can limit international efforts to fund adaptation and lead to spiraling costs. Scientific uncertainty about hyper-local impacts of sea level rise, though irreducible, can at times limit decision-making about adaptation measures, contrary to the notion that "good" decision-making practices can incorporate scientific uncertainty. Efforts to improve institutional capacity must be done carefully, or they risk inadvertently slowing the implementation of adaptation measures and increasing the likelihood of "mal"-adaptation.

  13. Uncertainty studies and risk assessment for CO2 storage in geological formations

    Walter, Lena Sophie

    2013-07-01

    drinking water aquifers. The uncertainties on all three levels are investigated in three approaches with different foci. The concept can also be applied to CO2 leakage or to hazards related to other technologies in the subsurface, such as methane storage or nuclear waste disposal. In the second part of this thesis, uncertainty studies for two realistic storage formations (the pilot site Ketzin (Germany) and a realistic storage formation in the North German Basin) are performed to investigate the related uncertainties and to reduce them as much as possible. For the Ketzin site, history matching of the measurement data is an important task for dynamic modeling and essential for future risk assessment. A systematic approach to fitting the data set using inverse modeling is presented in this work. For future risk assessment of realistic sites, e.g. the Ketzin site, the uncertainty studies and the history matching approach provide important information. Finally, CCS is discussed in the context of risk perception, and the possible input of the risk assessment concept presented in this work is discussed. This work is a first attempt to connect the technical risk assessment for CO2 storage to the social science approach to risk assessment. It bridges the gap between engineering and social sciences by integrating the technical quantification of risk into the wider context of a comprehensive risk governance model.

  14. Uncertainty studies and risk assessment for CO2 storage in geological formations

    Walter, Lena Sophie

    2013-01-01

    aquifers. The uncertainties on all three levels are investigated in three approaches with different foci. The concept can also be applied to CO2 leakage or to hazards related to other technologies in the subsurface, such as methane storage or nuclear waste disposal. In the second part of this thesis, uncertainty studies for two realistic storage formations (the pilot site Ketzin (Germany) and a realistic storage formation in the North German Basin) are performed to investigate the related uncertainties and to reduce them as much as possible. For the Ketzin site, history matching of the measurement data is an important task for dynamic modeling and essential for future risk assessment. A systematic approach to fitting the data set using inverse modeling is presented in this work. For future risk assessment of realistic sites, e.g. the Ketzin site, the uncertainty studies and the history matching approach provide important information. Finally, CCS is discussed in the context of risk perception, and the possible input of the risk assessment concept presented in this work is discussed. This work is a first attempt to connect the technical risk assessment for CO2 storage to the social science approach to risk assessment. It bridges the gap between engineering and social sciences by integrating the technical quantification of risk into the wider context of a comprehensive risk governance model.

  15. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study

    Giordano James

    2007-04-01

    Background There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g., cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. Methods A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. Results There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Conclusion Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls, rather than improving the patient experience or the patient

  16. Cultural diversity teaching and issues of uncertainty: the findings of a qualitative study.

    Dogra, Nisha; Giordano, James; France, Nicholas

    2007-04-26

    There is considerable ambiguity in the subjective dimensions that comprise much of the relational dynamic of the clinical encounter. Comfort with this ambiguity, and recognition of the potential uncertainty of particular domains of medicine (e.g., cultural factors of illness expression, value bias in diagnoses, etc.) is an important facet of medical education. This paper begins by defining ambiguity and uncertainty as relevant to clinical practice. Studies have shown differing patterns of students' tolerance for ambiguity and uncertainty that appear to reflect extant attitudinal predispositions toward technology, objectivity, culture, value- and theory-ladenness, and the need for self-examination. This paper reports on those findings specifically related to the theme of uncertainty as relevant to teaching about cultural diversity. Its focus is to identify how and where the theme of certainty arose in the teaching and learning of cultural diversity, what the attitudes toward this theme and topic were, and how these attitudes and responses reflect and inform this area of medical pedagogy. A semi-structured interview was undertaken with 61 stakeholders (including policymakers, diversity teachers, students and users). The data were analysed and themes identified. There were diverse views about what the term cultural diversity means and what should constitute the cultural diversity curriculum. There was a need to provide certainty in teaching cultural diversity, with diversity teachers feeling under considerable pressure to provide information. Students' discomfort with uncertainty was felt to drive cultural diversity teaching towards factual emphasis rather than reflection or taking a patient-centred approach. Students and faculty may feel that cultural diversity teaching is more about how to avoid professional, medico-legal pitfalls, rather than improving the patient experience or the patient-physician relationship. There may be pressure to imbue cultural diversity issues

  17. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew

    2016-01-01

    Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy that is employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long-duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of spacecraft, high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on ISS, which includes a large number of spares pre-positioned on ISS, a larger supply sitting on Earth waiting to be flown to ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume to be dedicated to storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft
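
    As an illustration of why uncertain failure rates dominate sparing decisions, the sketch below applies the standard Poisson probability-of-sufficiency calculation; the failure rate, mission duration and spare count are invented for illustration and are not EMC values:

    ```python
    from scipy import stats

    def prob_of_sufficiency(failure_rate_per_day, mission_days, n_spares):
        """P(demand <= spares) when failures arrive as a Poisson process.
        A standard sparing calculation, shown only to illustrate why an
        uncertain failure rate matters: demand is Poisson with mean equal
        to rate * duration."""
        lam = failure_rate_per_day * mission_days
        return stats.poisson.cdf(n_spares, lam)

    # Illustrative numbers (not EMC values): a 900-day Mars mission, 3 spares.
    for rate in (1e-3, 2e-3, 4e-3):   # uncertain unit failure rate, per day
        print(rate, round(prob_of_sufficiency(rate, 900, 3), 4))
    # A factor-of-four uncertainty in the failure rate moves the probability
    # of having enough spares substantially, driving large spare-mass margins.
    ```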

  18. [Determination of eight pesticide residues in tea by liquid chromatography-tandem mass spectrometry and its uncertainty evaluation].

    Hu, Beizhen; Cai, Haijiang; Song, Weihua

    2012-09-01

    A method was developed for the determination of eight pesticide residues (fipronil, imidacloprid, acetamiprid, buprofezin, triadimefon, triadimenol, profenofos, pyridaben) in tea by liquid chromatography-tandem mass spectrometry. The sample was extracted by accelerated solvent extraction with acetone-dichloromethane (1:1, v/v) as solvent, and the extract was then cleaned up with a Carb/NH2 solid phase extraction (SPE) column. The separation was performed on a Hypersil Gold C18 column (150 mm x 2.1 mm, 5 microm) with gradient elution of acetonitrile and 0.1% formic acid. The eight pesticides were determined in electrospray ionization (ESI) and multiple reaction monitoring (MRM) modes. The analytes were quantified by a matrix-matched internal standard method for imidacloprid and acetamiprid, and by a matrix-matched external standard method for the other pesticides. The calibration curves showed good linearity in the range of 1-100 microg/L for fipronil and 5-200 microg/L for the other pesticides. The limits of quantification (LOQs, S/N > 10) were 2 microg/kg for fipronil and 10 microg/kg for the other pesticides. The average recoveries ranged from 75.5% to 115.0% with relative standard deviations of 2.7%-7.7% at spiked levels of 2, 5, 50 microg/kg for fipronil and 10, 50, 100 microg/kg for the other pesticides. The uncertainty evaluation for the results was carried out according to JJF 1059-1999 "Evaluation and Expression of Uncertainty in Measurement". The components of measurement uncertainty, involving the standard solution, sample weighing, sample pre-treatment, and the measurement repeatability of the equipment, were evaluated. The results showed that the measurement uncertainty is mainly due to sample pre-treatment, the standard curves and the measurement repeatability of the equipment. The method developed is suitable for the confirmation and quantification of these pesticides in tea.
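
    A minimal sketch of the JJF 1059-style (GUM-aligned) combination implied above, assuming independent relative uncertainty components summed in quadrature and expanded with k = 2; the component values are placeholders, not the paper's results:

    ```python
    import math

    # Combine independent relative standard uncertainty components by
    # root-sum-of-squares, then expand with coverage factor k = 2 (~95 %).
    # The component values below are illustrative placeholders.
    components = {
        "standard solution": 0.012,
        "sample weighing": 0.003,
        "sample pre-treatment": 0.025,
        "instrument repeatability": 0.020,
    }
    u_c = math.sqrt(sum(u**2 for u in components.values()))
    U = 2.0 * u_c  # expanded relative uncertainty
    print(f"combined relative standard uncertainty: {u_c:.3f}")
    print(f"expanded relative uncertainty (k=2):    {U:.3f}")
    ```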

  19. Development of default uncertainties for the value/benefit attributes in the regulatory analysis technical evaluation handbook

    Gallucci, Raymond H.V.

    2016-01-01

    Highlights: • Uncertainties for values/benefits. • Upper bound four times higher than mean. • Distributional histograms. - Abstract: NUREG/BR-0184, Regulatory Analysis Technical Evaluation (RATE) Handbook, was produced in 1997 as an update to the original NUREG/CR-3568, A Handbook for Value-Impact Assessment (1983). Both documents, especially the later RATE Handbook, have been used extensively by the USNRC and its contractors not only for regulatory analyses to support backfit considerations but also for similar applications, such as Severe Accident Management Alternative (SAMA) analyses as part of license renewals. While both provided high-level guidance on the performance of uncertainty analyses for the various value/benefit attributes, detailed quantification was not of prime interest at the times of the Handbooks’ development, defaulting only to best estimates with low and high bounds on these attributes. As the USNRC examines the possibility of updating the RATE Handbook, renewed interest in a more quantitative approach to uncertainty analyses for the attributes has surfaced. As the result of an effort to enhance the RATE Handbook to permit at least default uncertainty analyses for the value/benefit attributes, it has proven feasible to assign default uncertainties in terms of 95th %ile upper bounds (and absolute lower bounds) on the five dominant value/benefit attributes, and their sum, when performing a regulatory analysis via the RATE Handbook. Appropriate default lower bounds of zero (no value/benefit) and an upper bound (95th %ile) that is four times higher than the mean (for individual value/benefit attributes) or three times higher (for their summation) can be recommended. Distributions in the form of histograms on the summed value/benefit attributes are also provided which could be combined, after appropriate scaling and most likely via simulation, with their counterpart(s) from the impact/cost analysis to yield a final distribution on the net
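
    As a rough illustration of what such a default implies, the sketch below constructs a distribution with an absolute lower bound of zero and a 95th percentile four times the mean. The gamma family is an assumption of this sketch only; the Handbook itself provides histograms rather than a named distribution:

    ```python
    import numpy as np
    from scipy import stats, optimize

    # Find a gamma shape whose 95th-percentile-to-mean ratio equals 4;
    # the scale parameter cancels in the ratio, so it can be set from the
    # attribute mean afterwards.
    def p95_over_mean(shape):
        return stats.gamma.ppf(0.95, shape) / shape

    shape = optimize.brentq(lambda k: p95_over_mean(k) - 4.0, 0.05, 5.0)
    mean_value = 1.0                  # attribute mean in arbitrary units
    dist = stats.gamma(shape, scale=mean_value / shape)
    print(f"gamma shape k = {shape:.3f}")
    print(f"check: 95th %ile / mean = {dist.ppf(0.95) / dist.mean():.2f}")

    # Samples from this default could be combined, e.g. via simulation,
    # with the impact/cost distribution to yield a net-value distribution.
    samples = dist.rvs(size=100_000, random_state=1)
    print(f"sampled mean {samples.mean():.3f}, "
          f"sampled 95th %ile {np.percentile(samples, 95):.3f}")
    ```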

  20. Application of Bayesian geostatistics for evaluation of mass discharge uncertainty at contaminated sites

    Troldborg, Mads; Nowak, Wolfgang; Lange, Ida V.; Santos, Marta C.; Binning, Philip J.; Bjerg, Poul L.

    2012-09-01

    Mass discharge estimates are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Such estimates are, however, rather uncertain as they integrate uncertain spatial distributions of both concentration and groundwater flow. Here a geostatistical simulation method for quantifying the uncertainty of the mass discharge across a multilevel control plane is presented. The method accounts for (1) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (2) measurement uncertainty, and (3) uncertain source zone and transport parameters. The method generates conditional realizations of the spatial flow and concentration distribution. An analytical macrodispersive transport solution is employed to simulate the mean concentration distribution, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. The method has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is demonstrated on a field site contaminated with chlorinated ethenes. For this site, we show that including a physically meaningful concentration trend and the cosimulation of hydraulic conductivity and hydraulic gradient across the transect helps constrain the mass discharge uncertainty. The number of sampling points required for accurate mass discharge estimation and the relative influence of different data types on mass discharge uncertainty is discussed.
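
    The final combination step can be sketched as follows, assuming conditional realizations of flux and concentration are already available on the control-plane grid; the synthetic, independently drawn lognormal fields below stand in for the conditioned geostatistical simulations described above:

    ```python
    import numpy as np

    # Each realization's mass discharge is the sum of q * c * cell_area
    # over the control plane; the spread across realizations gives the
    # mass discharge probability distribution. All values are synthetic.
    rng = np.random.default_rng(42)
    n_real, n_cells = 1000, 20 * 10          # realizations, grid cells
    cell_area = 0.5 * 0.25                   # m^2 per cell (illustrative)

    q = rng.lognormal(np.log(0.05), 0.6, size=(n_real, n_cells))  # Darcy flux, m/d
    c = rng.lognormal(np.log(2.0), 1.0, size=(n_real, n_cells))   # conc., g/m^3

    mass_discharge = (q * c * cell_area).sum(axis=1)   # g/d per realization
    lo, med, hi = np.percentile(mass_discharge, [5, 50, 95])
    print(f"mass discharge (g/d): median {med:.1f}, "
          f"90% interval [{lo:.1f}, {hi:.1f}]")
    ```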

  1. Interlaboratory study of a liquid chromatography method for erythromycin: determination of uncertainty.

    Dehouck, P; Vander Heyden, Y; Smeyers-Verbeke, J; Massart, D L; Marini, R D; Chiap, P; Hubert, Ph; Crommen, J; Van de Wauw, W; De Beer, J; Cox, R; Mathieu, G; Reepmeyer, J C; Voigt, B; Estevenon, O; Nicolas, A; Van Schepdael, A; Adams, E; Hoogmartens, J

    2003-08-22

    Erythromycin is a mixture of macrolide antibiotics produced by Saccharopolyspora erythraea during fermentation. A new method for the analysis of erythromycin by liquid chromatography has previously been developed. It makes use of an Astec C18 polymeric column. After validation in one laboratory, the method was now validated in an interlaboratory study. Validation studies are commonly used to test the fitness of an analytical method prior to its use for routine quality testing. The data derived in the interlaboratory study can be used to make an uncertainty statement as well. The relationship between validation and the uncertainty statement is not clear to many analysts, and there is a need to show how the existing data, derived during validation, can be used in practice. Eight laboratories participated in this interlaboratory study. The set-up allowed the determination of the repeatability variance, s²r, and the between-laboratory variance, s²L. Combining s²r and s²L yields the reproducibility variance, s²R. It is shown how these data can be used in future by a single laboratory that wants to make an uncertainty statement concerning the same analysis.
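
    A minimal sketch of the classical one-way ANOVA estimates behind s²r, s²L and s²R, on fabricated placeholder data with p laboratories reporting n replicate results each:

    ```python
    import numpy as np

    # Fabricated interlaboratory data: p labs, n replicates per lab.
    rng = np.random.default_rng(7)
    p, n = 8, 3
    lab_bias = rng.normal(0.0, 1.2, size=p)                 # between-lab spread
    data = 100.0 + lab_bias[:, None] + rng.normal(0.0, 0.8, size=(p, n))

    lab_means = data.mean(axis=1)
    grand_mean = data.mean()
    ms_within = ((data - lab_means[:, None]) ** 2).sum() / (p * (n - 1))
    ms_between = n * ((lab_means - grand_mean) ** 2).sum() / (p - 1)

    s2_r = ms_within                               # repeatability variance
    s2_L = max((ms_between - ms_within) / n, 0.0)  # between-laboratory variance
    s2_R = s2_r + s2_L                             # reproducibility variance
    print(f"s2r={s2_r:.3f}  s2L={s2_L:.3f}  s2R={s2_R:.3f}")
    # A single laboratory can quote sqrt(s2_R) as the standard uncertainty
    # of one future result obtained with the same method.
    ```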

  2. Uncertainty Evaluation and Influence of Grain Size on the Determination of PAHs in a Contaminated Soil; Influencia del Tamano de Particula de un Suelo Contaminado en las Incertidumbres Asociadas al Metodo de Determinacion de PAHs

    Garcia Alonso, S.; Perez Pastor, R. M.; Escolano Segoviano, O.; Garcia Frutos, F. J.

    2007-07-20

    An evaluation of the uncertainty associated with PAH determination in a contaminated soil is presented. The work focused on measuring the influence of grain size on concentration deviations and on providing a measure of the confidence of PAH results in the gasworks-contaminated soil. This study was performed within the framework of the project 'Assessment of natural remediation technologies for PAHs in contaminated soils' (Spanish Plan Nacional I+D+i, CTM 2004-05832-CO2-01). The paper is organized as follows: a brief introduction describes the main uncertainty contributions associated with chromatographic analysis; a statistical calculation is then performed to quantify each uncertainty component; finally, a global uncertainty is calculated and the influence of grain size and of the distribution of compounds according to volatility is evaluated. (Author) 10 refs.

  3. Uncertainty, the Overbearing Lived Experience of the Elderly People Undergoing Hemodialysis: A Qualitative Study

    Robab Sahaf

    2017-01-01

    Background: Chronic kidney disease is a major health concern. The number of elderly people with chronic renal failure has increased across the world. Dialysis is an appropriate therapy for the elderly, but it involves certain challenges. The present paper reports uncertainty as part of the elderly's experiences of living with hemodialysis. Methods: This qualitative study applied Max van Manen's interpretative phenomenological analysis to explain and explore the experiences of the elderly with hemodialysis. Given the study inclusion criteria, data were collected using in-depth unstructured interviews with nine elderly people undergoing hemodialysis, and then analyzed according to van Manen's 6-stage methodological approach. Results: One of the most important findings emerging in the main study was "uncertainty", which is important and noteworthy given other aspects of the elderly's lives (loneliness, despair, comorbidity of diseases, disability, and mental and psychosocial problems). Uncertainty about the future is among the most significant psychological concerns of people undergoing hemodialysis. Conclusion: The results indicate the importance of paying attention to a major aspect of the lives of the elderly undergoing hemodialysis: uncertainty. A positive outlook can be created in the elderly through education and increased knowledge about the disease, its treatment and its complications.

  4. Uncertainty, the Overbearing Lived Experience of the Elderly People Undergoing Hemodialysis: A Qualitative Study

    Sahaf, Robab; Sadat Ilali, Ehteram; Peyrovi, Hamid; Akbari Kamrani, Ahmad Ali; Spahbodi, Fatemeh

    2017-01-01

    Background: Chronic kidney disease is a major health concern. The number of elderly people with chronic renal failure has increased across the world. Dialysis is an appropriate therapy for the elderly, but it involves certain challenges. The present paper reports uncertainty as part of the elderly's experiences of living with hemodialysis. Methods: This qualitative study applied Max van Manen's interpretative phenomenological analysis to explain and explore the experiences of the elderly with hemodialysis. Given the study inclusion criteria, data were collected using in-depth unstructured interviews with nine elderly people undergoing hemodialysis, and then analyzed according to van Manen's 6-stage methodological approach. Results: One of the most important findings emerging in the main study was "uncertainty", which is important and noteworthy given other aspects of the elderly's lives (loneliness, despair, comorbidity of diseases, disability, and mental and psychosocial problems). Uncertainty about the future is among the most significant psychological concerns of people undergoing hemodialysis. Conclusion: The results indicate the importance of paying attention to a major aspect of the lives of the elderly undergoing hemodialysis: uncertainty. A positive outlook can be created in the elderly through education and increased knowledge about the disease, its treatment and its complications. PMID:28097174

  5. The Resource Benefits Evaluation Model on Remanufacturing Processes of End-of-Life Construction Machinery under the Uncertainty in Recycling Price

    Qian-wang Deng

    2017-02-01

    In the process of end-of-life construction machinery remanufacturing, the existence of uncertainties in all aspects of the remanufacturing process increases the difficulty and complexity of resource benefits evaluation. To quantify the effects of those uncertainty factors, this paper makes a mathematical analysis of the recycling and remanufacturing processes, building a resource benefits evaluation model for end-of-life construction machinery. The recycling price and the profits of remanufacturers can thereby be obtained with a maximum remanufacturing resource benefit. The study investigates how the resource benefits, the recycling price, and the profits of remanufacturers change as the recycling price, quality fluctuation coefficient, demand coefficient, and the reusing ratio of products or parts vary. In the numerical experiment, we explore the effects of uncertainties on the remanufacturing decisions and the total expected costs. The simulated analysis shows that as the quality fluctuation coefficient approaches 1, the profits of the remanufacturer, the maximal resource benefits and the recycling price converge to constants.

  6. Minimizing Uncertainties Impact in Decision Making with an Applicability Study for Economic Power Dispatch

    Wang, Hong [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Shaobu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fan, Rui [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Zhuanfang [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2016-09-30

    This report summarizes the work performed under the LDRD project on a preliminary study of knowledge automation, with specific focus on the impact of uncertainties in human decision making on the optimization of process operation. First, the statistics of signals from the Brain-Computer Interface (BCI) are analyzed so as to characterize the uncertainties of human operators during the decision-making phase using electroencephalogram (EEG) signals. This is followed by a discussion of an architecture that reveals the equivalence between optimization and closed-loop feedback control design, where it is shown that all optimization problems can be transformed into control design problems for closed-loop systems. This has led to a “closed loop” framework, where the structure of the decision making is subjected to both process disturbances and controller uncertainties. The latter can well represent the uncertainties or randomness occurring during the human decision-making phase. As a result, a stochastic optimization problem has been formulated and a novel solution has been proposed using probability density function (PDF) shaping for both the cost function and the constraints, based on the stochastic distribution control concept. A sufficient condition has been derived that guarantees the convergence of the optimal solution, and discussions have been made of both the total probabilistic solution and chance-constrained optimization, which have been well studied in the optimal power flow (OPF) area. A simple case study has been carried out for the economic dispatch of power for a grid system with distributed energy resources (DERs), and encouraging results have been obtained showing that significant savings in generation cost can be expected.
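
    A toy illustration, not the report's PDF-shaping formulation: a chance-constrained economic dispatch in which the only random term is net demand, so a 95% chance constraint reduces to covering the demand's 95th percentile; all parameters are invented:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Choose outputs of two generators to minimize cost while meeting
    # uncertain net demand (load minus DER output) with 95% probability.
    rng = np.random.default_rng(3)
    net_demand = rng.normal(100.0, 8.0, size=10_000)   # MW scenarios
    demand_q95 = np.quantile(net_demand, 0.95)

    cost = [20.0, 35.0]                                # $/MWh for gen 1, gen 2
    # linprog minimizes c @ x subject to A_ub @ x <= b_ub:
    # -(g1 + g2) <= -demand_q95  enforces  g1 + g2 >= demand_q95.
    res = linprog(c=cost,
                  A_ub=[[-1.0, -1.0]], b_ub=[-demand_q95],
                  bounds=[(0.0, 80.0), (0.0, 60.0)])
    print(f"95th-percentile demand: {demand_q95:.1f} MW")
    print(f"dispatch: g1={res.x[0]:.1f} MW, g2={res.x[1]:.1f} MW, "
          f"cost=${res.fun:.0f}/h")
    ```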

  7. Dynamic plantwide modeling, uncertainty and sensitivity analysis of a pharmaceutical upstream synthesis: Ibuprofen case study

    Montes, Frederico C. C.; Gernaey, Krist; Sin, Gürkan

    2018-01-01

    A dynamic plantwide model was developed for the synthesis of the active pharmaceutical ingredient (API) ibuprofen, following the Hoechst synthesis process. The kinetic parameters, reagents, products and by-products of the different reactions were adapted from the literature, and the different process operations were integrated through to the end of the process, crystallization and isolation of the ibuprofen crystals. The dynamic model simulations were validated against available measurements from the literature and then used as an enabling tool to analyze the robustness of the design space. To this end, the sensitivity of the design space towards input disturbances and process uncertainties (from physical and model parameters) is studied using Monte Carlo simulations. The results quantify the uncertainty of the product quality attributes, with particular focus on the crystal size distribution and the amount of ibuprofen crystallized. The ranking...

  8. Travel itinerary uncertainty and the pre-travel consultation--a pilot study.

    Flaherty, Gerard; Md Nor, Muhammad Najmi

    2016-01-01

    Risk assessment relies on the accuracy of the information provided by the traveller. A questionnaire was administered to 83 consecutive travellers attending a travel medicine clinic. The majority of travellers were uncertain about destinations within countries, transportation or type of accommodation. Most travellers were uncertain whether they would be visiting malaria regions. The degree of uncertainty about the itinerary potentially impacts the ability of the travel medicine specialist to perform an adequate risk assessment, select appropriate vaccinations and prescribe malaria prophylaxis. This study reveals high levels of traveller uncertainty about their itinerary, which may potentially reduce the effectiveness of their pre-travel consultation.

  9. Using the Rosenzweig frustration picture test in the study of coping behaviour in the situation of uncertainty

    Elena N. L’vova

    2016-03-01

    The paper highlights the relation between the relevance of coping behaviour and the increasing interest in phenomena of uncertainty. Coping is considered as a complex, multilevel construct, and the relevance of studying both the conscious and unconscious levels of coping is substantiated. The limited predictive validity of coping questionnaires and the usefulness of projective methods, which are effective in diagnosing the unconscious components of coping, are discussed. Given changes in how the range of difficult life situations is viewed, and the focus on the subjective perception of difficulties, frustration situations are treated as difficult daily life situations. The Rosenzweig Picture Frustration test can be used for diagnosing the unconscious components of coping that constitute the meaning-set level and the basis of coping behaviour. The relations among personal characteristics (tolerance/intolerance to uncertainty, noetic orientations, personal anxiety, locus of control) and the three types and three directions of subjects' responses in the test situations were examined using generalized linear models. The subjects of the research were 199 teachers from secondary schools of the Russian Federation, mean age 40.6 years. The results showed significant relations between particular personal characteristics and the types and directions of responses: ego-defense type and tolerance to uncertainty; obstacle-dominance type and personal anxiety; intropunitive direction and personal anxiety; obstacle-dominance type and noetic orientations. The joint discussion of the current results and those obtained in previous studies demonstrates the potential existence of mediating relations between particular coping strategies and the types and directions of subjects' responses in the Rosenzweig Picture Frustration test.

  10. Living with uncertainty and hope: A qualitative study exploring parents' experiences of living with childhood multiple sclerosis.

    Hinton, Denise; Kirk, Susan

    2017-06-01

    Background There is growing recognition that multiple sclerosis is a possible, albeit uncommon, diagnosis in childhood. However, very little is known about the experiences of families living with childhood multiple sclerosis, and this is the first study to explore this in depth. Objective Our objective was to explore the experiences of parents of children with multiple sclerosis. Methods Qualitative in-depth interviews with 31 parents were conducted using a grounded theory approach. Parents were sampled and recruited via health service and voluntary sector organisations in the United Kingdom. Results Parents' accounts of life with childhood multiple sclerosis were dominated by feelings of uncertainty associated with four sources: diagnostic uncertainty, daily uncertainty, interaction uncertainty and future uncertainty. Parents attempted to manage these uncertainties using specific strategies, which could in turn create further uncertainties about their child's illness. However, over time, ongoing uncertainty appeared to give parents hope for their child's future with multiple sclerosis. Conclusion Illness-related uncertainties appear to play a role in generating hope among parents of a child with multiple sclerosis. However, this may lead parents to avoid sources of information and support that threaten their fragile optimism. Professionals need to be sensitive to the role hope plays in supporting parental coping with childhood multiple sclerosis.

  11. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; and (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i)-(iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained from solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state of the art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and how they assess posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.

  12. Analysis of coupled model uncertainties in source-to-dose modeling of human exposures to ambient air pollution: A PM2.5 case study

    Özkaynak, Halûk; Frey, H. Christopher; Burke, Janet; Pinder, Robert W.

    Quantitative assessment of human exposures and health effects due to air pollution involves detailed characterization of the impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis, taking into account linkages and feedbacks. The current state of practice for such assessments is to exercise emission, meteorology, air quality, exposure, and dose models separately, and to link them together by using the output of one model as input to the subsequent downstream model. Quantification of variability and uncertainty has been an important topic in the exposure assessment community for a number of years. Variability refers to differences in the value of a quantity (e.g., exposure) over time, space, or among individuals. Uncertainty refers to lack of knowledge regarding the true value of a quantity. An emerging challenge is how to quantify variability and uncertainty in integrated assessments over the source-to-dose continuum by considering contributions from individual as well as linked components. For a case study of fine particulate matter (PM2.5) in North Carolina during July 2002, we characterize the variability and uncertainty associated with each of the individual concentration, exposure and dose models that are linked, and use a conceptual framework to quantify and evaluate the implications of coupled model uncertainties. We find that the resulting overall uncertainties due to combined effects of both variability and uncertainty are smaller (usually by a factor of 3-4) than the crudely multiplied model-specific overall uncertainty ratios. Future research will need to examine the impact of potential dependencies among the model components by conducting a truly coupled modeling analysis.
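
    The contrast between coupled and crudely multiplied uncertainties can be sketched with three multiplicative stages; the lognormal stage factors below are assumptions for illustration, not the case study's distributions:

    ```python
    import numpy as np

    # Propagate uncertainty jointly through three linked stages
    # (concentration -> exposure -> dose) and compare against multiplying
    # each stage's individual uncertainty ratio.
    rng = np.random.default_rng(11)
    n = 100_000
    conc = rng.lognormal(np.log(12.0), 0.35, n)       # ambient PM2.5, ug/m^3
    exp_factor = rng.lognormal(np.log(0.7), 0.25, n)  # exposure per unit ambient
    dose_factor = rng.lognormal(np.log(0.4), 0.30, n) # dose per unit exposure

    def ratio95(x):
        """Overall uncertainty ratio: 97.5th to 2.5th percentile."""
        lo, hi = np.percentile(x, [2.5, 97.5])
        return hi / lo

    coupled = ratio95(conc * exp_factor * dose_factor)
    multiplied = ratio95(conc) * ratio95(exp_factor) * ratio95(dose_factor)
    print(f"coupled ratio: {coupled:.1f},  multiplied ratios: {multiplied:.1f}")
    # Stage uncertainties add in variance, not in ratio, so the coupled
    # estimate is several times smaller than the naive product, in line
    # with the factor of 3-4 reported above.
    ```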

  13. Evaluation of the uncertainties in the TLD radiosurgery postal dose system

    Campos, Luciana Tourinho; Leite, Sandro Passos; Almeida, Carlos Eduardo Veloso de; Magalhães, Luís Alexandre Gonçalves

    2017-01-01

    Radiosurgery is a single-fraction radiation therapy procedure for treating intracranial lesions using a stereotactic apparatus and multiple narrow beams delivered through noncoplanar isocentric arcs. The Radiological Science Laboratory (LCR/UERJ) operates a postal audit programme in SRT and SRS. The purpose of the programme is to verify the target localization accuracy and the dosimetric conditions of the TPS. The programme works in such a way that TLDs are sent to the centre, where they are irradiated to a stated dose. The aim of the present work is to estimate the uncertainties in the process of dose determination, using experimental data. (author)

  14. Evaluation of the uncertainties in the TLD radiosurgery postal dose system

    Campos, Luciana Tourinho; Leite, Sandro Passos; Almeida, Carlos Eduardo Veloso de; Magalhães, Luís Alexandre Gonçalves, E-mail: tc_luciana@yahoo.com.br [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    Radiosurgery is a single-fraction radiation therapy procedure for treating intracranial lesions using a stereotactic apparatus and multiple narrow beams delivered through noncoplanar isocentric arcs. The Radiological Science Laboratory (LCR/UERJ) operates a postal audit programme in SRT and SRS. The purpose of the programme is to verify the target localization accuracy and the dosimetric conditions of the TPS. The programme works in such a way that TLDs are sent to the centre, where they are irradiated to a stated dose. The aim of the present work is to estimate the uncertainties in the process of dose determination, using experimental data. (author)

  15. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John

    and the hydraulic gradient across the control plane and are consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed deviations from this mean solution. By combining the flow and concentration realizations, a mass discharge probability distribution is obtained. Tests show that the decoupled approach is both efficient and able to provide accurate uncertainty...

  16. Orographic precipitation at global and regional scales: Observational uncertainty and evaluation of 25-km global model simulations

    Schiemann, Reinhard; Roberts, Charles J.; Bush, Stephanie; Demory, Marie-Estelle; Strachan, Jane; Vidale, Pier Luigi; Mizielinski, Matthew S.; Roberts, Malcolm J.

    2015-04-01

    Precipitation over land exhibits a high degree of variability due to the complex interaction of the precipitation generating atmospheric processes with coastlines, the heterogeneous land surface, and orography. Global general circulation models (GCMs) have traditionally had very limited ability to capture this variability on the mesoscale (here ~50-500 km) due to their low resolution. This has changed with recent investments in resolution and ensembles of multidecadal climate simulations of atmospheric GCMs (AGCMs) with ~25 km grid spacing are becoming increasingly available. Here, we evaluate the mesoscale precipitation distribution in one such set of simulations obtained in the UPSCALE (UK on PrACE - weather-resolving Simulations of Climate for globAL Environmental risk) modelling campaign with the HadGEM-GA3 AGCM. Increased model resolution also poses new challenges to the observational datasets used to evaluate models. Global gridded data products such as those provided by the Global Precipitation Climatology Project (GPCP) are invaluable for assessing large-scale features of the precipitation distribution but may not sufficiently resolve mesoscale structures. In the absence of independent estimates, the intercomparison of different observational datasets may be the only way to get some insight into the uncertainties associated with these observations. Here, we focus on mid-latitude continental regions where observations based on higher-density gauge networks are available in addition to the global data sets: Europe/the Alps, South and East Asia, and the continental US. The ability of GCMs to represent mesoscale variability is of interest in its own right, as climate information on this scale is required by impact studies. An additional motivation for the research proposed here arises from continuing efforts to quantify the components of the global radiation budget and water cycle. Recent estimates based on radiation measurements suggest that the global mean

  17. Resilience as strategy for climate adaptation under uncertainty. Case study on the area outside the dike of Rotterdam

    De Jong, A.

    2008-07-01

    This study has two aims: (1) to obtain insight into the concepts of resilience and uncertainty, and into how a resilience-oriented approach deals with uncertainties about the future; and (2) to put the resilience-oriented approach into operation in a case study: the area outside the dike of Rotterdam, the Netherlands, which is designated for new buildings.

  18. General Practitioners' Experiences of, and Responses to, Uncertainty in Prostate Cancer Screening: Insights from a Qualitative Study.

    Kristen Pickles

    Full Text Available Prostate-specific antigen (PSA testing for prostate cancer is controversial. There are unresolved tensions and disagreements amongst experts, and clinical guidelines conflict. This both reflects and generates significant uncertainty about the appropriateness of screening. Little is known about general practitioners' (GPs' perspectives and experiences in relation to PSA testing of asymptomatic men. In this paper we asked the following questions: (1 What are the primary sources of uncertainty as described by GPs in the context of PSA testing? (2 How do GPs experience and respond to different sources of uncertainty?This was a qualitative study that explored general practitioners' current approaches to, and reasoning about, PSA testing of asymptomatic men. We draw on accounts generated from interviews with 69 general practitioners located in Australia (n = 40 and the United Kingdom (n = 29. The interviews were conducted in 2013-2014. Data were analysed using grounded theory methods. Uncertainty in PSA testing was identified as a core issue.Australian GPs reported experiencing substantially more uncertainty than UK GPs. This seemed partly explainable by notable differences in conditions of practice between the two countries. Using Han et al's taxonomy of uncertainty as an initial framework, we first outline the different sources of uncertainty GPs (mostly Australian described encountering in relation to prostate cancer screening and what the uncertainty was about. We then suggest an extension to Han et al's taxonomy based on our analysis of data relating to the varied ways that GPs manage uncertainties in the context of PSA testing. We outline three broad strategies: (1 taking charge of uncertainty; (2 engaging others in managing uncertainty; and (3 transferring the responsibility for reducing or managing some uncertainties to other parties.Our analysis suggests some GPs experienced uncertainties associated with ambiguous guidance and the

  19. Evaluation of the Positional Uncertainty of a Liver Tumor using 4-Dimensional Computed Tomography and Gated Orthogonal Kilovolt Setup Images

    Ju, Sang Gyu; Hong, Chae Seon; Park, Hee Chul; Ahn, Jong Ho; Shin, Eun Hyuk; Shin, Jung Suk; Kim, Jin Sung; Han, Young Yih; Lim, Do Hoon; Choi, Doo Ho

    2010-01-01

    In order to evaluate the positional uncertainty of internal organs during radiation therapy for liver cancer, we measured differences in inter- and intra-fractional variation of the tumor position and tidal amplitude using 4-dimensional computed tomography (4DCT) images and gated orthogonal kilovolt (kV) setup images taken at every treatment using the on-board imaging (OBI) and real-time position management (RPM) systems. Twenty consecutive patients who underwent 3-dimensional (3D) conformal radiation therapy for liver cancer participated in this study. All patients received a 4DCT simulation with an RT16 scanner and an RPM system. Lipiodol deposited near the target volume after transarterial chemoembolization, or the diaphragm, was chosen as a surrogate for evaluating the position difference of internal organs. Two reference orthogonal (anterior and lateral) digitally reconstructed radiograph (DRR) images were generated using the CT image sets at 0% and 50% of the respiratory phase. The maximum tidal amplitude of the surrogate was measured from 3D conformal treatment planning. After setting the patient up with laser markings on the skin, orthogonal gated setup images at 50% of the respiratory phase were acquired at each treatment session with OBI and registered on the reference DRR images by setting each beam center. Online inter-fractional variation was determined with the surrogate. After adjusting the patient setup error, orthogonal setup images at 0% and 50% of the respiratory phase were obtained and the tidal amplitude of the surrogate was measured. The measured tidal amplitude was compared with data from 4DCT. For evaluation of intra-fractional variation, an orthogonal gated setup image at 50% of the respiratory phase was acquired immediately after treatment and compared with the same image taken just before treatment. In addition, a statistical analysis for quantitative evaluation was performed. Medians of inter

  20. Optimization of internal contamination monitoring programmes by studying uncertainties linked to dosimetric assessment

    Davesne, Estelle

    2010-01-01

    To optimise the protection of workers against ionizing radiation, the International Commission on Radiological Protection recommends the use of dose constraints and limits. To verify the compliance of the means of protection with these values when a risk of internal contamination exists, monitoring programmes consisting of periodic bioassay measurements are performed. However, uncertainty in the dose evaluation arises from the variability of the activity measurements and from incomplete knowledge of the exposure conditions. This uncertainty was taken into account by means of classical, Bayesian and possibilistic statistics. The methodology developed was applied to the evaluation of potential exposure during nuclear fuel preparation and mining, and to the analysis of the monitoring programme for workers purifying plutonium in the AREVA NC La Hague reprocessing plant. From the measurement decision threshold, the minimum dose detectable (MDD) by the programme with a given confidence level can be calculated with the software OPSCI. The MDD is shown to be a useful aid in the optimisation of monitoring programmes when seeking a compromise between their sensitivity and their costs. (author)

  1. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
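
    A minimal sketch of the Kaplan-Meier step used inside the IPM, on made-up failure and censoring times; this is an illustration of the estimator, not the paper's implementation:

    ```python
    import numpy as np

    def kaplan_meier(times, observed):
        """Minimal Kaplan-Meier estimate of a reliability (survival) curve.
        times: failure or censoring time of each mechanism test run;
        observed: True for a failure (output accuracy out of tolerance),
        False for a censored run."""
        times = np.asarray(times)
        observed = np.asarray(observed, dtype=bool)
        # Sort by time; at tied times, process failures before censorings.
        order = np.lexsort((~observed, times))
        times, observed = times[order], observed[order]
        at_risk, survival, curve = len(times), 1.0, []
        for t, failed in zip(times, observed):
            if failed:
                survival *= (at_risk - 1) / at_risk
                curve.append((t, survival))
            at_risk -= 1
        return curve

    # Example: failure/censoring times (in cycles) from repeated runs of a
    # crank-slider test rig; values are made up for illustration.
    times = [120, 180, 180, 240, 300, 360, 360, 420]
    observed = [True, True, False, True, True, False, True, True]
    for t, s in kaplan_meier(times, observed):
        print(f"t = {t:4d} cycles   R(t) = {s:.3f}")
    ```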

  2. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2014-07-01

    The uncertainty brought about by intermittent volcanic activity is fairly common at volcanoes worldwide. While better knowledge of any one volcano's behavioural characteristics has the potential to reduce this uncertainty, the subsequent reduction of risk from volcanic threats is only realised if that knowledge is pertinent to stakeholders and effectively communicated to inform good decision making. Success requires integration of methods, skills and expertise across disciplinary boundaries. This research project develops and trials a novel interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). For the first time, volcanological techniques, probabilistic decision support and social scientific methods were integrated in a single study. New data were produced that (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience; and (5) evaluated the effectiveness of a scenario planning approach, both as a method for integrating the different strands of the research and as a way of enabling on-island decision makers to take ownership of risk identification and management, and capacity building within their community. The paper provides empirical evidence of the value of an innovative interdisciplinary framework for reducing volcanic risk. It also provides evidence for the strength that comes from integrating social and physical sciences with the development of effective, tailored engagement and communication strategies in volcanic risk reduction.

  3. Modeling for regional ecosystem sustainable development under uncertainty — A case study of Dongying, China

    Zhang, K., E-mail: zhangkaibetter@126.com; Li, Y.P., E-mail: yongping.li@iseis.org; Huang, G.H., E-mail: gordon.huang@uregina.ca; You, L., E-mail: youli_ncepu@126.com; Jin, S.W., E-mail: jinshuwei2014@126.com

    2015-11-15

    In this study, a superiority–inferiority two-stage stochastic programming (STSP) method is developed for planning regional ecosystem sustainable development. STSP can tackle uncertainties expressed as fuzzy sets and probability distributions; it can be used to analyze various policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. STSP is applied to a real case of planning regional ecosystem sustainable development in the City of Dongying, where ecosystem services valuation approaches are incorporated within the optimization process. The regional ecosystem can provide direct and indirect services and intangible benefits to the local economy. A land trading mechanism is introduced for planning the regional ecosystem's sustainable development, in which wetlands are buyers that would protect regional ecosystem components and self-organization and maintain the ecosystem's integrity. Results for regional ecosystem activities, land use patterns, and land trading schemes have been obtained. The results reveal that, although large-scale reclamation projects can bring benefits to local economic development, they also have negative effects on the coastal ecosystem; among all industrial activities, the oil field is the major contributor of pollutant discharges into the local ecosystem. The results also show that uncertainty plays an important role in successfully launching such a land trading program, and that the trading scheme can provide a more effective means of sustaining the regional ecosystem. The findings can help decision makers to realize the sustainable development of ecological resources in the process of rapid industrialization, as well as the integration of economic and ecological benefits. - Highlights: • Superiority–inferiority two-stage stochastic programming (STSP) method is developed. • STSP can tackle uncertainties expressed as fuzzy sets and probability distributions. • STSP is applied to planning
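
    A toy two-stage stochastic program in the spirit of the abstract (not the paper's STSP model): a first-stage wetland-protection decision is made before the ecological-target scenario is known, and shortfalls are penalized in the second stage; all numbers are invented:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    targets = np.array([800.0, 1000.0, 1300.0])  # scenario targets (ha)
    probs = np.array([0.3, 0.5, 0.2])            # scenario probabilities
    c_protect, c_penalty = 1.0, 3.0              # unit cost vs. unit penalty

    # Decision vector z = [x, y1, y2, y3]: protect x ha now, pay penalty on
    # shortfall y_s in scenario s. Minimize c*x + sum_s p_s * q * y_s
    # subject to y_s >= target_s - x (written as -x - y_s <= -target_s).
    cost = np.concatenate(([c_protect], c_penalty * probs))
    A_ub = np.hstack([-np.ones((3, 1)), -np.eye(3)])
    b_ub = -targets
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
    x, y = res.x[0], res.x[1:]
    print(f"protect x = {x:.0f} ha; shortfall per scenario = {y.round(0)}")
    # The first-stage decision hedges across scenarios: it covers the two
    # likelier targets and accepts a penalized shortfall in the rarest one.
    ```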

  4. Modeling for regional ecosystem sustainable development under uncertainty — A case study of Dongying, China

    Zhang, K.; Li, Y.P.; Huang, G.H.; You, L.; Jin, S.W.

    2015-01-01

    In this study, a superiority–inferiority two-stage stochastic programming (STSP) method is developed for planning regional ecosystem sustainable development. STSP can tackle uncertainties expressed as fuzzy sets and probability distributions; it can be used to analyze various policy scenarios that are associated with different levels of economic penalties when the promised targets are violated. STSP is applied to a real case of planning regional ecosystem sustainable development in the City of Dongying, where ecosystem services valuation approaches are incorporated within the optimization process. The regional ecosystem can provide direct and indirect services and intangible benefits to the local economy. A land trading mechanism is introduced for planning the regional ecosystem's sustainable development, in which wetlands are buyers that would protect regional ecosystem components and self-organization and maintain the ecosystem's integrity. Results for regional ecosystem activities, land use patterns, and land trading schemes have been obtained. The results reveal that, although large-scale reclamation projects can bring benefits to local economic development, they also have negative effects on the coastal ecosystem; among all industrial activities, the oil field is the major contributor of pollutant discharges into the local ecosystem. The results also show that uncertainty plays an important role in successfully launching such a land trading program, and that the trading scheme can provide a more effective means of sustaining the regional ecosystem. The findings can help decision makers to realize the sustainable development of ecological resources in the process of rapid industrialization, as well as the integration of economic and ecological benefits. - Highlights: • Superiority–inferiority two-stage stochastic programming (STSP) method is developed. • STSP can tackle uncertainties expressed as fuzzy sets and probability distributions. • STSP is applied to planning

  5. Energy assessment of peri-urban horticulture and its uncertainty: Case study for Bogota, Colombia

    Bojaca, C.R. [Centro de Investigaciones y Asesorias Agroindustriales, Facultad de Ciencias Naturales, Universidad de Bogota Jorge Tadeo Lozano, P.O. Box: 140196, Chia (Colombia); Schrevens, E. [Department of Biosystems, Faculty of Applied Bioscience Engineering, Katholieke Universiteit Leuven, Geo-Institute, Celestijnenlaan 200 E, 3001 Heverlee (Belgium)

    2010-05-15

    Scarce information is available about the energy use patterns of horticultural commodities in general, and of peri-urban horticulture in particular. Peri-urban horticulture in the outskirts of Bogota is an important source of vegetables for Colombia's capital city. Based on detailed follow-ups and periodic field measurements, an output-input energy balance was performed with the main objective of studying the energy use efficiency of these systems. An uncertainty analysis of the input factors and of the energy equivalents was then applied. Over a measurement period of 18 months, the energy use for coriander, lettuce, radish and spinach was investigated; 12.1, 18.8, 6.6 and 10.7 GJ ha⁻¹, respectively, were consumed in these cropping systems. Negative balances were observed for all species except spinach, for which an output:input ratio of 1.16 was found. The two-way uncertainty analysis showed the highest uncertainty for N-based fertilization, while no significant effect was observed for seeds in direct-sown crops. The sustainability of peri-urban horticulture around Bogota is compromised not only by the city's expansion but also by its inefficient energy use. Technical improvements are required to ensure the environmental subsistence of this important sector for the metropolitan area of the city. (author)

  6. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space.

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria

    2012-01-01

    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation (e.g. brain conductivity distributions) and stimulation (e.g. coil positioning and orientations) setup have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte-Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed to be "ground truth". This framework can be leveraged to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance to all applications of TMS, both investigative and therapeutic.

  7. Application of Monte Carlo Method for Evaluation of Uncertainties of ITS-90 by Standard Platinum Resistance Thermometer

    Palenčár, Rudolf; Sopkuliak, Peter; Palenčár, Jakub; Ďuriš, Stanislav; Suroviak, Emil; Halaj, Martin

    2017-06-01

    Evaluation of the uncertainties of temperature measurement by a standard platinum resistance thermometer calibrated at the defining fixed points according to ITS-90 is a problem that can be solved in different ways. The paper presents a procedure based on the propagation of distributions using the Monte Carlo method. The procedure employs generation of pseudo-random numbers for the input variables of resistances at the defining fixed points, assuming a multivariate Gaussian distribution for the input quantities. This makes it possible to take into account the correlations among resistances at the defining fixed points. The assumption of a Gaussian probability density function is acceptable with respect to the several sources of uncertainty in the resistances. In the case of uncorrelated resistances at the defining fixed points, the method is applicable to any probability density function. Validation of the law of propagation of uncertainty using the Monte Carlo method is presented using specific data for a 25 Ω standard platinum resistance thermometer in the temperature range from 0 to 660 °C. Using this example, we demonstrate the suitability of the method by validating its results.
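
    As a rough illustration of the propagation of distributions described above, the sketch below draws correlated Gaussian samples for two fixed-point resistances and pushes them through a toy resistance-ratio model. All numerical values (means, uncertainties, correlation) are invented for illustration and do not come from the paper.

```python
import numpy as np

# Hypothetical example: propagate correlated fixed-point resistance
# uncertainties through a simple measurement model by Monte Carlo.
rng = np.random.default_rng(1)

r_mean = np.array([25.000, 27.500])        # resistances at two fixed points (ohm), illustrative
u = np.array([2e-4, 2e-4])                 # standard uncertainties (ohm), illustrative
corr = np.array([[1.0, 0.8], [0.8, 1.0]])  # assumed correlation between fixed points
cov = corr * np.outer(u, u)                # covariance matrix

samples = rng.multivariate_normal(r_mean, cov, size=200_000)

# Toy output quantity: resistance ratio W, standing in for the full
# ITS-90 deviation function.
w = samples[:, 1] / samples[:, 0]

print(f"W = {w.mean():.6f}, u(W) = {w.std(ddof=1):.2e}")
lo, hi = np.percentile(w, [2.5, 97.5])     # empirical 95% coverage interval
print(f"95% interval: [{lo:.6f}, {hi:.6f}]")
```

    The empirical percentiles yield the coverage interval directly, which is the practical advantage of the Monte Carlo method over the law of propagation of uncertainty when the measurement model is nonlinear.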

  8. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed that uses the uncertainty of high-level Energy Performance Indicators (EnPIs) as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of an energy management policy and action plans; responsibility level for energy issues; employees' training and motivation with respect to energy problems; absence/presence of adequate infrastructure for monitoring and sharing energy information; level of standardization and integration of methods and procedures linked to energy activities; economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is performed. The methodology, experimentally validated, allows useful considerations to be developed for effective, realistic and economically feasible improvement plans, depending on the specific situation. Repeated application of the methodology yields a reliable, well-resolved assessment of the EMS status, also in dynamic industrial contexts.

  9. Judgment under uncertainty; a probabilistic evaluation framework for decision-making about sanitation systems in low-income countries.

    Malekpour, Shirin; Langeveld, Jeroen; Letema, Sammy; Clemens, François; van Lier, Jules B

    2013-03-30

    This paper introduces the probabilistic evaluation framework, which enables transparent and objective decision-making in technology selection for sanitation solutions in low-income countries. The probabilistic framework recognizes the often poor quality of the available data for evaluations. Within this framework, evaluations are based on the probabilities that the expected outcomes occur in practice, considering the uncertainties in the evaluation parameters. Consequently, the outcome of an evaluation is not a single point estimate but a range of possible outcomes. A first trial application of this framework to the evaluation of sanitation options in the Nyalenda settlement in Kisumu, Kenya, showed how the range of values that an evaluation parameter may take in practice influences the evaluation outcomes. In addition, as the probabilistic evaluation requires various site-specific data, a sensitivity analysis was performed to determine the influence of the quality of each data set on the evaluation outcomes. Based on that, data collection activities could be (re)directed, in a trade-off between the required investment in those activities and the resolution of the decisions that are to be made.

  10. Evaluation and uncertainties of global climate models as simulated in East Asia and China

    Zhao, Z.C.

    1994-01-01

    The performance and uncertainties of general circulation models (GCMs) as simulated in East Asia and China (15–60°N, 70–140°E) have been investigated using seven GCMs. Four methods of assessment were chosen. The variables used for validating the GCMs include the annual, seasonal and monthly mean temperatures and precipitation. The assessments indicated that: (1) the simulations of the seven GCMs for temperature are much better than those for precipitation; (2) the simulations in winter are much better than those in summer; (3) the simulations in eastern parts are much better than those in western parts, for both temperature and precipitation; (4) the best GCM for simulated temperature is the GISS model, and the best GCM for simulated precipitation is the UKMO-H model. The means of the seven GCMs for both simulated temperature and precipitation provided good results. The range of uncertainties in East Asia and China due to human activities is presented. The differences between the GCMs for temperature and precipitation before the year 2050 are much smaller than those after the year 2050

  11. Uncertainty evaluation of fluid dynamic models and validation by gamma ray transmission measurements of the catalyst flow in a FCC cold pilot unity

    Teles, Francisco A.S.; Santos, Ebenezer F.; Dantas, Carlos C., E-mail: francisco.teles@ufpe.br [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Centro de Tecnologia e Geociencias. Departamento de Energia Nuclear; Melo, Silvio B., E-mail: sbm@cin.ufpe.br [Universidade Federal de Pernambuco (CIN/UFPE), Recife, PE (Brazil). Centro de Informatica; Santos, Valdemir A. dos, E-mail: vas@unicap.br [Universidade Catolica de Pernambuco (UNICAP), Recife, PE (Brazil). Dept. de Quimica; Lima, Emerson A.O., E-mail: emathematics@gmail.com [Universidade de Pernambuco (POLI/UPE), Recife, PE (Brazil). Escola Politecnica

    2013-07-01

    In this paper, the fluid dynamics of the Fluid Catalytic Cracking (FCC) process is investigated by means of a Cold Flow Pilot Unit (CFPU) constructed in Plexiglas to visualize operational conditions. Axial and radial catalyst profiles were measured by gamma ray transmission in the riser of the CFPU. Standard uncertainty was evaluated for volumetric solid fraction measurements at several concentrations at a given point of the axial profile. Monitoring of the pressure drop in the riser shows good agreement with the measured standard uncertainty data. A further evaluation of the combined uncertainty was applied to the volumetric solid fraction equation using gamma transmission data. A limiting condition for the catalyst concentration in the riser was defined, and simulations with random numbers generated in MATLAB were used to test the uncertainty evaluation. The Guide to the Expression of Uncertainty in Measurement (GUM) is based on the law of propagation of uncertainty and on the characterization of the measured quantities by means of either a Gaussian distribution or a t-distribution, which allows measurement uncertainty to be delimited by means of a confidence interval. A variety of supplements to the GUM are being developed, which will progressively enter into effect. The first of these supplements [3] describes an alternative procedure for the calculation of uncertainties: the Monte Carlo Method (MCM). MCM is an alternative to the GUM approach, since it characterizes the measured quantities by random sampling of their probability distribution functions. This paper also explains the basic implementation of the MCM in MATLAB. (author)
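
    The paper implements the MCM in MATLAB; purely for illustration, the sketch below reproduces the same GUM Supplement 1 workflow in Python for a Beer-Lambert-type solid-fraction model. The model form and every numerical value (count rates, attenuation coefficient, density, diameter) are assumptions, not the paper's data.

```python
import numpy as np

# Minimal MCM sketch (GUM Supplement 1 style) for a Beer-Lambert-type
# solid-fraction model: eps = ln(I0/I) / (mu * rho * D).
rng = np.random.default_rng(0)
N = 1_000_000

I0  = rng.normal(12_000, 110, N)     # incident count rate (illustrative)
I   = rng.normal(10_500, 100, N)     # transmitted count rate
mu  = rng.normal(0.0858, 0.0009, N)  # mass attenuation coefficient (cm^2/g)
rho = rng.normal(1.4, 0.01, N)       # catalyst particle density (g/cm^3)
D   = rng.normal(10.0, 0.05, N)      # riser inner diameter (cm)

eps = np.log(I0 / I) / (mu * rho * D)  # volumetric solid fraction

print(f"eps = {eps.mean():.4f} +/- {eps.std(ddof=1):.4f}")
print("95% coverage interval:", np.percentile(eps, [2.5, 97.5]))
```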

  13. Input Uncertainty and its Implications on Parameter Assessment in Hydrologic and Hydroclimatic Modelling Studies

    Chowdhury, S.; Sharma, A.

    2005-12-01

    present. SIMEX is based on the theory that the trend in the alternate parameter estimates can be extrapolated back to the notional error-free zone. We illustrate the utility of SIMEX in a synthetic rainfall-runoff modelling scenario and in an application studying the dependence of uncertain distributed sea surface temperature anomalies on an indicator of the El Nino Southern Oscillation, the Southern Oscillation Index (SOI). The errors in rainfall data and their effect are explored using the Sacramento rainfall-runoff model. The rainfall uncertainty is assumed to be multiplicative and temporally invariant. The model used to relate the sea surface temperature anomalies (SSTA) to the SOI is assumed to be of linear form. The uncertainty in the SSTA is additive in nature and varies with time. The SIMEX framework allows assessment of the relationship between the error-free inputs and the response. Cook, J.R., Stefanski, L.A., Simulation-Extrapolation Estimation in Parametric Measurement Error Models, Journal of the American Statistical Association, 89 (428), 1314-1328, 1994.
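
    A minimal sketch of the SIMEX idea cited above (Cook and Stefanski 1994), on synthetic data: a regression slope is attenuated by additive input noise, re-estimated under progressively inflated noise, and the trend is extrapolated back to the notional error-free point lambda = -1. All data and settings are invented.

```python
import numpy as np

# SIMEX sketch for a slope attenuated by additive measurement error in x.
rng = np.random.default_rng(42)
n, beta_true, sigma_u = 2000, 1.0, 0.5

x_true = rng.normal(0.0, 1.0, n)
y = beta_true * x_true + rng.normal(0.0, 0.3, n)
x_obs = x_true + rng.normal(0.0, sigma_u, n)   # error-contaminated input

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
beta_hat = []
for lam in lambdas:
    est = np.mean([
        np.polyfit(x_obs + rng.normal(0, np.sqrt(lam) * sigma_u, n), y, 1)[0]
        for _ in range(50)   # average over simulated error replicates
    ])
    beta_hat.append(est)

# Extrapolate the trend in the estimates back to lambda = -1 (no error).
quad = np.polyfit(lambdas, beta_hat, 2)
print("naive slope:", beta_hat[0], " SIMEX slope:", np.polyval(quad, -1.0))
```

    The naive estimate is biased toward zero by the attenuation factor, whereas the extrapolated SIMEX estimate recovers a value close to the true slope.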

  14. Evaluating Uncertainty in GHG Emission Scenarios: Mapping IAM Outlooks With an Energy System Phase Space

    Ritchie, W. J.; Dowlatabadi, H.

    2017-12-01

    Climate change modeling relies on projections of future greenhouse gas emissions and other phenomena leading to changes in planetary radiative forcing (RF). Pathways for long-run fossil energy use that map to total forcing outcomes are commonly depicted with integrated assessment models (IAMs). IAMs structure outlooks for 21st-century emissions with various theories of developments in demographics, economics, land use, energy markets and energy service demands. These concepts are applied to understand global changes in two key factors relevant for scenarios of carbon emissions: total energy use (E) this century and the carbon intensity of that energy (F/E). A simple analytical and graphical approach can also illustrate the full range of outcomes for these variables, to determine whether IAMs provide sufficient coverage of the uncertainty space for future energy use. In this talk, we present a method for understanding uncertainties relevant to RF scenario components in a phase space. The phase space of a dynamic system represents significant factors as axes to capture the full range of physically possible states. A two-dimensional phase space of E and F/E presents the possible system states that can lead to various levels of total 21st-century carbon emissions. Once defined in this way, a phase space of these energy system coordinates allows for rapid characterization of large IAM scenario sets with machine learning techniques. This phase space method is applied to the levels of RF described by the Representative Concentration Pathways (RCPs). The resulting RCP phase space identifies characteristics of the baseline energy system outlooks provided by IAMs for IPCC Working Group III. We conduct a k-means cluster analysis to distinguish the major features of IAM scenarios for each RCP range. The cluster analysis finds that the IAM scenarios in AR5 illustrate RCPs with consistent combinations of energy resources. This suggests IAM scenarios understate uncertainty ranges for future
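
    As a rough sketch of the clustering step described above, the following groups synthetic scenario coordinates in the (E, F/E) phase space with k-means; the scenario cloud, cluster count and units are invented stand-ins for the AR5 scenario set.

```python
import numpy as np
from sklearn.cluster import KMeans

# Sketch: characterize synthetic scenarios in the (E, F/E) phase space.
rng = np.random.default_rng(3)
E = rng.uniform(30, 80, 400)    # cumulative 21st-century energy use (ZJ), invented
FE = rng.uniform(5, 25, 400)    # carbon intensity of energy (kgC/GJ), invented
X = np.column_stack([E, FE])

# Standardize axes so neither dominates the Euclidean distance.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(Xs)
for k in range(4):
    members = X[km.labels_ == k]
    print(f"cluster {k}: n={len(members)}, "
          f"mean E={members[:, 0].mean():.1f} ZJ, "
          f"mean F/E={members[:, 1].mean():.1f} kgC/GJ")
```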

  15. Bias analysis applied to Agricultural Health Study publications to estimate non-random sources of uncertainty.

    Lash, Timothy L

    2007-11-26

    The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myeloma equaled 2.6 with a 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with a 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is
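
    The sketch below illustrates the general shape of such a probabilistic bias analysis for a single unmeasured confounder: bias parameters are drawn from assigned distributions, each draw yields a bias factor used to adjust the conventional estimate, and conventional random error is layered on top. Only the conventional estimate (hazard ratio 2.6, 95% CI 0.7 to 9.4) comes from the abstract; every assigned distribution is an invented example, so the output will not reproduce the paper's numbers.

```python
import numpy as np

# Probabilistic bias analysis sketch for one unmeasured confounder.
rng = np.random.default_rng(7)
N = 100_000

rr_obs, ci_lo, ci_hi = 2.6, 0.7, 9.4           # conventional estimate (abstract)
se_log = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)

# Bias parameters: confounder-disease RR and exposure-specific prevalences,
# all drawn from assigned (invented) distributions.
rr_cd = rng.triangular(1.2, 2.0, 4.0, N)
p1 = rng.uniform(0.2, 0.6, N)                  # prevalence among exposed
p0 = rng.uniform(0.1, 0.4, N)                  # prevalence among unexposed

bias = (rr_cd * p1 + (1 - p1)) / (rr_cd * p0 + (1 - p0))

# Combine the systematic adjustment with conventional random error.
rr_adj = np.exp(np.log(rr_obs) - np.log(bias) + rng.normal(0, se_log, N))

print("median adjusted RR:", np.round(np.median(rr_adj), 2))
print("95% simulation interval:", np.round(np.percentile(rr_adj, [2.5, 97.5]), 2))
```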

  17. Evaluating uncertainty in 7Be-based soil erosion estimates: an experimental plot approach

    Blake, Will; Taylor, Alex; Abdelli, Wahid; Gaspar, Leticia; Barri, Bashar Al; Ryken, Nick; Mabit, Lionel

    2014-05-01

    Soil erosion remains a major concern for the international community and there is a growing need to improve the sustainability of agriculture to support future food security. High resolution soil erosion data are a fundamental requirement for underpinning soil conservation and management strategies but representative data on soil erosion rates are difficult to achieve by conventional means without interfering with farming practice and hence compromising the representativeness of results. Fallout radionuclide (FRN) tracer technology offers a solution since FRN tracers are delivered to the soil surface by natural processes and, where irreversible binding can be demonstrated, redistributed in association with soil particles. While much work has demonstrated the potential of short-lived 7Be (half-life 53 days), particularly in quantification of short-term inter-rill erosion, less attention has focussed on sources of uncertainty in derived erosion measurements and sampling strategies to minimise these. This poster outlines and discusses potential sources of uncertainty in 7Be-based soil erosion estimates and the experimental design considerations taken to quantify these in the context of a plot-scale validation experiment. Traditionally, gamma counting statistics have been the main element of uncertainty propagated and reported but recent work has shown that other factors may be more important such as: (i) spatial variability in the relaxation mass depth that describes the shape of the 7Be depth distribution for an uneroded point; (ii) spatial variability in fallout (linked to rainfall patterns and shadowing) over both reference site and plot; (iii) particle size sorting effects; (iv) preferential mobility of fallout over active runoff contributing areas. To explore these aspects in more detail, a plot of 4 x 35 m was ploughed and tilled to create a bare, sloped soil surface at the beginning of winter 2013/2014 in southwest UK. The lower edge of the plot was bounded by

  18. Managing uncertainty in advanced liver disease: a qualitative, multiperspective, serial interview study.

    Kimbell, Barbara; Boyd, Kirsty; Kendall, Marilyn; Iredale, John; Murray, Scott A

    2015-11-19

    Objective: To understand the experiences and support needs of people with advanced liver disease and those of their lay and professional carers, to inform improvements in the supportive and palliative care of this rapidly growing but currently neglected patient group. Design: Multiperspective, serial interviews. We conducted up to three qualitative in-depth interviews with each patient and lay carer over 12 months and single interviews with case-linked healthcare professionals. Data were analysed using grounded theory techniques. Participants: Patients with advanced liver disease of diverse aetiologies recruited from an inpatient hepatology ward, and their lay carers and case-linked healthcare professionals nominated by the patients. Setting: Primary and secondary care in South-East Scotland. Results: 37 participants (15 patients, 11 lay and 11 professional carers) completed 51 individual and 13 joint patient-carer interviews. Nine patients died during the study. Uncertainty dominated experiences throughout the course of the illness, across patients' considerable physical, psychological, social and existential needs, and affected patients, lay carers and professionals. This related to the nature of the condition, the unpredictability of physical deterioration and prognosis, poor communication and information-sharing, and the complexities of care. The pervasive uncertainty also shaped patients' and lay carers' strategies for coping and impeded care planning. While patients' acute medical care was usually well coordinated, their ongoing care lacked structure and focus. Conclusions: Living, dying and caring in advanced liver disease is dominated by pervasive, enduring and universally shared uncertainty. In the face of high levels of multidimensional patient distress, professionals must acknowledge this uncertainty in constructive ways that value its contribution to the person's coping approach. Pervasive uncertainty makes anticipatory care planning in advanced liver disease challenging, but planning 'just in case' is vital to ensure

  19. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for the model to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method to screen out insensitive parameters, followed by MARS-based Sobol' sensitivity indices to quantify each parameter's contribution to the response variance through its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search for the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and enhances the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions for reproducing the observed streamflow of all watersheds. The final optimal solutions showed significant improvement over the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40
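
    For readers unfamiliar with variance-based screening of this kind, the sketch below estimates first-order Sobol' indices by plain Monte Carlo (a pick-and-freeze estimator) on the standard Ishigami benchmark, standing in for a hydrological model; this is a generic illustration, not the paper's MARS-based procedure.

```python
import numpy as np

# First-order Sobol' indices via the pick-and-freeze estimator, on the
# Ishigami function, a standard sensitivity-analysis benchmark.
rng = np.random.default_rng(11)

def model(x):
    return (np.sin(x[:, 0]) + 7 * np.sin(x[:, 1])**2
            + 0.1 * x[:, 2]**4 * np.sin(x[:, 0]))

n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # swap one column (pick-and-freeze)
    Si = np.mean(fB * (model(ABi) - fA)) / var
    print(f"S{i+1} = {Si:.2f}")          # analytic values: ~0.31, ~0.44, 0.00
```

    In a screening setting, inputs with indices near zero (like the third one here) would be fixed at nominal values, shrinking the dimensionality of the subsequent calibration problem.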

  20. The sensitivity analysis by adjoint method for the uncertainty evaluation of the CATHARE-2 code

    Barre, F.; de Crecy, A.; Perret, C. [French Atomic Energy Commission (CEA), Grenoble (France)

    1995-09-01

    This paper presents the application of the DASM (Discrete Adjoint Sensitivity Method) to the CATHARE 2 thermal-hydraulics code. In the first part, the basis of this method is presented. The mathematical model of the CATHARE 2 code is based on the two-fluid six-equation model. It is discretized using implicit time discretization, and it is relatively easy to implement this method in the code. The DASM is the ASM applied directly to the algebraic system of the discretized code equations, which has been demonstrated to be the only solution of the mathematical model. The ASM is an integral part of the new version 1.4 of CATHARE. It acts as a post-processing module. It has been qualified by comparison with the "brute force" technique. In the second part, an application of the DASM in CATHARE 2 is presented. It deals with the determination of the uncertainties of the constitutive relationships, which is a compulsory step for calculating the final uncertainty of a given response. First, the general principles of the method are explained: the constitutive relationships are represented by several parameters, and the aim is to calculate the variance-covariance matrix of these parameters. The experimental results of the separate effect tests used to establish the correlations are considered. The variance of the corresponding results calculated by CATHARE is estimated by comparing experiment and calculation. A DASM calculation is carried out to provide the derivatives of the responses. The final covariance matrix is obtained by combining the variance of the responses and those derivatives. Then, the application of this method to a simple case, the blowdown Canon experiment, is presented. This application has been successfully performed.

  2. Evaluation of epidemiological studies

    Breckow, J.

    1995-01-01

    The publication is intended for readers with a professional background in radiation protection who are not experts in the field of epidemiology. The potentials and limits of epidemiology are shown, and the concepts and terminology of radioepidemiologic studies, as well as of epidemiology in general, are explained, in order to provide the necessary basis for understanding or performing evaluations of epidemiologic studies. (orig./VHE)

  3. Uncertainty management in Real Estate Development: Studying the potential of SCRUM design methodology

    Blokpoel, S.B.; Reymen, Isabelle; Dewulf, Geert P.M.R.; Sariyildiz, S.; Tuncer, B.

    2005-01-01

    Real estate development is all about assessing and controlling risks and uncertainties. Risk management implies making decisions based on quantified risks to execute risk-response measures. Uncertainties, on the other hand, cannot be quantified and are therefore unpredictable. In literature, much

  4. Uncertainty assessment and comparison of different dose algorithms used to evaluate a two element LiF:Mg,Ti TL personal dosemeter

    Stadtmann, H.; Hranitzky, F.C.

    2008-01-01

    This paper presents the results of an uncertainty assessment and comparison study of different dose algorithms used to evaluate our routine two-element TL whole-body dosemeter. Due to the photon energy response of the two differently filtered LiF:Mg,Ti detector elements, the application of dose algorithms is necessary to assess the relevant photon doses over the rated energy range with an acceptable energy response. Three dose algorithms are designed to calculate the dose for the different dose equivalent quantities, i.e. the personal dose equivalents Hp(10) and Hp(0.07), and the photon dose equivalent Hx used for personal monitoring before the introduction of personal dose equivalent. Based on experimental results for both free-in-air calibration and calibration on the ISO water slab phantom (type test data), a detailed uncertainty analysis was performed by means of Monte Carlo simulation techniques. The uncertainty contribution of the individual detector element signals was given special consideration. (author)

  5. Uncertainty evaluation of the kerma in the air, related to the active volume in the ionization chamber of concentric cylinders, by Monte Carlo simulation

    Lo Bianco, A.S.; Oliveira, H.P.S.; Peixoto, J.G.P.

    2009-01-01

    To establish the primary standard for the quantity air kerma for X-rays between 10 and 50 keV, the National Metrology Laboratory of Ionizing Radiations (LNMRI) must evaluate all the measurement uncertainties associated with the Victoreen chamber. Accordingly, the uncertainty in air kerma resulting from inaccuracy in the active volume of the chamber was evaluated by Monte Carlo calculation using the PENELOPE software

  6. Evaluating the uncertainties of thermal catalytic conversion in measuring atmospheric nitrogen dioxide at four differently polluted sites in China

    Xu, Zheng; Wang, Tao; Xue, L. K.; Louie, Peter K. K.; Luk, Connie W. Y.; Gao, J.; Wang, S. L.; Chai, F. H.; Wang, W. X.

    2013-09-01

    A widely used method for measuring nitrogen dioxide (NO2) in the atmosphere is the conversion of NO2 to nitric oxide (NO) on the hot surface of a molybdenum oxide (MoO) catalyst followed by the chemiluminescence detection of NO. Although it has long been recognized that this type of conversion may suffer from the positive interference of other oxidized nitrogen compounds, evaluations of such interference in the atmosphere are scarce, thus rendering it difficult to make use of a large portion of the NO2 or NOx data obtained via this method (often denoted as NO2* or NOx*). In the present study, we compared the MoO converter with a selective, more accurate photolytic approach at four differently polluted sites in China. The converter worked well at the urban site, which was greatly affected by fresh emissions, but, on average, overestimated NO2 by 30%-50% at the two suburban sites and by more than 130% at the mountain-top site during afternoon hours, with a much larger positive bias seen during the top 10% of ozone events. The degree of overestimation depended on both air-parcel age and the composition of the oxidation products/intermediates of NOx (NOz). We attempted to derive an empirical formula to correct for this overestimation using concurrently measured O3, NO, and NO2* at the two suburban sites. Although the formula worked well at each individual site, the different NOz partitions at the sites made it difficult to obtain a universal formula. In view of the difficulty of assessing the uncertainties of the conventional conversion method, thus limiting the usability of data obtained via this method in atmospheric research, we suggest that, in areas away from fresh NOx emission sources, either a more selective NO2 measurement method or a NOy (NOx and its reaction products and intermediates) instrument should be adopted.

  7. Calculation of the uncertainty of Hp(10) evaluation for a thermoluminescent dosimetry system

    Ferreira, M.S.; Silva, E.R.; Mauricio, C.L.P., E-mail: max.das.ferreira@gmail.com [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2016-07-01

    Full interpretation of a dose assessment can only be performed when the uncertainty of the measurement is known. The aim of this study is to calculate the uncertainty of the TL dosimetry system of the LDF/IRD for the evaluation of Hp(10) for photons. This was done through experimental measurements, extraction of information from documents, and calculation of uncertainties based on the ISO GUM. Energy and angular dependence is the most important contributor to the combined uncertainty u_c(y) and the expanded uncertainty U. For 10 mSv, u_c(y) = 1.99 mSv and U = 3.98 mSv were obtained for a 95% coverage interval. (author)
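
    A hedged sketch of the GUM-style arithmetic behind such figures: independent relative uncertainty components are combined in quadrature and expanded with a coverage factor k = 2 for roughly 95% coverage. The component breakdown below is invented; only the 10 mSv dose level and the k = 2 convention echo the abstract.

```python
import numpy as np

# GUM-style combination of independent relative uncertainty components.
components = {
    "energy/angular response": 0.18,   # assumed dominant, as in the abstract
    "calibration factor":      0.05,   # invented
    "reader reproducibility":  0.04,   # invented
    "fading/background":       0.03,   # invented
}
u_rel = np.sqrt(sum(v**2 for v in components.values()))
H = 10.0                               # evaluated dose (mSv)
print(f"u_c = {u_rel * H:.2f} mSv, U(k=2) = {2 * u_rel * H:.2f} mSv")
```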

  8. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be of dynamic nature. To extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation by considering diffusion based on expert beliefs with and without further research conditional on the strength of evidence. We use expected value of sample information and expected value of specific implementation measure concepts accounting for the effects of specific research studies on implementation and the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform implementation dynamics. We illustrate use of the resulting dynamic expected value of research in a preterm birth screening technology and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
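
    To make the implementation-dynamics idea concrete, the sketch below weights a constant per-patient incremental net benefit by logistic diffusion (uptake) curves elicited with and without further research, and sums the difference over a decision horizon. The curve parameters, population size and net benefit are all invented, and the full method described above also integrates the reduction-of-uncertainty (expected value of sample information) term.

```python
import numpy as np

# Implementation value of research under expert-elicited diffusion curves.
years = np.arange(1, 16)               # 15-year decision horizon (assumed)
inb_per_patient = 800.0                # incremental net benefit (GBP), invented
patients_per_year = 20_000             # invented

def diffusion(t, ceiling, rate, midpoint):
    """Logistic uptake curve: fraction of practice using the technology."""
    return ceiling / (1 + np.exp(-rate * (t - midpoint)))

base = diffusion(years, ceiling=0.4, rate=0.5, midpoint=6)    # without research
with_r = diffusion(years, ceiling=0.8, rate=0.9, midpoint=4)  # with research

value = ((with_r - base) * inb_per_patient * patients_per_year).sum()
print(f"implementation value of research over 15 y: GBP {value:,.0f}")
```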

  9. Uncertainty and Sensitivity Studies with TRACE-SUSA and TRACE-DAKOTA by Means of Transient BFBT Data

    Wadim Jaeger

    2013-01-01

    In the present paper, an uncertainty and sensitivity study is performed for transient void fraction and pressure drop measurements. Two transients have been selected from the NUPEC BFBT database. The first one is a turbine trip without bypass and the second one is a trip of a recirculation pump. TRACE (version 5.0 patch 2) is used for the thermohydraulic study, and SUSA and DAKOTA are used for the quantification of the model uncertainties and the evaluation of the sensitivities. Geometrical values such as the hydraulic diameter and wall roughness are considered as uncertain parameters, while mass flow rate, power, pressure, and inlet subcooling (inlet temperature) are chosen as boundary and input conditions. Since these parameters change with time, their importance for pressure drop and void fraction is expected to change, too. The results show that the pressure drop is mostly sensitive to geometrical variations such as the hydraulic diameter and the form loss coefficient of the spacer grid. At low void fractions, the parameter of highest importance is the inlet temperature/subcooling, while at higher void fractions the power is also of importance.

  10. Sources of uncertainty in hydrological climate impact assessment: a cross-scale study

    Hattermann, F. F.; Vetter, T.; Breuer, L.; Su, Buda; Daggupati, P.; Donnelly, C.; Fekete, B.; Flörke, F.; Gosling, S. N.; Hoffmann, P.; Liersch, S.; Masaki, Y.; Motovilov, Y.; Müller, C.; Samaniego, L.; Stacke, T.; Wada, Y.; Yang, T.; Krysanova, V.

    2018-01-01

    Climate change impacts on water availability and hydrological extremes are major concerns with regard to the Sustainable Development Goals. Impacts on hydrology are normally investigated as part of a modelling chain, in which climate projections from multiple climate models are used as inputs to multiple impact models, under different greenhouse gas emissions scenarios, which result in different amounts of global temperature rise. While the goal is generally to investigate the relevance of changes in climate for the water cycle, water resources or hydrological extremes, it is often the case that variations in other components of the model chain obscure the effect of climate scenario variation. This is particularly important when assessing the impacts of relatively lower magnitudes of global warming, such as those associated with the aspirational goals of the Paris Agreement. In our study, we use ANOVA (analysis of variance) to allocate and quantify the main sources of uncertainty in the hydrological impact modelling chain. In turn, we determine the statistical significance of different sources of uncertainty. We achieve this by using a set of five climate models and up to 13 hydrological models, for nine large-scale river basins across the globe, under four emissions scenarios. The impact variable we consider in our analysis is daily river discharge. We analyze overall water availability and flow regime, including seasonality, high flows and low flows. Scaling effects are investigated by separately looking at discharge generated by global and regional hydrological models, respectively. Finally, we compare our results with other recently published studies. We find that small differences in global temperature rise associated with some emissions scenarios have mostly significant impacts on river discharge; however, climate-model-related uncertainty is so large that it obscures the sensitivity of the hydrological system.
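
    The sketch below mimics the ANOVA allocation step on a synthetic ensemble: a 5 x 13 matrix of discharge-change signals (climate models by hydrological models) is decomposed into main-effect sums of squares. The effect sizes are invented so that the climate-model term dominates, as the abstract reports.

```python
import numpy as np

# Two-way ANOVA sum-of-squares decomposition for a balanced ensemble.
rng = np.random.default_rng(5)
g, h = 5, 13                               # 5 GCMs x 13 hydrological models
gcm_effect = rng.normal(0, 8, g)[:, None]  # strong climate-model signal (assumed)
hyd_effect = rng.normal(0, 3, h)[None, :]  # weaker impact-model signal (assumed)
dQ = 10 + gcm_effect + hyd_effect + rng.normal(0, 1, (g, h))

grand = dQ.mean()
ss_gcm = h * np.sum((dQ.mean(axis=1) - grand) ** 2)
ss_hyd = g * np.sum((dQ.mean(axis=0) - grand) ** 2)
ss_tot = np.sum((dQ - grand) ** 2)
ss_res = ss_tot - ss_gcm - ss_hyd          # interaction + noise

for name, ss in [("GCM", ss_gcm), ("hydrological model", ss_hyd),
                 ("residual", ss_res)]:
    print(f"{name}: {100 * ss / ss_tot:.0f}% of total variance")
```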

  11. Stochastic methods for uncertainty treatment of functional variables in computer codes: application to safety studies

    Nanty, Simon

    2015-01-01

    This work relates to the framework of uncertainty quantification for numerical simulators, and more precisely studies two industrial applications linked to the safety studies of nuclear plants. These two applications have several common features. The first is that the computer code inputs are functional and scalar variables, the functional ones being dependent. The second feature is that the probability distribution of the functional variables is known only through a sample of their realizations. The third feature, relative to only one of the two applications, is the high computational cost of the code, which limits the number of possible simulations. The main objective of this work was to propose a complete methodology for the uncertainty analysis of numerical simulators for the two considered cases. First, we have proposed a methodology to quantify the uncertainties of dependent functional random variables from a sample of their realizations. This methodology enables both the dependency between variables and their link to another variable, called a covariate, to be modelled; the covariate could be, for instance, the output of the considered code. Then, we have developed an adaptation of a visualization tool for functional data, which enables the uncertainties and features of dependent functional variables to be visualized simultaneously. Second, a method to perform the global sensitivity analysis of the codes used in the two studied cases has been proposed. In the case of a computationally demanding code, the direct use of quantitative global sensitivity analysis methods is intractable. To overcome this issue, the retained solution consists in building a surrogate model or metamodel, a fast-running model approximating the computationally expensive code. An optimized uniform sampling strategy for scalar and functional variables has been developed to build a learning basis for the metamodel. Finally, a new approximation approach for expensive codes with functional outputs has been

  12. Uncertainty and Cognitive Control

    Faisal Mushtaq

    2011-10-01

    A growing body of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  13. Evaluating the influence of setup uncertainties on treatment planning for focal liver tumors

    Balter, J.M.; Brock, K.K.; Lam, K.L.; Dawson, L.A.; McShan, D.L.; Ten Haken, R.K.

    2001-01-01

    Purpose: A mechanism has been developed to evaluate the influence of systematic and random setup variations on dose during treatment planning. The information available for studying these factors shifts from population-based models towards patient-specific data as treatment progresses and setup measurements for an individual patient become available. This study evaluates the influence of population as well as patient-specific setup distributions on treatment plans for focal liver tumors. Materials and Methods: 8 patients with focal liver tumors were treated on a protocol that involved online setup measurement and adjustment, as well as ventilatory immobilization. Summary statistics from these treatments yielded individual and population distributions of position at initial setup for each fraction as well as after setup adjustment. A convolution model for evaluation of the influence of random setup variation on calculated dose distributions has been previously described and investigated for application to focal liver radiotherapy by our department. Individual patient doses based on initial setup positions were calculated by applying the measured systematic offset to the initial treatment plan, and then convolving the calculated dose distribution with an anisotropic probability distribution function representing the individual patient's random variations. A separate calculation with no offset and convolution using population averaged random variations was performed. Individual beam apertures were then adjusted to provide plans that ensured proper dose to the clinical target volume (CTV) following convolution with population distributions prior to and following setup adjustment. Results: Input distributions comprised 262 position measurements. Individual patient setup distributions for the course of treatment had systematic offsets ranging from (σ) 1.1 to 4.1 mm (LR), -2.0 to 1.4 mm (AP), and 5.6 to 1.7 mm (IS). Individual random setup variations ranged from 2.5 to 5
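
    A one-dimensional sketch of the convolution model described above: the planned dose profile is blurred with a Gaussian probability density of random setup error and shifted by a systematic offset, and the expected minimum CTV dose is read off. The grid, field size, CTV margin and error magnitudes are invented.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d, shift

# Expected dose under random (convolution) and systematic (shift) setup error.
dx = 1.0                                       # grid spacing (mm)
x = np.arange(-50, 51, dx)
dose = np.where(np.abs(x) <= 25, 100.0, 0.0)   # idealized 50 mm flat field

sigma_random = 4.0                             # random setup SD (mm), invented
systematic = 3.0                               # systematic offset (mm), invented

expected = gaussian_filter1d(dose, sigma_random / dx)  # convolution step
expected = shift(expected, systematic / dx, order=1)   # systematic shift

ctv = np.abs(x) <= 20                          # CTV assumed 5 mm inside field edge
print(f"min expected CTV dose: {expected[ctv].min():.1f}% of prescription")
```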

  14. PTRACK: A particle tracking program for evaluating travel path/travel time uncertainties

    Thompson, B.M.; Campbell, J.E.; Longsine, D.E.

    1987-12-01

    PTRACK is a model that tracks the path of a radionuclide particle released from a nuclear waste repository into a ground-water flow system in a two-dimensional representation of a stratified geologic medium. The code calculates the time required for the particle to travel from the release point (the edge of the disturbed zone) to a specified horizontal or vertical boundary (the accessible environment). The physical properties of the geologic setting and the ground-water flow system can be treated as fixed values or as random variables sampled from their respective probability distributions. In the latter case, PTRACK assigns a sampled value to each parameter and tracks a particle for this trial (realization) of the system. Repeated realizations allow the effects of parameter uncertainty on travel paths/travel times to be quantified. The code can also calculate partial correlation coefficients between dependent variables and independent variables, which are useful in identifying important independent variables. This documentation describes the mathematical basis for the model, the algorithms and solution techniques used, and the computer code design. It also contains a detailed user's manual. The implementation of PTRACK is verified with several systems for which solutions have been calculated by hand. The integration of PTRACK with a Latin hypercube sampling (LHS) code is also discussed, although other sampling methods can be employed in place of LHS. 11 refs., 14 figs., 22 tabs
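
    A toy version of this workflow, as a hedged sketch: Latin hypercube samples of three uncertain properties feed a one-dimensional advective travel-time formula, and rank correlations flag the influential inputs. The parameter ranges, path length and Darcy-type model are invented stand-ins for PTRACK's two-dimensional tracking.

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# LHS samples of uncertain properties -> travel times -> rank correlations.
sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=1000)
# columns: hydraulic conductivity K (m/d), porosity n, hydraulic gradient i
lo, hi = [0.01, 0.05, 1e-4], [10.0, 0.35, 1e-2]
K, n, i = qmc.scale(u, lo, hi).T

L = 5000.0                              # distance to accessible environment (m), invented
t = L * n / (K * i) / 365.25            # advective travel time (years)

for name, p in [("K", K), ("n", n), ("i", i)]:
    rho, _ = spearmanr(p, t)
    print(f"rank correlation of {name} with travel time: {rho:+.2f}")
```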

  15. Risk and Uncertainties, Analysis and Evaluation: Lessons for Adaptation and Integration

    Yohe, G.; Dowlatabadi, H.

    1999-01-01

    This paper draws ten lessons from analyses of adaptation to climate change under conditions of risk and uncertainty: (1) Socio-economic systems will likely respond most to extreme realizations of climate change. (2) Systems have been responding to variations in climate for centuries. (3) Future change will affect future citizens and their institutions. (4) Human systems can be the sources of surprise. (5) Perceptions of risk depend upon welfare valuations that depend upon expectations. (6) Adaptive decisions will be made in response to climate change and climate change policy. (7) Analysis of adaptive decisions should recognize the second-best context of those decisions. (8) Climate change offers opportunity as well as risk. (9) All plausible futures should be explored. (10) Multiple methodological approaches should be accommodated. These lessons support two pieces of advice for the Third Assessment Report: (1) Work toward consensus, but not at the expense of thorough examination and reporting of the 'tails' of the distributions of the future. (2) Integrated assessment is only one unifying methodology; others that can better accommodate those tails should be encouraged and embraced. 12 refs

  16. Strontium isotopes and the reconstruction of the Chaco regional system: evaluating uncertainty with Bayesian mixing models.

    Brandon Lee Drake

    Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the west. In the present manuscript, these results were tested using both frequentist methods (to determine if geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's prehistoric maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material remains ambiguous.
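
    To show why overlapping sources leave attribution ambiguous, here is a hedged two-end-member Bayesian mixing sketch on a grid: with closely spaced, uncertain source means, the posterior on the mixing fraction stays wide. All isotope ratios and uncertainties are invented, and real mixing models (including the study's) also weight by Sr concentration and handle more than two sources.

```python
import numpy as np

# Grid-based posterior for a two-source mixing fraction.
mu_a, sd_a = 0.7090, 0.0004   # source A: mean, SD of 87Sr/86Sr (invented)
mu_b, sd_b = 0.7098, 0.0004   # source B: note the overlap with A (invented)
obs, sd_obs = 0.7094, 0.0002  # measured artifact ratio and its uncertainty

f = np.linspace(0, 1, 1001)   # fraction from source A (uniform prior)
mu_mix = f * mu_a + (1 - f) * mu_b
sd_mix = np.sqrt((f * sd_a)**2 + ((1 - f) * sd_b)**2 + sd_obs**2)

# Posterior on the grid: Gaussian likelihood times flat prior, normalized.
post = np.exp(-0.5 * ((obs - mu_mix) / sd_mix)**2) / sd_mix
post /= post.sum()

cdf = np.cumsum(post)
lo, hi = f[np.searchsorted(cdf, 0.025)], f[np.searchsorted(cdf, 0.975)]
print(f"posterior 95% interval for source-A fraction: [{lo:.2f}, {hi:.2f}]")
# With overlapping sources the interval stays wide: attribution is ambiguous.
```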

  17. Examining uncertainties in the linkage between global climate change and potential human health impacts in the western USA -- Hexachlorobenzene (HCB) as a case study

    McKone, T.E.; Daniels, J.I. [Lawrence Livermore National Lab., CA (United States); Goldman, M. [Univ. of California, Davis, CA (United States)

    1994-09-30

    Industrial societies have altered the earth's environment in ways that could have important, long-term ecological, economic, and health implications. In this paper the authors define, characterize, and evaluate parameter and outcome uncertainties using a model that links global climate change with predictions of chemical exposure and human health risk in the western region of the US. They illustrate the impact of uncertainty about global climate change on such potential secondary outcomes using as a case study the public health consequences of the environmental behavior of hexachlorobenzene (HCB), a ubiquitous multimedia pollutant. They begin by constructing a matrix that reveals the linkage between global environmental change and potential regional human-health effects that might be induced directly and/or indirectly by HCB released into the air and water. This matrix is useful for translating critical uncertainties into terms that can be understood and used by policy makers to formulate strategies against potentially adverse, irreversible health and economic consequences. Specifically, the authors employ a combined uncertainty/sensitivity analysis to investigate how HCB, once released, is affected by increasing atmospheric temperature and the accompanying anticipated climate alterations, and how such uncertainty propagates to affect the expected magnitude and calculational precision of estimates of the associated potential human exposures and health effects.

  18. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty

    Flores Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2008-01-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (e.g. economic, environmental, technical and legal) must be taken into account at the same time, i.e. the evaluation of the alter

  19. Multi-criteria group decision making for evaluating the performance of e-waste recycling programs under uncertainty.

    Wibowo, Santoso; Deng, Hepu

    2015-06-01

    This paper presents a multi-criteria group decision making approach for effectively evaluating the performance of e-waste recycling programs under uncertainty in an organization. Intuitionistic fuzzy numbers are used to adequately represent the subjective and imprecise assessments of the decision makers in evaluating the relative importance of the evaluation criteria and the performance of individual e-waste recycling programs with respect to individual criteria in a given situation. An interactive fuzzy multi-criteria decision making algorithm is developed to facilitate consensus building in a group decision making environment, ensuring that all the interests of individual decision makers are appropriately considered in evaluating alternative e-waste recycling programs with respect to their corporate sustainability performance. The developed algorithm is then incorporated into a multi-criteria decision support system that makes the overall performance evaluation process effective and simple to use. Such a multi-criteria decision making system provides organizations with a proactive mechanism for incorporating the concept of corporate sustainability into their regular planning decisions and business practices. An example is presented to demonstrate the applicability of the proposed approach in evaluating the performance of e-waste recycling programs in organizations.
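
    As a small illustration of how intuitionistic fuzzy assessments can be aggregated, the sketch below applies the standard intuitionistic fuzzy weighted averaging (IFWA) operator to one program's ratings on three criteria; the ratings, weights and score function are invented examples, not the paper's algorithm.

```python
import numpy as np

# IFWA aggregation of intuitionistic fuzzy ratings (mu = membership,
# nu = non-membership) for one e-waste recycling program.
ratings = np.array([   # three criteria, each rated as (mu, nu); invented
    [0.7, 0.2],
    [0.5, 0.3],
    [0.8, 0.1],
])
w = np.array([0.5, 0.3, 0.2])          # criteria weights, sum to 1 (invented)

mu = 1 - np.prod((1 - ratings[:, 0]) ** w)   # IFWA membership
nu = np.prod(ratings[:, 1] ** w)             # IFWA non-membership
score = mu - nu                              # simple score function for ranking
print(f"aggregate: mu={mu:.3f}, nu={nu:.3f}, score={score:.3f}")
```

    Programs can then be ranked by the score function; ties would fall back on the accuracy function mu + nu, following common intuitionistic fuzzy practice.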

  20. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system. The programs con

  1. Ecological forecasting under climatic data uncertainty: a case study in phenological modeling

    Cook, Benjamin I; Terando, Adam; Steiner, Allison

    2010-01-01

    Forecasting ecological responses to climate change represents a challenge to the ecological community because models are often site-specific and climate data are lacking at appropriate spatial and temporal resolutions. We use a case study approach to demonstrate uncertainties in ecological predictions related to the driving climatic input data. We use observational records, derived observational datasets (e.g. interpolated observations from local weather stations and gridded data products) and output from general circulation models (GCMs) in conjunction with site-based phenology models to estimate the first flowering date (FFD) for three woody flowering species. Using derived observations over the modern time period, we find that cold biases and temperature trends lead to biased FFD simulations for all three species. Observational datasets resolved at the daily time step result in better FFD predictions than simulations using monthly resolution. Simulations using output from an ensemble of GCMs and regional climate models over modern and future time periods have large intra-ensemble spreads and tend to underestimate observed FFD trends for the modern period. These results indicate that certain forcing datasets may be missing key features needed to generate accurate hindcasts at the local scale (e.g. trends, temporal resolution), and that standard modeling techniques (e.g. downscaling, ensemble means) may not necessarily improve the prediction of the ecological response. Studies attempting to simulate local ecological processes under modern and future climate forcing therefore need to quantify and propagate the climate data uncertainties in their simulations.
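
    As a hedged illustration of a site-based phenology model of the kind used in such studies, the sketch below implements a toy growing degree-day model for first flowering date and shows how a cold bias in the forcing temperatures shifts the prediction; the base temperature and forcing requirement are invented, not the study's calibrated values.

```python
# Toy growing degree-day (GDD) phenology model: flowering is predicted on
# the first day the accumulated warmth above a base temperature crosses a
# species-specific forcing requirement. Parameter values are invented.
import numpy as np

def first_flowering_day(daily_tmean_c, t_base=5.0, forcing_req=150.0,
                        start_doy=1):
    """Day of year when accumulated GDD first exceeds the requirement."""
    gdd = np.cumsum(np.maximum(daily_tmean_c - t_base, 0.0))
    hits = np.nonzero(gdd >= forcing_req)[0]
    return start_doy + int(hits[0]) if hits.size else None

# A cold bias in the driving temperatures (as found for some derived
# observational products) delays the predicted FFD:
rng = np.random.default_rng(0)
temps = 10.0 + 8.0 * np.sin(np.linspace(-1.5, 1.5, 180)) \
        + rng.normal(0, 2, 180)
print(first_flowering_day(temps))          # unbiased forcing
print(first_flowering_day(temps - 1.0))    # 1 degC cold bias -> later FFD
```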

  2. Assessment of the uncertainty associated with systematic errors in digital instruments: an experimental study on offset errors

    Attivissimo, F; Giaquinto, N; Savino, M; Cataldo, A

    2012-01-01

    This paper deals with the assessment of the uncertainty due to systematic errors, particularly in A/D conversion-based instruments. The problem of defining and assessing systematic errors is briefly discussed, and the conceptual scheme of gauge repeatability and reproducibility is adopted. A practical example regarding the evaluation of the uncertainty caused by the systematic offset error is presented. The experimental results, obtained under various ambient conditions, show that modelling the variability of systematic errors is more problematic than the ISO 5725 standard suggests. Additionally, the paper demonstrates the substantial difference between the type B uncertainty evaluation, obtained by applying the maximum entropy principle to the manufacturer's specifications, and the type A (experimental) uncertainty evaluation, which reflects the actually observable behaviour. Although it is reasonable to assume a uniform distribution of the offset error, experiments demonstrate that the distribution is not centred and that a correction must be applied. In this context, the work motivates a more pragmatic and experimental approach to uncertainty than the directions of Supplement 1 to the GUM. (paper)
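
    The contrast between the two evaluations can be made concrete. Assuming a manufacturer's specification of ±a with no other information, maximum entropy yields a uniform distribution, so u_B = a/√3, while the type A value is the experimental standard deviation of repeated readings; the readings below are hypothetical, not the paper's data.

```python
# Sketch contrasting the two uncertainty evaluations discussed above.
# Type B: uniform distribution on [-a, +a] from the spec => u_B = a/sqrt(3).
# Type A: experimental standard deviation of repeated offset readings.
import numpy as np

def type_b_uniform(half_width_a):
    """Standard uncertainty of a quantity assumed uniform on [-a, +a]."""
    return half_width_a / np.sqrt(3.0)

def type_a(readings):
    """Experimental standard deviation of the mean of repeated readings."""
    readings = np.asarray(readings, dtype=float)
    return readings.std(ddof=1) / np.sqrt(readings.size)

# Hypothetical offset readings (in LSB) from repeated zero-input
# measurements; note the non-zero mean: a correction, not just an
# uncertainty, is needed, as the paper's experiments indicate.
offsets = [0.41, 0.38, 0.44, 0.40, 0.39, 0.43, 0.42, 0.37]
print("u_B =", type_b_uniform(0.5))        # spec: offset within +/-0.5 LSB
print("mean offset =", np.mean(offsets), " u_A =", type_a(offsets))
```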

  3. Uncertainty analysis

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters; for this purpose, the Latin hypercube sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b, it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
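
    A minimal sketch of the Latin hypercube workflow the abstract refers to — sample, run the model, rank-transform, then regress on ranks — assuming SciPy's `scipy.stats.qmc` sampler; the two-parameter toy model is invented, and a plain least-squares fit on standardized ranks stands in for full stepwise regression.

```python
# Latin hypercube sampling followed by a rank transformation and a
# regression on ranks, to rank each input parameter's contribution to
# output uncertainty.
import numpy as np
from scipy.stats import qmc, rankdata

def toy_model(x):
    """Stand-in model: strongly monotone in x1, weakly in x2."""
    return np.exp(3.0 * x[:, 0]) + 0.5 * x[:, 1]

sampler = qmc.LatinHypercube(d=2, seed=42)
X = sampler.random(n=200)                  # inputs sampled in [0, 1)
y = toy_model(X)

# Rank transformation linearizes monotone (even nonlinear) relationships.
Xr = np.column_stack([rankdata(X[:, j]) for j in range(X.shape[1])])
yr = rankdata(y)
Xr = (Xr - Xr.mean(0)) / Xr.std(0)         # standardize the ranks
yr = (yr - yr.mean()) / yr.std()
coef, *_ = np.linalg.lstsq(Xr, yr, rcond=None)
print("standardized rank regression coefficients:", coef)
```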

  4. Uncertainty analysis guide

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, for example through simplifications and conservatisms. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
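
    As a minimal sketch (not the guide's prescribed procedure), the following propagates input uncertainties through a toy model by Monte Carlo and combines the resulting random component with an assumed systematic component in quadrature; the model and all numerical values are illustrative.

```python
# Monte Carlo propagation of input uncertainties: sample the uncertain
# inputs, push them through the model, and read the random component of
# output uncertainty off the sample; a separately estimated systematic
# component is then combined in quadrature.
import numpy as np

rng = np.random.default_rng(1)

def model(k, area, dT):
    return k * area * dT           # toy relation standing in for a code

N = 100_000
k    = rng.normal(0.60, 0.03, N)   # input means and standard uncertainties
area = rng.normal(2.00, 0.05, N)
dT   = rng.normal(15.0, 0.50, N)

q = model(k, area, dT)
u_random = q.std(ddof=1)           # random uncertainty from input spread
u_systematic = 0.4                 # e.g. estimated from model simplifications
u_combined = np.hypot(u_random, u_systematic)
print(f"q = {q.mean():.2f}, u_random = {u_random:.2f}, "
      f"u_combined = {u_combined:.2f}")
```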

  5. Evaluation of uncertainties of key neutron parameters of PWR-type reactors with slab fuel, application to neutronic conformity; Determination des incertitudes liees aux grandeurs neutroniques d'interet des reacteurs a eau pressurisee a plaques combustibles et application aux etudes de conformite

    Bernard, D

    2001-12-01

    The aim of this thesis was to evaluate the uncertainties of key neutron parameters of slab reactors. The uncertainty sources have several origins: technological, for fabrication parameters, and physical, for nuclear data. First, each contribution to the uncertainty is calculated; an uncertainty factor is then associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application to neutronic conformity concerned the adjustment of target precisions for fabrication parameters and nuclear data. Both statistical (uncertainties) and deterministic (deviations) approaches were studied. The uncertainties of the key slab parameters were thereby reduced, and nuclear performance was optimized. (author)
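
    A common first-order way to turn sensitivities and data covariances into such uncertainty factors is the "sandwich rule", sketched below with invented numbers; the thesis's actual sensitivities and covariance data are not given in this record.

```python
# "Sandwich rule" for first-order uncertainty propagation: with relative
# sensitivities S of a response R to data x (dR/R per dx/x), and a relative
# covariance matrix C of the data, the relative variance of R is S^T C S.
# All numbers below are invented for demonstration.
import numpy as np

S = np.array([0.8, -0.3, 0.1])             # sensitivities to three parameters
C = np.array([[4.0e-4, 1.0e-4, 0.0],       # relative covariance of the data
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

rel_var = S @ C @ S
print(f"relative uncertainty on R: {np.sqrt(rel_var):.2%}")
```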

  6. SU-F-T-115: Uncertainty in the Esophagus Dose in Retrospective Epidemiological Study of Breast Cancer Radiotherapy Patients

    Mosher, E; Kim, S; Lee, C [Division of Cancer Epidemiology and Genetics, National Cancer Institute, Rockville, MD (United States)]; Lee, C [Department of Radiation Oncology, University of Michigan, Ann Arbor, MI (United States)]; Pelletier, C; Jung, J [Department of Physics, East Carolina University, Greenville, NC (United States)]; Jones, E [Radiology and Imaging Sciences Clinical Center, National Institutes of Health, Bethesda, MD (United States)]

    2016-06-15

    Purpose: Epidemiological studies of second cancer risks in breast cancer radiotherapy patients often use generic patient anatomy to reconstruct normal tissue doses when CT images of patients are not available. To evaluate the uncertainty involved in this dosimetry approach, we evaluated the esophagus dose in five sample patients by simulating breast cancer treatments. Methods: We obtained the diagnostic CT images of five anonymized adult female patients in different Body Mass Index (BMI) categories (16–36 kg/m²) from the National Institutes of Health Clinical Center. We contoured the esophagus on the CT images and imported them into a Treatment Planning System (TPS) to create treatment plans and calculate esophagus doses. The esophagus dose was calculated once again with an experimentally validated Monte Carlo (MC) transport code, XVMC, under the same geometries, and the esophagus doses from the TPS and the MC method were compared. We also investigated the degree of variation in the esophagus dose across the five patients and the relationship between patient characteristics and the esophagus doses. Results: The Eclipse TPS using the Analytical Anisotropic Algorithm (AAA) significantly underestimated the esophagus dose in breast cancer radiotherapy compared to MC; in the worst case, the esophagus dose from AAA was only 40% of the MC dose. The coefficient of variation across the patients was 48%, and the maximum esophagus dose was up to 2.7 times greater than the minimum. We also observed a linear relationship (Dose = 0.0218 × BMI − 0.1, R² = 0.54) between patient BMI and the esophagus dose. Conclusion: We quantified the degree of uncertainty in the esophagus dose in five sample breast radiotherapy patients. The results underscore the importance of individualized dose reconstruction for the study cohort to avoid dose misclassification in the risk analysis of second cancer. We are currently extending the number of patients to 30.
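
    The two summary statistics reported — the coefficient of variation across patients and the linear dose–BMI fit — can be reproduced mechanically as sketched below; the five (BMI, dose) pairs are placeholders, not the study's data.

```python
# Coefficient of variation of the esophagus dose across patients, and an
# ordinary least-squares fit of dose against BMI. The (BMI, dose) pairs
# are invented placeholders for illustration only.
import numpy as np

bmi  = np.array([16.0, 21.0, 25.0, 30.0, 36.0])   # kg/m^2
dose = np.array([0.20, 0.35, 0.48, 0.62, 0.66])   # Gy (hypothetical)

cv = dose.std(ddof=1) / dose.mean()               # coefficient of variation
slope, intercept = np.polyfit(bmi, dose, 1)       # linear dose-BMI model
r2 = np.corrcoef(bmi, dose)[0, 1] ** 2
print(f"CV = {cv:.0%}; dose = {slope:.4f} * BMI + {intercept:.2f}; "
      f"R^2 = {r2:.2f}")
```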
