WorldWideScience

Sample records for major uncertainties remain

  1. Calibration of C-14 dates: some remaining uncertainties and limitations

    Burleigh, R.

    1975-01-01

    A brief review is presented of the interpretation of radiocarbon dates in terms of calendar years. An outline is given of the factors that make such correlations necessary and of the work that has so far been done to make them possible. The calibration of the C-14 timescale very largely depends at present on the bristlecone pine chronology, but it is clear that many detailed uncertainties still remain. These are discussed. (U.K.)

  2. Majorization uncertainty relations for mixed quantum states

    Puchała, Zbigniew; Rudnicki, Łukasz; Krawiec, Aleksandra; Życzkowski, Karol

    2018-04-01

    Majorization uncertainty relations are generalized for an arbitrary mixed quantum state ρ of a finite size N. In particular, a lower bound for the sum of two entropies characterizing the probability distributions corresponding to measurements with respect to two arbitrary orthogonal bases is derived in terms of the spectrum of ρ and the entries of a unitary matrix U relating both bases. The results obtained can also be formulated for two measurements performed on a single subsystem of a bipartite system described by a pure state, and consequently expressed as an uncertainty relation for the sum of conditional entropies.
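
For reference, the pure-state entropic bound that such majorization relations strengthen is the Maassen-Uffink relation; a sketch of its standard form is below (the mixed-state bounds of this abstract add spectrum-dependent corrections not shown here):

```latex
% Maassen-Uffink entropic uncertainty relation for two orthonormal
% bases {|a_j>}, {|b_k>} related by a unitary U with entries U_{jk}:
H(p) + H(q) \;\ge\; -2 \log c, \qquad
c = \max_{j,k} \bigl| \langle a_j | b_k \rangle \bigr| = \max_{j,k} |U_{jk}|
```

where H(p) and H(q) are the Shannon entropies of the two measurement outcome distributions; for a mixed state ρ the abstract's bounds depend additionally on the spectrum of ρ.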

  3. Road safety: serious injuries remain a major unsolved problem.

    Beck, Ben; Cameron, Peter A; Fitzgerald, Mark C; Judson, Rodney T; Teague, Warwick; Lyons, Ronan A; Gabbe, Belinda J

    2017-09-18

To investigate temporal trends in the incidence, mortality, disability-adjusted life-years (DALYs), and costs of health loss caused by serious road traffic injury. A retrospective review of data from the population-based Victorian State Trauma Registry and the National Coronial Information System on road traffic-related deaths (pre- and in-hospital) and major trauma (Injury Severity Score > 12) during 2007-2015. Main outcomes and measures: temporal trends in the incidence of road traffic-related major trauma, mortality, DALYs, and costs of health loss, by road user type. There were 8066 hospitalised road traffic major trauma cases and 2588 road traffic fatalities in Victoria over the 9-year study period. There was no change in the incidence of hospitalised major trauma for motor vehicle occupants (incidence rate ratio [IRR] per year, 1.00; 95% CI, 0.99-1.01; P = 0.70), motorcyclists (IRR, 0.99; 95% CI, 0.97-1.01; P = 0.45) or pedestrians (IRR, 1.00; 95% CI, 0.97-1.02; P = 0.73), but the incidence for pedal cyclists increased 8% per year (IRR, 1.08; 95% CI, 1.05-1.10). The cost of health loss from road traffic injuries exceeded $14 billion during 2007-2015, although the cost per patient declined for all road user groups. As serious injury rates have not declined, current road safety targets will be difficult to meet. Greater attention to preventing serious injury is needed, as is further investment in road safety, particularly for pedal cyclists.
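
As a quick sketch of the incidence-rate-ratio arithmetic reported in this abstract (the actual study fitted regression models; this is only the textbook two-period Poisson approximation, with hypothetical counts):

```python
import math

def irr_with_ci(cases_1, py_1, cases_2, py_2, z=1.96):
    """Incidence rate ratio comparing period 2 to period 1, with a
    Wald-type 95% CI computed on the log scale (Poisson counts)."""
    irr = (cases_2 / py_2) / (cases_1 / py_1)
    se_log = math.sqrt(1 / cases_1 + 1 / cases_2)  # SE of log(IRR)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# hypothetical counts: 100 cases per 1e6 person-years, then 108
irr, lo, hi = irr_with_ci(100, 1_000_000, 108, 1_000_000)
```

With these invented counts the point estimate is 1.08, matching the 8%-per-year figure in form only; the study's CIs come from far larger case numbers.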

  4. Remaining uncertainties in predicting long-term performance of nuclear waste glass from experiments

    Grambow, B.

    1994-01-01

The current knowledge on the glass dissolution mechanism and the representation of glass dissolution concepts within overall repository performance assessment models are briefly summarized, and uncertainties related to mechanism, radionuclide chemistry and parameters are discussed. Understanding of the major glass dissolution processes has increased significantly in recent years. Long-term glass stability is related to the long-term maintenance of silica-saturated conditions. The behavior of individual radionuclides in the presence of a dissolving glass has not been sufficiently studied, and results do not yet allow meaningful predictions. Conservative long-term predictions of glass matrix dissolution, as an upper limit for radionuclide release, can be made with sufficient confidence; however, these estimations generally result in a situation where the barrier function of the glass is masked by the efficiency of the geologic barrier. Realistic long-term predictions may show that the borosilicate waste glass contributes to overall repository safety to a much larger extent than indicated by overconservatism. Today, realistic predictions remain highly uncertain and much more research work is necessary. In particular, the long-term rate under silica-saturated conditions needs to be understood, and the behavior of individual radionuclides in the presence of a dissolving glass deserves more systematic investigation

  5. Biomass Burning: Major Uncertainties, Advances, and Opportunities

    Yokelson, R. J.; Stockwell, C.; Veres, P. R.; Hatch, L. E.; Barsanti, K. C.; Liu, X.; Huey, L. G.; Ryerson, T. B.; Dibb, J. E.; Wisthaler, A.; Müller, M.; Alvarado, M. J.; Kreidenweis, S. M.; Robinson, A. L.; Toon, O. B.; Peischl, J.; Pollack, I. B.

    2014-12-01

Domestic and open biomass burning are poorly understood major influences on Earth's atmosphere, composed of countless individual fires that (along with their products) are difficult to quantify spatially and temporally. Each fire is a minimally controlled complex phenomenon producing a diverse suite of gases and aerosols that experience many different atmospheric processing scenarios. New lab, airborne, and space-based observations, along with model and algorithm development, are significantly improving our knowledge of biomass burning. Several campaigns provided new detailed emissions profiles for previously undersampled fire types (including wildfires, cooking fires, peat fires, and agricultural burning) which may increase in importance with climate change and rising population. Multiple campaigns have better characterized black and brown carbon and used new instruments such as high-resolution PTR-TOF-MS and 2D-GC/TOF-MS to improve quantification of semi-volatile precursors to aerosol and ozone. The aerosol evolution and the formation of PAN and ozone within hours after emission have now been measured extensively. The NASA DC-8 sampled smoke before and after cloud processing in two campaigns. The DC-8 also performed continuous intensive sampling of a wildfire plume from the source in California to Canada, probing multi-day aerosol and trace gas aging. Night-time plume chemistry has now been measured in detail. Fire inventories are being compared and improved, as is the modeling of mass transfer between phases and sub-grid photochemistry for global models.

  6. The CRC 20 years: An overview of some of the major achievements and remaining challenges.

    Doek, Jaap E

    2009-11-01

On 20 November 1989, the General Assembly of the United Nations adopted the Convention on the Rights of the Child (CRC). It entered into force on 2 September 1990 and has by now been ratified by 193 States, making it the most universally ratified human rights treaty. This overview will present and discuss the impact of this treaty at both the international and the national level; it necessarily has to be limited to some of the developments resulting from the implementation of the CRC. The first part of this paper will be devoted to the impact the CRC had and still has on the setting and development of the international agenda for the promotion and protection of the rights and welfare of children. Special attention will be given to developments, achievements, and remaining challenges at the international level with regard to the protection of children in armed conflict, the prevention and protection of children from sexual exploitation, and their protection from all forms of violence. This will include some information on the impact of these international developments and actions at the national level, for example, in the area of legislation. The second part will focus on the impact at the national level. Given the wide scope of the CRC, this part will be limited to some of the General Measures of Implementation (law reform, national programmes, and independent monitoring) and the General Principles (non-discrimination, best interests, right to be heard) of the CRC. This will be based on reports of States on the implementation of the CRC submitted to the CRC Committee, the Concluding Observations of this Committee, and a number of studies. The conclusion will provide remarks on poverty as one of the major remaining challenges for the implementation of children's rights.

  7. Enhanced tumor growth in the remaining lung after major lung resection.

    Sano, Fumiho; Ueda, Kazuhiro; Murakami, Junichi; Hayashi, Masataro; Nishimoto, Arata; Hamano, Kimikazu

    2016-05-01

    Pneumonectomy induces active growth of the remaining lung in order to compensate for lost lung tissue. We hypothesized that tumor progression is enhanced in the activated local environment. We examined the effects of mechanical strain on the activation of lung growth and tumor progression in mice. The mechanical strain imposed on the right lung after left pneumonectomy was neutralized by filling the empty space that remained after pneumonectomy with a polypropylene prosthesis. The neutralization of the strain prevented active lung growth. According to an angiogenesis array, stronger monocyte chemoattractant protein-1 (MCP-1) expression was found in the strain-induced growing lung. The neutralization of the strain attenuated the release of MCP-1 from the lung cells. The intravenous injection of Lewis lung cancer cells resulted in the enhanced development of metastatic foci in the strain-induced growing lung, but the enhanced development was canceled by the neutralization of the strain. An immunohistochemical analysis revealed the prominent accumulation of tumor-associated macrophages in tumors arising in the strain-induced growing lung, and that there was a relationship between the accumulation and the MCP-1 expression status. Our results suggested that mechanical lung strain, induced by pulmonary resection, triggers active lung growth, thereby creating a tumor-friendly environment. The modification of that environment, as well as the minimizing of surgical stress, may be a meaningful strategy to improve the therapeutic outcome after lung cancer surgery. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Investigation of the remaining major and trace elements in clean coal generated by organic solvent extraction

    Jie Wang; Chunqi Li; Kinya Sakanishi; Tetsuya Nakazato; Hiroaki Tao; Toshimasa Takanohashi; Takayuki Takarada; Ikuo Saito [National Institute Advanced Industrial Science and Technology (AIST), Ibaraki (Japan). Energy Technology Research Institute

    2005-09-01

    A sub-bituminous Wyodak coal (WD coal) and a bituminous Illinois No. 6 coal (IL coal) were thermally extracted with 1-methylnaphthalene (1-MN) and N-methyl-2-pyrrolidone (NMP) to produce clean extract. A mild pretreatment with acetic acid was also carried out. Major and trace inorganic elements in the raw coals and resultant extracts were determined by means of inductively coupled plasma optical emission spectrometry (ICP-OES), flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS), and cold vapor atomic absorption spectrometry (CV-AAS). It was found that the extraction with 1-MN resulted in 73-100% reductions in the concentration of Li, Be, V, Ga, As, Se, Sr, Cd, Ba, Hg, and Pb. The extraction with NMP yielded more extract than that with 1-MN, but it retained more organically associated major and trace metals in the extracts. In the extraction of WD coal with NMP, the acid pretreatment not only significantly enhanced the extraction yield but also significantly reduced the concentrations of alkaline earth elements such as Be, Ca, Mg, Sr, and Ba in the extract. In addition, the modes of occurrence of trace elements in the coals were discussed according to their extraction behaviors. 30 refs., 2 figs., 5 tabs.

  9. Intolerance of uncertainty mediates reduced reward anticipation in major depressive disorder.

    Nelson, Brady D; Shankman, Stewart A; Proudfit, Greg H

    2014-04-01

    Reduced reward sensitivity has long been considered a fundamental deficit of major depressive disorder (MDD). One way this deficit has been measured is by an asymmetry in electroencephalogram (EEG) activity between left and right frontal brain regions. MDD has been associated with a reduced frontal EEG asymmetry (i.e., decreased left relative to right) while anticipating reward. However, the mechanism (or mediator) of this association is unclear. The present study examined whether intolerance of uncertainty (IU) mediated the association between depression and reduced reward anticipation. Data were obtained from a prior study reporting reduced frontal EEG asymmetry while anticipating reward in early-onset MDD. Participants included 156 individuals with early-onset MDD-only, panic disorder-only, both (comorbids), or controls. Frontal EEG asymmetry was recorded during an uncertain reward anticipation task. Participants completed a self-report measure of IU. All three psychopathology groups reported greater IU relative to controls. Across all participants, greater IU was associated with a reduced frontal EEG asymmetry. Furthermore, IU mediated the relationship between MDD and frontal EEG asymmetry and results remained significant after controlling for neuroticism, suggesting effects were not due to broad negative affectivity. MDD participants were limited to those with early-onset depression. Measures were collected cross-sectionally, precluding causal relationships. IU mediated the relationship between MDD and reduced reward anticipation, independent of neuroticism. Explanations are provided regarding how IU may contribute to reduced reward anticipation in depression. Overall, IU appears to be an important mechanism for the association between depression and reduced reward anticipation. Copyright © 2014 Elsevier B.V. All rights reserved.
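
The frontal EEG asymmetry score referred to here is conventionally derived from alpha-band power; a minimal sketch, assuming the common ln(right) − ln(left) convention (the power values are hypothetical, and the study's exact preprocessing is not reproduced):

```python
import math

def frontal_alpha_asymmetry(left_alpha_power, right_alpha_power):
    """Conventional asymmetry index: ln(right) - ln(left) alpha power.
    Alpha power is inversely related to cortical activity, so a higher
    score indicates greater relative left-frontal activity, which is
    associated with approach/reward motivation."""
    return math.log(right_alpha_power) - math.log(left_alpha_power)

# hypothetical alpha power values (arbitrary units)
score = frontal_alpha_asymmetry(2.0, 3.0)
```

A reduced asymmetry while anticipating reward, as in the MDD group, corresponds to a lower value of this index.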

  10. Uncertainties

To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances involved is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  11. Uncertainty

    Silva, T.A. da

    1988-01-01

A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and the International Committee for Weights and Measures (CIPM) is presented, for the calibration of clinical dosimeters in a Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt]
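
The CIPM recommendation underlying this comparison (later codified in the GUM) combines independent standard-uncertainty components in quadrature and expands the result with a coverage factor; a minimal sketch with hypothetical component values:

```python
import math

def combined_expanded_uncertainty(std_uncertainties, k=2):
    """Combine independent standard uncertainty components in
    quadrature (CIPM/GUM approach) and expand with coverage factor k
    (k = 2 gives roughly 95% coverage for a normal distribution)."""
    u_c = math.sqrt(sum(u * u for u in std_uncertainties))
    return u_c, k * u_c

# hypothetical components (%): type A repeatability, calibration
# certificate, chamber positioning
u_c, U = combined_expanded_uncertainty([0.3, 0.4, 1.2])
```

Note how the largest component dominates the quadrature sum, which is why dosimetry uncertainty budgets focus effort on the biggest terms.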

  12. Medicare covers the majority of FDA-approved devices and Part B drugs, but restrictions and discrepancies remain.

    Chambers, James D; May, Katherine E; Neumann, Peter J

    2013-06-01

    The Food and Drug Administration (FDA) and Medicare use different standards to determine, first, whether a new drug or medical device can be marketed to the public and, second, if the federal health insurance program will pay for use of the drug or device. This discrepancy creates hurdles and uncertainty for drug and device manufacturers. We analyzed discrepancies between FDA approval and Medicare national coverage determinations for sixty-nine devices and Part B drugs approved during 1999-2011. We found that Medicare covered FDA-approved drugs or devices 80 percent of the time. However, Medicare often added conditions beyond FDA approval, particularly for devices and most often restricting coverage to patients with the most severe disease. In some instances, Medicare was less restrictive than the FDA. Our findings highlight the importance for drug and device makers of anticipating Medicare's needs when conducting clinical studies to support their products. Our findings also provide important insights for the FDA's and Medicare's pilot parallel review program.

  13. Empirical estimates to reduce modeling uncertainties of soil organic carbon in permafrost regions: a review of recent progress and remaining challenges

    Mishra, U; Jastrow, J D; Matamala, R; Fan, Z; Miller, R M; Hugelius, G; Kuhry, P; Koven, C D; Riley, W J; Harden, J W; Ping, C L; Michaelson, G J; McGuire, A D; Tarnocai, C; Schaefer, K; Schuur, E A G; Jorgenson, M T; Hinzman, L D

    2013-01-01

    The vast amount of organic carbon (OC) stored in soils of the northern circumpolar permafrost region is a potentially vulnerable component of the global carbon cycle. However, estimates of the quantity, decomposability, and combustibility of OC contained in permafrost-region soils remain highly uncertain, thereby limiting our ability to predict the release of greenhouse gases due to permafrost thawing. Substantial differences exist between empirical and modeling estimates of the quantity and distribution of permafrost-region soil OC, which contribute to large uncertainties in predictions of carbon–climate feedbacks under future warming. Here, we identify research challenges that constrain current assessments of the distribution and potential decomposability of soil OC stocks in the northern permafrost region and suggest priorities for future empirical and modeling studies to address these challenges. (letter)

  14. Fee-for-service will remain a feature of major payment reforms, requiring more changes in Medicare physician payment.

    Ginsburg, Paul B

    2012-09-01

    Many health policy analysts envision provider payment reforms currently under development as replacements for the traditional fee-for-service payment system. Reforms include per episode bundled payment and elements of capitation, such as global payments or accountable care organizations. But even if these approaches succeed and are widely adopted, the core method of payment to many physicians for the services they provide is likely to remain fee-for-service. It is therefore critical to address the current shortcomings in the Medicare physician fee schedule, because it will affect physician incentives and will continue to play an important role in determining the payment amounts under payment reform. This article reviews how the current payment system developed and is applied, and it highlights areas that require careful review and modification to ensure the success of broader payment reform.

  15. Consequences of major nuclear accidents on wild fauna and flora: dosimetric assessments remain a weakness to establish robust conclusions

    2014-01-01

As about a hundred studies have been undertaken after the major nuclear accidents (Chernobyl and Fukushima) to study their consequences on wild flora and fauna, notably the effects of low doses of ionizing radiation, it appears that some of them report noticeable effects at extremely low doses. Such findings call established knowledge in radiobiology into question. This note aims at discussing the importance of the quality of dosimetric assessments for any study performed 'in natura'. The ambient external dose rate is not systematically a good indicator of the dose or dose rate absorbed by a living organism in a radio-contaminated environment. This note outlines the problems raised by the spatial heterogeneity of the radioactive contamination and notes that some statistical methods are not always suited to the quality of the data sets. It briefly indicates other factors which may affect the quality of data sets obtained during in situ studies

  16. The French program on the spent nuclear fuel long term evolution: Major results, uncertainties and new requirements

    Ferry, Cecile; Poinssot, Christophe; Gras, Jean-Marie

    2006-01-01

conditions; oxidation kinetics of UO2 and spent fuel are assessed by coupling a microscopic approach with macroscopic classical methods; - in water, this boundary condition corresponds to the nominal scenario after the breaching of the canister during geological disposal; in this case, the effect of water radiolysis on the spent fuel matrix dissolution is investigated. An overview of the state of knowledge on the long-term behaviour of spent fuel under these various conditions has been provided by Ferry et al. (2005). This report constitutes the scientific synthesis due at the term of the law. This state of the art was derived from the results obtained under the PRECCI project as well as from a review of the literature and some of the data issued from the 5th PCRD European project 'Spent Fuel Stability under Repository Conditions' (Poinssot et al, 2005). Regarding these results, this paper presents, for each initial operational question and concept, the main scientific issues, the major results obtained since 1999, and the remaining uncertainties. The new requirements in the frame of a broader context which includes transport and in-pool storage issues are also identified.
The contents of this paper are as follows: 1 Introduction; 2 The retrievability of spent fuel assemblies after storage; 2.1 Main scientific issues; 2.2 Major results; 2.2.1 Mechanical evolution of the cladding and structural materials; 2.2.2 Specific case of the defected fuel rods - Evolution of the spent fuel pellets; 2.3 Remaining uncertainties and new requirements for each operational context; 3 The treatment of spent fuel after a long period of storage; 3.1 Main scientific issues; 3.2 Major results; 3.3 Remaining uncertainties and new requirements; 4 The definition of the radionuclide source terms; 4.1 Main scientific issues; 4.2 Major results; 4.3 Remaining uncertainties and new requirements for each operational context; 5 The compatibility between dry storage and subsequent disposal; 5.1 Main

  17. Modeling of Uncertainties in Major Drivers in U.S. Electricity Markets: Preprint

    Short, W.; Ferguson, T.; Leifman, M.

    2006-09-01

    This paper presents information on the Stochastic Energy Deployment System (SEDS) model. DOE and NREL are developing this new model, intended to address many of the shortcomings of the current suite of energy models. Once fully built, the salient qualities of SEDS will include full probabilistic treatment of the major uncertainties in national energy forecasts; code compactness for desktop application; user-friendly interface for a reasonably trained analyst; run-time within limits acceptable for quick-response analysis; choice of detailed or aggregate representations; and transparency of design, code, and assumptions. Moreover, SEDS development will be increasingly collaborative, as DOE and NREL will be coordinating with multiple national laboratories and other institutions, making SEDS nearly an 'open source' project. The collaboration will utilize the best expertise on specific sectors and problems, and also allow constant examination and review of the model. This paper outlines the rationale for this project and a description of its alpha version, as well as some example results. It also describes some of the expected development efforts in SEDS.

  18. Intolerance of uncertainty, worry, and rumination in major depressive disorder and generalized anxiety disorder.

    Yook, Keunyoung; Kim, Keun-Hyang; Suh, Shin Young; Lee, Kang Soo

    2010-08-01

Intolerance of uncertainty (IU) can be defined as a cognitive bias that affects how a person perceives, interprets, and responds to uncertain situations. Although IU has been reported mainly in the literature relating to worry and anxiety symptoms, it may also be important to investigate the relationship between IU, rumination, and depression in a clinical sample. Furthermore, individuals who are intolerant of uncertainty easily experience stress and may cope with stressful situations using repetitive thought such as worry and rumination. Thus, we investigated whether different forms of repetitive thought differentially mediate the relationship between IU and psychological symptoms. Participants included 27 patients with MDD, 28 patients with GAD, and 16 patients with comorbid GAD/MDD. Even though worry, rumination, IU, anxiety, and depressive symptoms correlated substantially with each other, worry partially mediated the relationship between IU and anxiety, whereas rumination completely mediated the relationship between IU and depressive symptoms. (c) 2010 Elsevier Ltd. All rights reserved.
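
Mediation of the kind reported here is often tested with the product-of-coefficients approach; a minimal sketch on synthetic data (variable names and effect sizes are illustrative, not taken from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
iu = rng.normal(size=n)                      # intolerance of uncertainty
rumination = 0.6 * iu + rng.normal(size=n)   # mediator
# depression depends on IU only through rumination (complete mediation)
depression = 0.7 * rumination + rng.normal(size=n)

def ols_slopes(X, y):
    """Slopes from an OLS fit with intercept (intercept dropped)."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta[1:]

c_total = ols_slopes(iu[:, None], depression)[0]   # total effect c
a = ols_slopes(iu[:, None], rumination)[0]         # path a: IU -> mediator
b, c_direct = ols_slopes(np.column_stack([rumination, iu]), depression)
indirect = a * b                                   # indirect effect a*b
```

In linear OLS models the decomposition c_total = c_direct + a*b holds exactly; "complete mediation" corresponds to c_direct being near zero, as in the rumination-depression finding above.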

  19. Major Results of the OECD BEMUSE (Best Estimate Methods; Uncertainty and Sensitivity Evaluation) Programme

    Reventos, F.

    2008-01-01

One of the goals of computer code models of Nuclear Power Plants (NPP) is to demonstrate that these plants are designed to respond safely to postulated accidents. Models and codes are an approximation of the real physical behaviour occurring during a hypothetical transient, and the data used to build these models are also known only with a certain accuracy. Therefore, code predictions are uncertain. The BEMUSE programme is focussed on the application of uncertainty methodologies to large break LOCAs. The programme intends to evaluate the practicability, quality and reliability of best-estimate methods including uncertainty evaluations in applications relevant to nuclear reactor safety, to develop common understanding, and to promote/facilitate their use by regulatory bodies and the industry. In order to fulfil its objectives, BEMUSE is organized into two steps and six phases. The first step is devoted to the complete analysis of a LB-LOCA (L2-5) in an experimental facility (LOFT), while the second step refers to an actual Nuclear Power Plant. Both steps provide results on thermalhydraulic best-estimate simulation as well as uncertainty and sensitivity evaluation. At the time this paper was prepared, phases I, II and III were fully completed and the corresponding reports had been issued. The Phase IV draft report is now being reviewed, while participants are working on Phase V developments. Phase VI consists in preparing the final status report, which will summarize the most relevant results of the whole programme.
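
In best-estimate-plus-uncertainty analyses of the kind BEMUSE examined, the number of code runs is commonly sized with Wilks' formula; a minimal sketch of the first-order, one-sided case (the programme's participants used a variety of methodologies, so this is only the textbook calculation):

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest N such that the maximum of N random code runs bounds
    the `coverage` quantile of the output with probability
    `confidence` (first-order, one-sided Wilks criterion):
    1 - coverage**N >= confidence."""
    n = 1
    while 1 - coverage ** n < confidence:
        n += 1
    return n

n_runs = wilks_sample_size()  # the classic 95%/95% answer
```

This is why 59 (or slightly more, for robustness or higher order statistics) code runs appear so often in LB-LOCA uncertainty studies.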

  20. A maturation model for project-based organisations – with uncertainty management as an always remaining multi-project management focus

    Anna Jerbrant

    2014-02-01

The classical view of multi-project management does not capture its dynamic nature. Present theory falls short in explaining how the management of project-based companies evolves because of their need to be agile and adaptable to a changing environment. The purpose of this paper is therefore to present a descriptive model that elucidates the maturation processes in a project-based organization, as well as to give an enhanced understanding of multi-project management in practice. The maturation model displays how the management of project-based organizations evolves between structuring administration and managing uncertainty, and emphasizes the importance of active individual actions and situated management actions that have to be undertaken in order to coordinate, synchronize, and communicate the required knowledge and skills. The outcomes primarily reveal that, although standardized project models are used and considerable resources are spent on effective project portfolio management, how information and communication are executed is essential for the management of project-based organizations. This is particularly true for informal and non-codified communication.

  1. Work disability remains a major problem in rheumatoid arthritis in the 2000s: data from 32 countries in the QUEST-RA study.

    Sokka, Tuulikki; Kautiainen, Hannu; Pincus, Theodore; Verstappen, Suzanne M M; Aggarwal, Amita; Alten, Rieke; Andersone, Daina; Badsha, Humeira; Baecklund, Eva; Belmonte, Miguel; Craig-Müller, Jürgen; da Mota, Licia Maria Henrique; Dimic, Alexander; Fathi, Nihal A; Ferraccioli, Gianfranco; Fukuda, Wataru; Géher, Pál; Gogus, Feride; Hajjaj-Hassouni, Najia; Hamoud, Hisham; Haugeberg, Glenn; Henrohn, Dan; Horslev-Petersen, Kim; Ionescu, Ruxandra; Karateew, Dmitry; Kuuse, Reet; Laurindo, Ieda Maria Magalhaes; Lazovskis, Juris; Luukkainen, Reijo; Mofti, Ayman; Murphy, Eithne; Nakajima, Ayako; Oyoo, Omondi; Pandya, Sapan C; Pohl, Christof; Predeteanu, Denisa; Rexhepi, Mjellma; Rexhepi, Sylejman; Sharma, Banwari; Shono, Eisuke; Sibilia, Jean; Sierakowski, Stanislaw; Skopouli, Fotini N; Stropuviene, Sigita; Toloza, Sergio; Valter, Ivo; Woolf, Anthony; Yamanaka, Hisashi

    2010-01-01

    Work disability is a major consequence of rheumatoid arthritis (RA), associated not only with traditional disease activity variables, but also more significantly with demographic, functional, occupational, and societal variables. Recent reports suggest that the use of biologic agents offers potential for reduced work disability rates, but the conclusions are based on surrogate disease activity measures derived from studies primarily from Western countries. The Quantitative Standard Monitoring of Patients with RA (QUEST-RA) multinational database of 8,039 patients in 86 sites in 32 countries, 16 with high gross domestic product (GDP) (>24K US dollars (USD) per capita) and 16 low-GDP countries (<11K USD), was analyzed for work and disability status at onset and over the course of RA and clinical status of patients who continued working or had stopped working in high-GDP versus low-GDP countries according to all RA Core Data Set measures. Associations of work disability status with RA Core Data Set variables and indices were analyzed using descriptive statistics and regression analyses. At the time of first symptoms, 86% of men (range 57%-100% among countries) and 64% (19%-87%) of women <65 years were working. More than one third (37%) of these patients reported subsequent work disability because of RA. Among 1,756 patients whose symptoms had begun during the 2000s, the probabilities of continuing to work were 80% (95% confidence interval (CI) 78%-82%) at 2 years and 68% (95% CI 65%-71%) at 5 years, with similar patterns in high-GDP and low-GDP countries. Patients who continued working versus stopped working had significantly better clinical status for all clinical status measures and patient self-report scores, with similar patterns in high-GDP and low-GDP countries. However, patients who had stopped working in high-GDP countries had better clinical status than patients who continued working in low-GDP countries. 
The most significant identifier of work disability in
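
The "probability of continuing to work" figures at 2 and 5 years are survival estimates; a minimal Kaplan-Meier sketch on a toy cohort (the data below are invented, not from QUEST-RA):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of S(t) = P(still working at time t).
    times: years of follow-up per patient; events: 1 = stopped
    working (event observed), 0 = censored. Returns [(t, S(t))]
    at each time where an event occurred."""
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei)
        n = sum(1 for ti in times if ti >= t)  # number still at risk
        if d:
            s *= 1 - d / n
            curve.append((t, s))
    return curve

# toy cohort: years until stopping work (event=1) or censoring (event=0)
times = [1, 2, 2, 3, 4, 5, 5, 5]
events = [1, 1, 0, 1, 0, 1, 0, 0]
curve = kaplan_meier(times, events)
```

Censoring (patients still working at last contact) is what distinguishes this from a simple proportion, and it is essential for cohorts with staggered follow-up like the one in this study.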

  3. Benchmark of four popular virtual screening programs: construction of the active/decoy dataset remains a major determinant of measured performance.

    Chaput, Ludovic; Martinez-Sanz, Juan; Saettel, Nicolas; Mouawad, Liliane

    2016-01-01

In a structure-based virtual screening, the choice of the docking program is essential for the success of hit identification. Benchmarks are meant to help guide this choice, especially when undertaken on a large variety of protein targets. Here, the performance of four popular virtual screening programs, Gold, Glide, Surflex and FlexX, is compared using the Directory of Useful Decoys-Enhanced database (DUD-E), which includes 102 targets with an average of 224 ligands per target and 50 decoys per ligand, generated to avoid biases in the benchmarking. Then, the relationship between program performance and the properties of the targets or the small molecules was investigated. The comparison was based on two metrics, each with three different parameters. The BEDROC scores with α = 80.5 indicated that, on the overall database, Glide succeeded (score > 0.5) for 30 targets, Gold for 27, FlexX for 14 and Surflex for 11. The performance did not depend on the hydrophobicity or the openness of the protein cavities, nor on the families to which the proteins belong. However, despite the care taken in the construction of the DUD-E database, the small differences that remain between the actives and the decoys likely explain the successes of Gold, Surflex and FlexX. Moreover, the similarity between the actives of a target and its crystal structure ligand seems to be at the basis of the good performance of Glide. When all targets with significant biases are removed from the benchmarking, a subset of 47 targets remains, for which Glide succeeded for only 5 targets, Gold for 4, and FlexX and Surflex for 2. The dramatic performance drop of all four programs when the biases are removed shows that we should beware of virtual screening benchmarks, because good performances may be due to the wrong reasons. Therefore, benchmarking would hardly provide guidelines for virtual screening experiments, despite the tendency that is maintained, i.e., Glide and Gold display better
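
For reference, the BEDROC metric used here (α = 80.5) weights early recognition exponentially; below is a hedged sketch following the Truchon & Bayly (2007) formula from memory (verify against a reference implementation such as RDKit's before relying on it):

```python
import math

def bedroc(ranks, n_total, n_actives, alpha=80.5):
    """BEDROC score from the 1-based ranks of the active compounds in
    a screened list of n_total molecules. alpha = 80.5 concentrates
    most of the weight on the top ~2% of the ranked list."""
    ra = n_actives / n_total
    s = sum(math.exp(-alpha * r / n_total) for r in ranks)
    # RIE: observed exponential-weight sum over its random expectation
    rie = (s / n_actives) / ((1 / n_total) * (1 - math.exp(-alpha))
                             / (math.exp(alpha / n_total) - 1))
    # rescale RIE onto [0, 1]
    return (rie * ra * math.sinh(alpha / 2)
            / (math.cosh(alpha / 2) - math.cosh(alpha / 2 - alpha * ra))
            + 1 / (1 - math.exp(alpha * (1 - ra))))

# 10 actives among 1000 compounds: best-case vs worst-case ranking
best = bedroc(range(1, 11), 1000, 10)
worst = bedroc(range(991, 1001), 1000, 10)
```

A perfect ranking scores near 1 and a worst-case ranking near 0, which is why the paper's success threshold of 0.5 is demanding at this α.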

  4. Assessing climate change impacts, benefits of mitigation, and uncertainties on major global forest regions under multiple socioeconomic and emissions scenarios

    Kim, John B.; Monier, Erwan; Sohngen, Brent; Pitts, G. Stephen; Drapek, Ray; McFarland, James; Ohrel, Sara; Cole, Jefferson

    2017-04-01

    We analyze a set of simulations to assess the impact of climate change on global forests, in which the MC2 dynamic global vegetation model (DGVM) was driven by climate simulations from the MIT Integrated Global System Model-Community Atmosphere Model (IGSM-CAM) modeling framework. The core study relies on an ensemble of climate simulations under two emissions scenarios: a business-as-usual reference scenario (REF) analogous to the IPCC RCP8.5 scenario, and a greenhouse gas mitigation scenario, called POL3.7, which lies between the IPCC RCP2.6 and RCP4.5 scenarios and is consistent with 2 °C of global mean warming from pre-industrial levels by 2100. Evaluating the outcomes of both climate change scenarios in the MC2 model shows that the carbon stocks of most forests around the world increased, with the greatest gains in tropical forest regions. Temperate forest regions are projected to see strong increases in productivity offset by carbon loss to fire. The greatest cost of mitigation in terms of effects on forest carbon stocks is projected to be borne by regions in the southern hemisphere. We compare three sources of uncertainty in climate change impacts on the world's forests: emissions scenarios, the global climate system response (i.e. climate sensitivity), and natural variability. The effect of natural variability on changes in forest carbon and net primary productivity (NPP) is small, but it is substantial for wildfire impacts. Forest productivity under the REF scenario benefits substantially from the CO2 fertilization effect, and higher warming alone does not necessarily increase global forest carbon levels. Our analysis underlines why an ensemble of climate simulations is necessary to derive robust estimates of the benefits of greenhouse gas mitigation. It also demonstrates that constraining estimates of climate sensitivity and advancing our understanding of CO2 fertilization effects may considerably reduce the range of projections.

  5. Estimation of Source Parameters of Historical Major Earthquakes from 1900 to 1970 around Asia and Analysis of Their Uncertainties

    Han, J.; Zhou, S.

    2017-12-01

    Asia, located where the Eurasian, Pacific, and Indo-Australian plates meet, is the continent with the highest seismicity. An earthquake catalogue based on modern seismic network recordings has existed in Asia only since around 1970; the catalogue before 1970 is much less accurate because few stations were available. With less than 50 years of modern earthquake catalogue, research in seismology has been quite limited. With the appearance of improved Earth velocity structure models, modified location methods, and high-accuracy Optical Character Recognition, travel-time data of earthquakes from 1900 to 1970 can now be included in research and more accurate locations can be determined for historical earthquakes. Hence, the parameters of these historical earthquakes can be obtained more precisely, and methods such as the ETAS model can be applied on a much longer time scale. This work focuses on three aspects: (1) relocating more than 300 historical major earthquakes (M≥7.0) in Asia based on the Shide Circulars, International Seismological Summary and EHB Bulletin instrumental records between 1900 and 1970; (2) calculating the focal mechanisms of more than 50 events from P-wave first-motion records of the ISS; (3) inferring the focal mechanisms of historical major earthquakes from geological data, the tectonic stress field, and the relocation results.

  6. A meta-analysis of the relation of intolerance of uncertainty to symptoms of generalized anxiety disorder, major depressive disorder, and obsessive-compulsive disorder.

    Gentes, Emily L; Ruscio, Ayelet Meron

    2011-08-01

    Intolerance of uncertainty (IU) has been suggested to reflect a specific risk factor for generalized anxiety disorder (GAD), but there have been no systematic attempts to evaluate the specificity of IU to GAD. This meta-analysis examined the cross-sectional association of IU with symptoms of GAD, major depressive disorder (MDD), and obsessive-compulsive disorder (OCD). Random effects analyses were conducted for two common definitions of IU, one that has predominated in studies of GAD (56 effect sizes) and another that has been favored in studies of OCD (29 effect sizes). Using the definition of IU developed for GAD, IU shared a mean correlation of .57 with GAD, .53 with MDD, and .50 with OCD. Using the alternate definition developed for OCD, IU shared a mean correlation of .46 with MDD and .42 with OCD, with no studies available for GAD. Post-hoc significance tests revealed that IU was more strongly related to GAD than to OCD when the GAD-specific definition of IU was used. No other differences were found in the magnitude of associations between IU and the three syndromes. We discuss implications of these findings for models of shared and specific features of emotional disorders and for future research efforts. Copyright © 2011 Elsevier Ltd. All rights reserved.
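
    The pooled correlations above come from random-effects models. As an illustration of the general technique (not the paper's code or data), DerSimonian-Laird pooling of Fisher-z transformed correlations can be sketched as:

```python
import math

def pooled_correlation(correlations, sample_sizes):
    """DerSimonian-Laird random-effects pooling of Pearson correlations
    via the Fisher z transform. Inputs are per-study r values and n's."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in correlations]
    vs = [1.0 / (n - 3) for n in sample_sizes]   # sampling variance of Fisher z
    w = [1.0 / v for v in vs]                    # fixed-effect weights
    z_fe = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    # Between-study heterogeneity (tau^2) by the DL moment estimator
    q = sum(wi * (zi - z_fe) ** 2 for wi, zi in zip(w, zs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(zs) - 1)) / c)
    # Random-effects weights, pooled estimate, back-transform to r
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)
```

The pooled estimate always falls within the range of the study-level correlations; the tau-squared term simply widens the weights when studies disagree.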

  7. [PALEOPATHOLOGY OF HUMAN REMAINS].

    Minozzi, Simona; Fornaciari, Gino

    2015-01-01

    Many diseases induce alterations in the human skeleton, leaving traces of their presence in ancient remains. Paleopathological examination of human remains not only allows the study of the history and evolution of disease, but also the reconstruction of health conditions in past populations. This paper describes the most interesting diseases observed in skeletal samples from the Roman Imperial Age necropoles found in urban and suburban areas of Rome during archaeological excavations of the last decades. The diseases observed were grouped into the following categories: articular diseases, traumas, infections, metabolic or nutritional diseases, congenital diseases and tumours, and some examples are reported for each group. Although extensive epidemiological investigation of ancient skeletal records is impossible, the palaeopathological study made it possible to highlight the spread of numerous illnesses, many of which can be related to the life and health conditions of the Roman population.

  8. Uncertainty and Climate Change

    Berliner, L. Mark

    2003-01-01

    Anthropogenic, or human-induced, climate change is a critical issue in science and in the affairs of humankind. Though the target of substantial research, the conclusions of climate change studies remain subject to numerous uncertainties. This article presents a very brief review of the basic arguments regarding anthropogenic climate change with particular emphasis on uncertainty.

  9. Uncertainty in artificial intelligence

    Kanal, LN

    1986-01-01

    How to deal with uncertainty is a subject of much controversy in Artificial Intelligence. This volume brings together a wide range of perspectives on uncertainty, many of the contributors being the principal proponents in the controversy. Some of the notable issues which emerge from these papers revolve around an interval-based calculus of uncertainty, the Dempster-Shafer Theory, and probability as the best numeric model for uncertainty. There remain strong dissenting opinions not only about probability but even about the utility of any numeric method in this context.

  10. Conditional uncertainty principle

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on an arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
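
    Conditional majorization builds on ordinary majorization of probability vectors. As a purely illustrative sketch (not the paper's formalism), checking whether p majorizes q reduces to comparing partial sums of the descending-sorted vectors:

```python
def majorizes(p, q, tol=1e-12):
    """Return True if probability vector p majorizes q: every partial sum
    of the descending-sorted p dominates the corresponding partial sum of q.
    Equal-length vectors with equal totals are assumed (pad with zeros
    otherwise)."""
    ps = sorted(p, reverse=True)
    qs = sorted(q, reverse=True)
    if abs(sum(ps) - sum(qs)) > tol:
        raise ValueError("vectors must have equal total probability")
    cp = cq = 0.0
    for a, b in zip(ps, qs):
        cp += a
        cq += b
        if cp < cq - tol:
            return False
    return True
```

For example, a deterministic distribution majorizes every distribution on the same alphabet, while the uniform distribution is majorized by every one, which is why majorization orders distributions from least to most uncertain.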

  11. Uncertainties in radioecological assessment models

    Hoffman, F.O.; Miller, C.W.; Ng, Y.C.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
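
    The stochastic procedure recommended here, propagating sampled parameter values through the model and ranking parameters by their contribution to the spread of predictions, can be sketched with a deliberately simple, hypothetical two-parameter dose model (the distributions and constants are made up for illustration):

```python
import random

def spearman(x, y):
    """Spearman rank correlation (continuous samples, so no tie handling)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mean = (n - 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry)) / n
    var = sum((a - mean) ** 2 for a in rx) / n
    return cov / var

random.seed(1)
# Hypothetical dose model: dose = intake * transfer_factor, both uncertain.
intakes = [random.lognormvariate(0.0, 0.3) for _ in range(2000)]
tfs     = [random.lognormvariate(-2.0, 0.8) for _ in range(2000)]
doses   = [i * t for i, t in zip(intakes, tfs)]
# Rank each parameter by its correlation with the predicted output
importance = {"intake": spearman(intakes, doses),
              "transfer_factor": spearman(tfs, doses)}
```

The parameter with the wider distribution (here the transfer factor) dominates the output spread, which is exactly the ranking information the text describes.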

  12. Understanding uncertainty

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  13. Measurement Uncertainty

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimating measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are briefly described in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
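
    For independent input quantities, the combined standard uncertainty in the GUM approach is obtained by summing the individual standard uncertainties in quadrature; a minimal sketch (the contribution values in the usage note are hypothetical):

```python
import math

def combined_uncertainty(contributions):
    """Combine independent standard uncertainties in quadrature (GUM,
    uncorrelated inputs, sensitivity coefficients already applied)."""
    return math.sqrt(sum(u ** 2 for u in contributions))

def expanded_uncertainty(contributions, k=2):
    """Expanded uncertainty U = k * u_c; k = 2 gives roughly 95 % coverage
    for an approximately normal output quantity."""
    return k * combined_uncertainty(contributions)
```

For instance, two independent contributions of 0.3 and 0.4 (in the same units) combine to a standard uncertainty of 0.5 and an expanded uncertainty of 1.0 at k = 2.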

  14. Green business will remain green

    Marcan, P.

    2008-01-01

    It all started with two words: climate change. The carbon dioxide trading scheme, the politicians' idea for solving the number one global problem, followed. Four years ago, when the project began, there was no data for project initiation. Quotas for polluters, mainly from energy production and other energy-demanding industries, were distributed based on spreadsheets, maximum output and the expected future development of economies. Slovak companies have had a chance to profit from these arrangements since 2005. Many of them took advantage of the situation and turned the excess quotas into an extraordinary profit which often reached hundreds of millions of Sk. The fact that the price of free quotas offered for sale dropped essentially to 0 in 2006 only proved that the initial distribution was too generous. And the market reacted to the first official measurements of emissions. Slovak companies also contributed to this development. However, when planning the maximum emission volumes for the 2008-2012 period, their expectations were not realistic, in spite of the fact that actual data were available. A glance at the figures in the proposal of the Ministry of Environment is sufficient to realize that there will be no major change in the future. And so for many Slovak companies, business with a green future will remain green for the next five years. The state decided to give selected companies even more free space as far as emissions are concerned. The most privileged companies can expect quotas increased by tens of percent. (author)

  15. Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses

    Murphy, Christian E.

    2018-05-01

    Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends heavily on the type of uncertainty, besides the spatial data feature type. This work presents a design strategy to visualize uncertainty connected to point features. The error ellipse, well known from mathematical statistics, is adapted to display the uncertainty of point information originating from spatial generalization. Modified designs of the error ellipse show the potential of quantitative and qualitative symbolization and simultaneous point-based uncertainty symbolization. The user can intuitively depict the centers of gravity and the major orientation of the point arrays, as well as estimate the extents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore, it is shown how applicable an adapted design of the error ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated in two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
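
    The error ellipse itself is computed from a 2x2 covariance matrix: the semi-axes are scaled square roots of the eigenvalues, and the orientation follows the leading eigenvector. A minimal sketch (not the paper's implementation):

```python
import math

def error_ellipse(sxx, syy, sxy, k=1.0):
    """Semi-major axis, semi-minor axis, and orientation (radians from the
    +x axis) of the k-sigma error ellipse for the covariance matrix
    [[sxx, sxy], [sxy, syy]]."""
    # Closed-form eigenvalues of a symmetric 2x2 matrix
    mean = (sxx + syy) / 2
    diff = (sxx - syy) / 2
    root = math.hypot(diff, sxy)
    lam1, lam2 = mean + root, mean - root
    # Orientation of the leading eigenvector
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return k * math.sqrt(lam1), k * math.sqrt(max(lam2, 0.0)), theta
```

An isotropic covariance yields a circle; correlation between x and y tilts the ellipse, with the tilt angle carrying the "major orientation" information the text refers to.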

  16. Uncertainty theory

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  17. Pandemic influenza: certain uncertainties

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

    For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, "wave" patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  18. Teaching Uncertainties

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  19. Calibration uncertainty

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  20. Demand Uncertainty

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  1. Justification for recommended uncertainties

    Pronyaev, V.G.; Badikov, S.A.; Carlson, A.D.

    2007-01-01

    The uncertainties obtained in an earlier standards evaluation were considered to be unrealistically low by experts of the US Cross Section Evaluation Working Group (CSEWG). Therefore, the CSEWG Standards Subcommittee replaced the covariance matrices of evaluated uncertainties by expanded percentage errors that were assigned to the data over wide energy groups. There are a number of reasons that might lead to low uncertainties of the evaluated data: underestimation of the correlations existing between the results of different measurements; the presence of unrecognized systematic uncertainties in the experimental data, which can lead to biases in the evaluated data as well as to underestimation of the resulting uncertainties; and the fact that uncertainties for correlated data cannot be characterized by percentage uncertainties or variances alone. Covariances between the evaluated value at 0.2 MeV and other points, obtained in model (RAC R-matrix and PADE2 analytical expansion) and non-model (GMA) fits of the 6Li(n,t) TEST1 data, and the correlation coefficients are presented; covariances between the evaluated value at 0.045 MeV and other points (along the line or column of the matrix), as obtained in EDA and RAC R-matrix fits of the data available for reactions that pass through the formation of the 7Li system, are discussed. The GMA fit with the GMA database is shown for comparison. The following diagrams are discussed: percentage uncertainties of the evaluated cross section for the 6Li(n,t) reaction and for the 235U(n,f) reaction; the estimation given by CSEWG experts; the GMA result with the full GMA database, including experimental data for the 6Li(n,t), 6Li(n,n) and 6Li(n,total) reactions; uncertainties in the GMA combined fit for the standards; and the EDA and RAC R-matrix results, respectively. Uncertainties of absolute and 252Cf fission spectrum averaged cross section measurements, and deviations between measured and evaluated values for 235U(n,f) cross-sections in the neutron energy range 1
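
    The point that percentage uncertainties alone cannot characterize correlated data can be made concrete: a covariance matrix splits into percent standard uncertainties plus a correlation matrix, and discarding the latter loses information. A sketch with illustrative numbers (not evaluated nuclear data):

```python
import math

def percent_and_correlation(cov, values):
    """Split a covariance matrix into percent standard uncertainties
    (relative to the evaluated values) and a correlation matrix."""
    n = len(cov)
    sig = [math.sqrt(cov[i][i]) for i in range(n)]
    pct = [100.0 * s / v for s, v in zip(sig, values)]
    corr = [[cov[i][j] / (sig[i] * sig[j]) for j in range(n)]
            for i in range(n)]
    return pct, corr
```

Two evaluations with identical percent uncertainties but different off-diagonal correlations propagate very differently into any derived quantity, which is exactly the information the expanded percentage errors discard.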

  2. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    Han, Feng; Zheng, Yi

    2018-06-01

    Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.

  3. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  4. Photometric Uncertainties

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) of Bennu with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our testing data sets, we find: (1) the model uncertainties are correct only when calculated with the covariance matrix, because the parameters are highly correlated; (2) there is no evidence that any single parameter dominates in each model; (3) the model error and the data error contribute comparably to the final correction error; (4) tests of the uncertainty module on simulated and real data sets show that model performance depends on data coverage and data quality, and these tests gave us a better understanding of how the different models behave in different cases; (5) the L-S model is more reliable than the others, perhaps because the simulated data are based on the L-S model, although the test on real data (SPDIF) also shows a slight advantage for L-S; ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov performs unphysically on SOPIE 1 data; (6) L-S is the better default choice, a conclusion based mainly on our tests on SOPIE data and IPDIF.

  5. Decomposition Technique for Remaining Useful Life Prediction

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
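
    The decomposition can be illustrated end to end: one map estimates the current damage state from features, the other gives the damage rate from operating conditions, and remaining life is the time for the extrapolated damage to reach the failure threshold. The linear maps and constants below are placeholders for illustration, not the disclosed models:

```python
def remaining_useful_life(feature, condition, t_now,
                          feature_to_damage, condition_to_rate,
                          failure_threshold=1.0, dt=1.0, horizon=10000):
    """Estimate RUL by mapping the current feature to a damage state, then
    integrating the condition-to-damage-rate map until the failure
    threshold is crossed (or the horizon is exhausted)."""
    damage = feature_to_damage(feature)               # off-line map 1
    t = t_now
    while damage < failure_threshold and t - t_now < horizon:
        damage += condition_to_rate(condition) * dt   # off-line map 2
        t += dt
    return t - t_now                                  # remaining life

# Placeholder regression maps (in practice these are fitted off-line
# from measurements, operating conditions, and ground-truth damage)
f2d = lambda f: 0.01 * f    # e.g. crack-length feature -> damage state
c2r = lambda c: 0.001 * c   # e.g. load condition -> damage accumulation rate
```

With these placeholders, a feature of 50 (damage 0.5) under a constant condition of 2.0 (rate 0.002 per step) yields a remaining life of about 250 time steps.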

  6. Uncertainty analysis

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
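
    Latin Hypercube Sampling, as discussed above, stratifies each input's range into equiprobable bins and pairs the strata at random across dimensions; a minimal sketch on the unit hypercube:

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random):
    """One LHS design on the unit hypercube: each dimension is split into
    n_samples equal strata, one point is drawn per stratum, and the strata
    are shuffled independently per dimension."""
    columns = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n_samples for s in strata])
    return [[columns[d][i] for d in range(n_dims)] for i in range(n_samples)]
```

By construction, every one-dimensional stratum contains exactly one sample, which is what gives LHS its variance-reduction property over simple random sampling for output distribution estimates.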

  7. And the Dead Remain Behind

    Peter Read

    2013-08-01

    In most cultures the dead and their living relatives are held in a dialogic relationship. The dead have made it clear, while living, what they expect from their descendants. The living, for their part, wish to honour the tombs of their ancestors; at the least, to keep the graves of the recent dead from disrepair. Despite the strictures, the living can fail their responsibilities, for example, by migration to foreign countries. The peripatetic Chinese are one of the few cultures able to overcome the dilemma of the wanderer or the exile. With the help of a priest, an Australian Chinese migrant may summon the soul of an ancestor from an Asian grave to a Melbourne temple, where the spirit, though removed from its earthly vessel, will rest and remain at peace. Amongst cultures in which such practices are not culturally appropriate, to fail to honour the family dead can be exquisitely painful. Violence is the cause of most failure.

  8. Uncertainty in artificial intelligence

    Shachter, RD; Henrion, M; Lemmer, JF

    1990-01-01

    This volume, like its predecessors, reflects the cutting edge of research on the automation of reasoning under uncertainty.A more pragmatic emphasis is evident, for although some papers address fundamental issues, the majority address practical issues. Topics include the relations between alternative formalisms (including possibilistic reasoning), Dempster-Shafer belief functions, non-monotonic reasoning, Bayesian and decision theoretic schemes, and new inference techniques for belief nets. New techniques are applied to important problems in medicine, vision, robotics, and natural language und

  9. Red Assembly: the work remains

    Leslie Witz

    installed. What to do at this limit, at the transgressive encounter between saying yes and no to history, remains the challenge. It is the very challenge of what insistently remains.

  10. Silicon photonics: some remaining challenges

    Reed, G. T.; Topley, R.; Khokhar, A. Z.; Thompson, D. J.; Stanković, S.; Reynolds, S.; Chen, X.; Soper, N.; Mitchell, C. J.; Hu, Y.; Shen, L.; Martinez-Jimenez, G.; Healy, N.; Mailis, S.; Peacock, A. C.; Nedeljkovic, M.; Gardes, F. Y.; Soler Penades, J.; Alonso-Ramos, C.; Ortega-Monux, A.; Wanguemert-Perez, G.; Molina-Fernandez, I.; Cheben, P.; Mashanovich, G. Z.

    2016-03-01

    This paper discusses some of the remaining challenges for silicon photonics, and how we at Southampton University have approached some of them. Despite phenomenal advances in the field of Silicon Photonics, there are a number of areas that still require development. For short to medium reach applications, there is a need to improve the power consumption of photonic circuits such that inter-chip, and perhaps intra-chip applications are viable. This means that yet smaller devices are required as well as thermally stable devices, and multiple wavelength channels. In turn this demands smaller, more efficient modulators, athermal circuits, and improved wavelength division multiplexers. The debate continues as to whether on-chip lasers are necessary for all applications, but an efficient low cost laser would benefit many applications. Multi-layer photonics offers the possibility of increasing the complexity and effectiveness of a given area of chip real estate, but it is a demanding challenge. Low cost packaging (in particular, passive alignment of fibre to waveguide), and effective wafer scale testing strategies, are also essential for mass market applications. Whilst solutions to these challenges would enhance most applications, a derivative technology is emerging, that of Mid Infra-Red (MIR) silicon photonics. This field will build on existing developments, but will require key enhancements to facilitate functionality at longer wavelengths. In common with mainstream silicon photonics, significant developments have been made, but there is still much left to do. Here we summarise some of our recent work towards wafer scale testing, passive alignment, multiplexing, and MIR silicon photonics technology.

  11. Uncertainty in adaptive capacity

    Neil Adger, W.; Vincent, K.

    2005-01-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. (authors)

  12. Uncertainties in fission-product decay-heat calculations

    Oyamatsu, K.; Ohta, H.; Miyazono, T.; Tasaka, K. [Nagoya Univ. (Japan)]

    1997-03-01

    The present precision of aggregate decay heat calculations is studied quantitatively for 50 fissioning systems. In this evaluation, nuclear data and their uncertainty data are taken from the ENDF/B-VI nuclear data library, and those not available in this library are supplemented by theoretical considerations. An approximate method is proposed to simplify the evaluation of the uncertainties in the aggregate decay heat calculations, so that nuclei which cause large uncertainties in the calculated decay heat values can be identified easily. In this paper, we attempt to clarify the justification of the approximation, which was not very clear at the early stage of the study. We find that the aggregate decay heat uncertainties for minor actinides such as Am and Cm isotopes are 3-5 times as large as those for {sup 235}U and {sup 239}Pu. Values recommended by the Atomic Energy Society of Japan (AESJ) were given for 3 major fissioning systems, {sup 235}U(t), {sup 239}Pu(t) and {sup 238}U(f). The present results are consistent with the AESJ values for these systems, although the two evaluations used different nuclear data libraries and approximations. Therefore, the present results can also be considered to supplement the uncertainty values for the remaining 17 fissioning systems in JNDC2, which were not treated in the AESJ evaluation. Furthermore, we attempt to list nuclear data which cause large uncertainties in decay heat calculations, for the future revision of decay and yield data libraries. (author)
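
    As a sketch of the quadrature bookkeeping behind such an evaluation: aggregate decay heat is a sum of per-nuclide contributions, so independent per-nuclide uncertainties combine in quadrature, and ranking nuclides by their variance share identifies the data that dominate the aggregate uncertainty. The nuclide values below are invented for illustration, not taken from ENDF/B-VI or JNDC2:

```python
import math

# Hypothetical per-nuclide contributions to aggregate decay heat at one
# cooling time: (value in MeV per fission per second, relative 1-sigma).
# These numbers are invented for illustration only.
contributions = {
    "Sr-90":  (0.20, 0.05),
    "Y-90":   (0.35, 0.08),
    "Cs-137": (0.15, 0.03),
}

# Aggregate decay heat is the sum of the individual contributions.
total = sum(value for value, _ in contributions.values())

# For independent uncertainties, absolute variances add in quadrature.
variance = sum((value * rel) ** 2 for value, rel in contributions.values())
sigma = math.sqrt(variance)

# Rank nuclides by their share of the aggregate variance: the top entry
# is the nuclear datum whose improvement would help most.
ranking = sorted(contributions,
                 key=lambda n: (contributions[n][0] * contributions[n][1]) ** 2,
                 reverse=True)

print(f"decay heat = {total:.3f} +/- {sigma:.3f} (1 sigma)")
print("largest uncertainty contributor:", ranking[0])
```

    With real libraries the sum runs over hundreds of fission products and the covariances are not all negligible; this sketch shows only the independent-uncertainty limit.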

  13. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Vicari Kristin J

    2012-04-01

    Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. This result sets a lower bound on the error bars of
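
    The propagation chain described above (primary measurement, calculated yield, economic model output) can be sketched with a simple Monte Carlo loop. Everything numeric below is a hypothetical stand-in, not the actual process model or its calibrated values:

```python
import random
import statistics

random.seed(0)

def ethanol_yield(f_is):
    # Hypothetical toy relation: a higher measured fraction of insoluble
    # solids implies less released sugar and hence a lower ethanol yield.
    return 0.45 * (1.0 - f_is)

def mesp(yield_frac):
    # Hypothetical linear cost response of the selling price around a
    # nominal yield of 0.36 (these coefficients are invented).
    return 2.15 + 10.0 * (0.36 - yield_frac)

# Primary measurement: fIS with mean 0.20 and 1-sigma 0.01 (invented).
samples = [mesp(ethanol_yield(random.gauss(0.20, 0.01)))
           for _ in range(20000)]

mu = statistics.mean(samples)
sigma = statistics.stdev(samples)
print(f"MESP = {mu:.2f} +/- {sigma:.2f} $/gal")
```

    The resulting spread in MESP is the minimum uncertainty attributable to the primary measurement alone, which is the sense in which the paper's $0.15/gal figure is a lower bound.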

  14. Uncertainties in Nuclear Proliferation Modeling

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict proliferation events. Such systematic approaches have shown the potential to provide warning to the international community so that nuclear proliferation activities can be prevented. However, there is still considerable debate over the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing elaborate models based on hypotheses about time-dependent proliferation determinants, graph theory, and the like, it is important to analyze the uncertainty of current models in order to resolve these fundamental problems. The uncertainty arising from different codings of proliferation history is small; the serious problems stem from the limited analysis methods and from correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results, and inaccurate proxy variables for nuclear proliferation increase the uncertainty further. To overcome these problems, further quantitative research should focus on analyzing the knowledge developed in qualitative nuclear proliferation studies

  15. Uncertainty and measurement

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  16. Risk uncertainty analysis methods for NUREG-1150

    Benjamin, A.S.; Boyd, G.J.

    1987-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives

  17. Capital flight and the uncertainty of government policies

    Hermes, N.; Lensink, R.

    2000-01-01

    This paper shows that policy uncertainty, measured by the uncertainty of budget deficits, tax payments, government consumption and the inflation rate, has a statistically significant positive impact on capital flight. This result remains robust after the application of stability tests.

  18. The uncertainties in estimating measurement uncertainties

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined, and what they mean, is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report, and recommendations are made for improving measurement uncertainties
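
    A GUM-style combination is the usual way such laboratory uncertainty estimates are assembled: each error source contributes a sensitivity-weighted standard uncertainty, the contributions are summed in quadrature, and the result is expanded with a coverage factor. The sources and numbers here are invented for illustration:

```python
import math

# (name, sensitivity coefficient, standard uncertainty) for each error
# source; all values are invented for illustration.
sources = [
    ("calibration standard",     1.0, 0.10),
    ("instrument repeatability", 1.0, 0.25),
    ("matrix mismatch bias",     0.8, 0.15),
]

# Combined standard uncertainty: quadrature sum of weighted contributions.
u_combined = math.sqrt(sum((c * u) ** 2 for _, c, u in sources))

# Expanded uncertainty with coverage factor k = 2 (roughly 95% coverage).
k = 2.0
U = k * u_combined
print(f"u_c = {u_combined:.3f}, U (k=2) = {U:.3f}")
```

    The quadrature sum assumes the sources are independent; correlated sources need covariance terms, which is one of the "uncertainties in estimating uncertainties" the paper refers to.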

  19. Uncertainty propagation through dynamic models of assemblies of mechanical structures

    Daouk, Sami

    2016-01-01

    When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Feedback from experience shows, however, that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies by setting up a dynamic connector model that accounts for different types and sources of uncertainty in stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R and D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)

  20. Uncertainties in environmental radiological assessment models and their implications

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
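
    The stochastic procedure recommended above (translate uncertain parameter estimates into a distribution of predictions, then rank parameters by their relative contribution) can be sketched for a toy multiplicative pathway model; the model form and parameter spreads are invented:

```python
import math
import random
import statistics

random.seed(1)

def dose_model(transfer, intake, dose_factor):
    # Toy multiplicative exposure-pathway model (invented).
    return transfer * intake * dose_factor

def sample_params():
    # Lognormal parameter uncertainty with invented spreads: "transfer"
    # is given the widest uncertainty, "intake" the narrowest.
    return {
        "transfer": random.lognormvariate(0.0, 0.5),
        "intake": random.lognormvariate(0.0, 0.1),
        "dose_factor": random.lognormvariate(0.0, 0.3),
    }

runs = [sample_params() for _ in range(5000)]
preds = [dose_model(**p) for p in runs]

def corr(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Rank parameters by correlation of log-parameter with log-prediction:
# the parameter with the widest uncertainty dominates the output spread.
log_preds = [math.log(v) for v in preds]
importance = {name: corr([math.log(r[name]) for r in runs], log_preds)
              for name in ("transfer", "intake", "dose_factor")}
print(sorted(importance, key=importance.get, reverse=True))
```

    The distribution of `preds` is the stochastic prediction interval, and the correlation ranking shows which parameter estimate would most reduce the predicted uncertainty if improved.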

  1. Model uncertainty in growth empirics

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high

  2. Evacuation decision-making: process and uncertainty

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the processes of evacuation decision-making, identify and document uncertainties in that process and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified, concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided the grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a 'precautionary' evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs

  3. Uncertainty in social dilemmas

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...

  4. Does hypertension remain after kidney transplantation?

    Gholamreza Pourmand

    2015-05-01

    Hypertension is a common complication of kidney transplantation, with a prevalence of 80%. Studies in adults have shown a high prevalence of hypertension (HTN) in the first three months after transplantation, while this rate is reduced to 50-60% at the end of the first year. HTN remains a major risk factor for cardiovascular disease, lower graft survival rates and poor function of the transplanted kidney in adults and children. In this retrospective study, the medical records of 400 kidney transplantation patients of Sina Hospital were evaluated. Patients were followed monthly for the 1st year, every two months in the 2nd year and every three months after that. In this study 244 (61%) patients were male. Mean ± SD age of recipients was 39.3 ± 13.8 years. In most patients (40.8%) the cause of end-stage renal disease (ESRD) was unknown, followed by HTN (26.3%). A total of 166 (41.5%) patients had been hypertensive before transplantation and 234 (58.5%) had normal blood pressure. Among these 234 individuals, 94 (40.2%) developed post-transplantation HTN. On the other hand, among 166 pre-transplant hypertensive patients, 86 patients (56.8%) remained hypertensive after transplantation. In total, 180 (45%) patients had post-transplantation HTN and 220 patients (55%) did not develop HTN. Based on these findings, the incidence of post-transplantation hypertension is high, and kidney transplantation does not lead to remission of hypertension. On the other hand, hypertension is one of the main causes of ESRD. Thus, early screening for hypertension can prevent kidney damage and reduce further problems in renal transplant recipients.

  5. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Elise Payzan-LeNestour

    Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
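
    The separation between risk and estimation uncertainty is easy to state in code for a single bandit arm with Bernoulli payoffs: a Beta posterior's mean summarizes belief, its variance is the estimation uncertainty (ambiguity), and even a sharp posterior leaves irreducible outcome risk. A minimal sketch, not the authors' actual six-arm restless task:

```python
# Beta-Bernoulli posterior over a single arm's payoff probability,
# starting from a uniform prior Beta(1, 1). The payoff sequence below
# is invented for illustration.
alpha, beta = 1.0, 1.0

for outcome in [1, 0, 1, 1, 0, 1, 1, 1]:
    alpha += outcome          # count observed payoffs
    beta += 1 - outcome       # count observed non-payoffs

# Posterior mean: current belief about the payoff probability.
mean = alpha / (alpha + beta)

# Posterior variance: the estimation uncertainty (ambiguity). It shrinks
# as evidence accumulates, while the outcome risk mean*(1 - mean) does not.
var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
risk = mean * (1 - mean)
print(f"mean = {mean:.3f}, estimation variance = {var:.4f}, risk = {risk:.3f}")
```

    Unexpected uncertainty would correspond to the true payoff probability jumping, which a learner can accommodate by discounting old counts; that extension is not shown here.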

  6. Uncertainty Einstein, Heisenberg, Bohr, and the struggle for the soul of science

    Lindley, David

    2007-01-01

    The uncertainty in this delightful book refers to Heisenberg's Uncertainty Principle, an idea first postulated in 1927 by physicist Werner Heisenberg in his attempt to make sense out of the developing field of quantum mechanics. As Lindley so well explains it, the concept of uncertainty shook the philosophical underpinnings of science. It was Heisenberg's work that, to a great extent, kept Einstein from accepting quantum mechanics as a full explanation for physical reality. Similarly, it was the Uncertainty Principle that demonstrated the limits of scientific investigation: if Heisenberg is correct there are some aspects of the physical universe that are to remain beyond the reach of scientists. As he has done expertly in books like Boltzmann's Atom, Lindley brings to life a critical period in the history of science, explaining complex issues to the general reader, presenting the major players in an engaging fashion, delving into the process of scientific discovery and discussing the interaction between scien...

  7. Decision making under uncertainty

    Cyert, R.M.

    1989-01-01

    This paper reports on ways of improving the reliability of products and systems, which this country needs if we are to survive as a first-rate industrial power. The use of statistical techniques has, since the 1920s, been viewed as one of the methods for testing quality and estimating the level of quality in a universe of output. Statistical quality control is not relevant, generally, to improving systems in an industry like yours, but certainly the use of probability concepts is of significance. In addition, when it is recognized that part of the problem involves making decisions under uncertainty, it becomes clear that techniques such as sequential decision making and Bayesian analysis become major methodological approaches that must be utilized

  8. Space Acquisitions: Some Programs Have Overcome Past Problems, but Challenges and Uncertainty Remain for the Future

    2015-04-29

    ... While new missile warning satellites are now on orbit after years of delays and significant cost ... growing threats to space systems have led DOD to consider alternatives such as disaggregating, or breaking up, large satellites into multiple, smaller ...

  9. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify sources of uncertainty for the major terms. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2 sigma ~ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC. (Figure: Uncertainties in Fossil Fuel Emissions over the last 50 years.)
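
    The quoted growth-rate uncertainty is, at heart, a standard error on an averaged annual CO2 increment. A toy version with invented increments:

```python
import statistics

# Invented annual CO2 increments (ppm/yr), standing in for the observed
# global growth-rate record.
increments = [1.8, 2.3, 2.0, 2.4, 1.9, 2.6, 2.1, 2.2]

mean = statistics.mean(increments)
# Standard error of the mean, then an approximate 95% interval.
sem = statistics.stdev(increments) / len(increments) ** 0.5
ci95 = 1.96 * sem
print(f"growth rate = {mean:.2f} +/- {ci95:.2f} ppm/yr (95%)")
```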

  10. Instrument uncertainty predictions

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty

  11. Incorporating Forecast Uncertainty in Utility Control Center

    Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian

    2014-07-09

    Uncertainties in forecasting the output of intermittent resources such as wind and solar generation, as well as system loads, are not adequately reflected in existing industry-grade tools used for transmission system management, generation commitment, dispatch and market operation. There are other sources of uncertainty, such as uninstructed deviations of conventional generators from their dispatch set points, generator forced outages and failures to start up, load drops, losses of major transmission facilities and frequency variation. These uncertainties can cause deviations from the system balance, which sometimes require inefficient and costly last-minute solutions in the near-real-time timeframe. This chapter considers sources of uncertainty and variability, an overall system uncertainty model, a possible plan for transition from deterministic to probabilistic methods in planning and operations, and two examples of uncertainty-based tools for grid operations. This chapter is based on work conducted at the Pacific Northwest National Laboratory (PNNL)

  12. Uncertainty analysis guide

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
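
    The three representations of uncertainty propagation named in the guide (exact calculation, series approximation, Monte Carlo) can be compared on the smallest possible model, y = a*b with independent Gaussian inputs; the numbers are invented:

```python
import math
import random

random.seed(2)

# Two independent Gaussian inputs (invented means and uncertainties).
a_mu, a_sigma = 10.0, 0.5
b_mu, b_sigma = 4.0, 0.2

# Exact variance of a product of independent random variables.
var_exact = ((a_mu * b_sigma) ** 2 + (b_mu * a_sigma) ** 2
             + (a_sigma * b_sigma) ** 2)

# First-order series (Taylor) approximation drops the last cross term.
var_series = (a_mu * b_sigma) ** 2 + (b_mu * a_sigma) ** 2

# Monte Carlo estimate of the same variance.
n = 100_000
samples = [random.gauss(a_mu, a_sigma) * random.gauss(b_mu, b_sigma)
           for _ in range(n)]
mean_mc = sum(samples) / n
var_mc = sum((s - mean_mc) ** 2 for s in samples) / (n - 1)

print(f"sigma exact  = {math.sqrt(var_exact):.3f}")
print(f"sigma series = {math.sqrt(var_series):.3f}")
print(f"sigma MC     = {math.sqrt(var_mc):.3f}")
```

    For this model the series approximation is nearly exact because the inputs' relative uncertainties are small; Monte Carlo converges to the exact value at the cost of sampling noise.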

  14. Uncertainty and Cognitive Control

    Faisal Mushtaq

    2011-10-01

    A growing trend of neuroimaging, behavioural and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we review evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) there is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) there is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) the perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the need for control; (4) potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  15. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G; Ring, David; Parisien, Robert

    2016-06-01

Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if other factors are involved. With added experience, uncertainty could be expected to diminish, but perhaps more influential are things like physician confidence, belief in the veracity of what is published, and even one's religious beliefs. In addition, it is plausible that the kind of practice a physician works in can affect the experience of uncertainty. Practicing physicians may not be immediately aware of these effects on how uncertainty is experienced in their clinical decision-making. We asked: (1) Do uncertainty and overconfidence bias decrease with years of practice? (2) What sociodemographic factors are independently associated with less recognition of uncertainty, in particular belief in God or other deity or deities, and how is atheism associated with recognition of uncertainty? (3) Do confidence bias (confidence that one's skill is greater than it actually is), degree of trust in the orthopaedic evidence, and degree of statistical sophistication correlate independently with recognition of uncertainty? We created a survey to establish an overall recognition of uncertainty score (four questions), trust in the orthopaedic evidence base (four questions), confidence bias (three questions), and statistical understanding (six questions). Seven hundred six members of the Science of Variation Group, a collaboration that aims to study variation in the definition and treatment of human illness, were approached to complete our survey. This group represents mainly orthopaedic surgeons specializing in trauma or hand and wrist surgery, practicing in Europe and North America, of whom the majority is involved in teaching. Approximately half of the group has more than 10 years

  16. Decision Under Uncertainty in Diagnosis

    Kalme, Charles I.

    2013-01-01

    This paper describes the incorporation of uncertainty in diagnostic reasoning based on the set covering model of Reggia et al., extended to what, in the Artificial Intelligence dichotomy between deep and compiled (shallow, surface) knowledge-based diagnosis, may be viewed as the generic form at the compiled end of the spectrum. A major undercurrent in this is advocating the need for a strong underlying model and an integrated set of support tools for carrying such a model in order to deal with ...

  17. Major depression

    Depression - major; Depression - clinical; Clinical depression; Unipolar depression; Major depressive disorder ... providers do not know the exact causes of depression. It is believed that chemical changes in the ...

  18. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.
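
    Several of the sampling-based methods compared in the UMS combine sampled input uncertainties statistically; in the GRS-style variant, Wilks' formula fixes the number of code runs needed for a distribution-free tolerance limit on the output. A minimal sketch in Python, with a toy stand-in for the thermal-hydraulic code (the model function, parameter names, and ranges below are illustrative assumptions, not values from the study):

```python
import math
import random

def wilks_runs(gamma=0.95, beta=0.95):
    # Smallest n with 1 - gamma**n >= beta: the maximum of n runs then
    # bounds the gamma-quantile of the output with confidence beta.
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

def toy_code(power, gap_width):
    # Stand-in for a best-estimate code: "peak clad temperature" (K),
    # purely illustrative.
    return 600.0 + 0.5 * power + 40.0 / gap_width

random.seed(1)
n = wilks_runs()            # 59 runs for a one-sided 95%/95% limit
outputs = []
for _ in range(n):
    # Sample each uncertain input from its assumed range/distribution.
    power = random.uniform(950.0, 1050.0)        # assumed range
    gap_width = random.triangular(1.5, 2.5, 2.0) # assumed distribution
    outputs.append(toy_code(power, gap_width))

upper_bound = max(outputs)  # one-sided 95/95 upper tolerance limit
print(n, round(upper_bound, 1))
```

    With 59 runs, the largest computed value covers the 95% quantile of the output with 95% confidence regardless of the output distribution, which is why the quantification of the input ranges, not the propagation itself, dominates the differences between methods.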

  19. DS02 uncertainty analysis

    Kaul, Dean C.; Egbert, Stephen D.; Woolson, William A.

    2005-01-01

    In order to avoid the pitfalls that so discredited DS86 and its uncertainty estimates, and to provide DS02 uncertainties that are both defensible and credible, this report not only presents the ensemble uncertainties assembled from uncertainties in individual computational elements and radiation dose components but also describes how these relate to comparisons between observed and computed quantities at critical intervals in the computational process. These comparisons include those between observed and calculated radiation free-field components, where observations include thermal- and fast-neutron activation and gamma-ray thermoluminescence, which are relevant to the estimated systematic uncertainty for DS02. The comparisons also include those between calculated and observed survivor shielding, where the observations consist of biodosimetric measurements for individual survivors, which are relevant to the estimated random uncertainty for DS02. (J.P.N.)

  20. Managing Innovation In View Of The Uncertainties

    Anton Igorevich Mosalev

    2012-12-01

    The study of uncertainty in innovation is at present highly topical. Existing approaches to defining it rest primarily on assumptions and known parameters, which amounts essentially to a game-theoretic approach to assessment. The specific question of how to govern innovation while accounting for uncertainty remains open and highly relevant, especially as innovation is one of the drivers of growth of national economies. This paper presents a methodological approach to determining the degree of uncertainty, and an approach to managing innovation through a system of mathematical modelling based on the criterion of gross errors.

  1. Model uncertainty and probability

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  2. Quantifying the uncertainty in heritability.

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.

  3. Uncertainties in hydrogen combustion

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  4. Uncertainty in hydrological signatures

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
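
    The Monte Carlo approach described above can be sketched as follows: perturb the observed series within an assumed data-uncertainty model, recompute the signature for each realization, and summarize the spread. The series, the multiplicative error bounds, and the choice of runoff ratio as the signature are illustrative assumptions, not values from the study:

```python
import random
import statistics

def runoff_ratio(flow_mm, rain_mm):
    # Signature: total runoff divided by total rainfall over the record.
    return sum(flow_mm) / sum(rain_mm)

random.seed(42)
rain = [5.0, 0.0, 12.0, 3.0, 8.0, 0.0, 20.0, 6.0]  # daily rainfall (mm), illustrative
flow = [1.0, 0.8, 3.5, 1.2, 2.0, 0.9, 6.0, 2.1]    # daily flow (mm), illustrative

samples = []
for _ in range(2000):
    # Assumed error model: +/-10% multiplicative rating-curve uncertainty
    # on flow, +/-5% gauge/areal-averaging uncertainty on rainfall.
    f = [q * random.uniform(0.90, 1.10) for q in flow]
    r = [p * random.uniform(0.95, 1.05) for p in rain]
    samples.append(runoff_ratio(f, r))

samples.sort()
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(round(statistics.mean(samples), 3), round(lo, 3), round(hi, 3))
```

    The resulting 95% interval on the signature can then be compared across catchments or signatures, exactly as the study does for its richer error models.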

  5. Climate Certainties and Uncertainties

    Morel, Pierre

    2012-01-01

    In issue 380 of Futuribles in December 2011, Antonin Pottier analysed in detail the workings of what is today termed 'climate scepticism' - namely the propensity of certain individuals to contest the reality of climate change on the basis of pseudo-scientific arguments. He emphasized particularly that what fuels the debate on climate change is, largely, the degree of uncertainty inherent in the consequences to be anticipated from observation of the facts, not the description of the facts itself. In his view, the main aim of climate sceptics is to block the political measures for combating climate change. However, since they do not admit to this political posture, they choose instead to deny the scientific reality. This month, Futuribles complements this socio-psychological analysis of climate-sceptical discourse with an - in this case, wholly scientific - analysis of what we know (or do not know) about climate change on our planet. Pierre Morel gives a detailed account of the state of our knowledge in the climate field and what we are able to predict in the medium/long-term. After reminding us of the influence of atmospheric meteorological processes on the climate, he specifies the extent of global warming observed since 1850 and the main origin of that warming, as revealed by the current state of knowledge: the increase in the concentration of greenhouse gases. He then describes the changes in meteorological regimes (showing also the limits of climate simulation models), the modifications of hydrological regimes, and also the prospects for rises in sea levels. He also specifies the mechanisms that may potentially amplify all these phenomena and the climate disasters that might ensue. Lastly, he shows what are the scientific data that cannot be disregarded, the consequences of which are now inescapable (melting of the ice-caps, rises in sea level etc.), the only remaining uncertainty in this connection being the date at which these things will happen. 'In this

  6. Accounting for uncertainty in marine reserve design.

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.

  7. Fish remains and humankind: part two

    Andrew K G Jones

    1998-07-01

    The significance of aquatic resources to past human groups is not adequately reflected in the published literature - a deficiency which is gradually being acknowledged by the archaeological community world-wide. The publication of the following three papers goes some way to redress this problem. Originally presented at an International Council of Archaeozoology (ICAZ) Fish Remains Working Group meeting in York, U.K. in 1987, these papers offer clear evidence of the range of interest in ancient fish remains across the world. Further papers from the York meeting were published in Internet Archaeology 3 in 1997.

  8. Uncertainty in social dilemmas

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size

  9. Deterministic uncertainty analysis

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig

  10. Uncertainty and simulation

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss on the new methodologies developed around assessing uncertainty. About 20 papers were presented and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances or multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers

  11. Physical Uncertainty Bounds (PUB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  12. Measurement uncertainty and probability

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  13. Why Agricultural Educators Remain in the Classroom

    Crutchfield, Nina; Ritz, Rudy; Burris, Scott

    2013-01-01

    The purpose of this study was to identify and describe factors that are related to agricultural educator career retention and to explore the relationships between work engagement, work-life balance, occupational commitment, and personal and career factors as related to the decision to remain in the teaching profession. The target population for…

  14. Juveniles' Motivations for Remaining in Prostitution

    Hwang, Shu-Ling; Bedford, Olwen

    2004-01-01

    Qualitative data from in-depth interviews were collected in 1990-1991, 1992, and 2000 with 49 prostituted juveniles remanded to two rehabilitation centers in Taiwan. These data are analyzed to explore Taiwanese prostituted juveniles' feelings about themselves and their work, their motivations for remaining in prostitution, and their difficulties…

  15. Kadav Moun PSA (:60) (Human Remains)

    2010-02-18

    This is an important public health announcement about safety precautions for those handling human remains. Language: Haitian Creole.  Created: 2/18/2010 by Centers for Disease Control and Prevention (CDC).   Date Released: 2/18/2010.

  16. The Annuity Puzzle Remains a Puzzle

    Peijnenburg, J.M.J.; Werker, Bas; Nijman, Theo

    We examine incomplete annuity menus and background risk as possible drivers of divergence from full annuitization. Contrary to what is often suggested in the literature, we find that full annuitization remains optimal if saving is possible after retirement. This holds irrespective of whether real or

  17. Explosives remain preferred methods for platform abandonment

    Pulsipher, A.; Daniel, W. IV; Kiesler, J.E.; Mackey, V. III

    1996-01-01

    Economics and safety concerns indicate that methods involving explosives remain the most practical and cost-effective means for abandoning oil and gas structures in the Gulf of Mexico. A decade has passed since 51 dead sea turtles, many endangered Kemp's Ridleys, washed ashore on the Texas coast shortly after explosives helped remove several offshore platforms. Although no relationship between the explosions and the dead turtles was ever established, in response to widespread public concern, the US Minerals Management Service (MMS) and National Marine Fisheries Service (NMFS) implemented regulations limiting the size and timing of explosive charges. Also, more importantly, they required that operators pay for observers to survey waters surrounding platforms scheduled for removal for 48 hr before any detonations. If observers spot sea turtles or marine mammals within the danger zone, the platform abandonment is delayed until the turtles leave or are removed. However, concern about the effects of explosives on marine life remains

  18. Uncertainty Propagation in OMFIT

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
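
    The python uncertainties package mentioned above performs linear (first-order) propagation of correlated errors; the principle can be sketched without the package by pushing a parameter covariance matrix through partial derivatives. The fitted profile, parameter values, and covariance below are illustrative assumptions, not OMFIT data:

```python
import math

def propagate(grad, cov):
    # First-order variance: sigma_f^2 = g^T C g for gradient g and
    # parameter covariance matrix C.
    n = len(grad)
    return sum(grad[i] * cov[i][j] * grad[j]
               for i in range(n) for j in range(n))

# Toy fitted profile T(x) = a * exp(-x / b) with correlated fit parameters.
a, b = 2.0, 0.5                 # fitted values (illustrative)
cov = [[0.040, -0.0020],        # assumed parameter covariance from the fit
       [-0.0020, 0.0025]]

x = 0.8
T = a * math.exp(-x / b)
# Partial derivatives of T with respect to (a, b):
grad = [math.exp(-x / b), a * x / b**2 * math.exp(-x / b)]
sigma_T = math.sqrt(propagate(grad, cov))
print(round(T, 4), round(sigma_T, 4))
```

    The off-diagonal covariance term is what distinguishes this covariant propagation from naive quadrature of independent errors; ignoring it would give a different (here larger) uncertainty on the profile value.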

  19. CD4 T cells remain the major source of HIV-1 during end stage disease.

    M.E. van der Ende (Marchina); M. Schutten (Martin); B. Raschdorff; G. Grosschupff; P. Racz; A.D.M.E. Osterhaus (Albert); K. Tenner-Racz

    1999-01-01

    OBJECTIVE: To assess the source of HIV-1 production in lymphoid tissue biopsies from HIV-infected patients, with no prior anti-retroviral protease inhibitor treatment, with a CD4 cell count > 150 x 10^6/l (group I) or < 50 x 10^6/l (group II), co-infected with Mycobacterium

  20. The CRC 20 Years: An Overview of Some of the Major Achievements and Remaining Challenges

    Doek, Jaap E.

    2009-01-01

    On 20 November 1989, the General Assembly of the United Nations adopted the Convention on the Rights of the Child (CRC). It entered into force on 2 September 1990 and has by now been ratified by 193 States, making it the most universally ratified human rights treaty. This overview will present and discuss the impact of this treaty both at the…

  1. Vaccine prevention of meningococcal disease in Africa: Major advances, remaining challenges.

    Mustapha, Mustapha M; Harrison, Lee H

    2017-12-06

    Africa historically has had the highest incidence of meningococcal disease with high endemic rates and periodic epidemics. The meningitis belt, a region of sub-Saharan Africa extending from Senegal to Ethiopia, has experienced large, devastating epidemics. However, dramatic shifts in the epidemiology of meningococcal disease have occurred recently. For instance, meningococcal capsular group A (NmA) epidemics in the meningitis belt have essentially been eliminated by use of conjugate vaccine. However, NmW epidemics have emerged and spread across the continent since 2000; NmX epidemics have occurred sporadically, and NmC recently emerged in Nigeria and Niger. Outside the meningitis belt, NmB predominates in North Africa, while NmW followed by NmB predominate in South Africa. Improved surveillance is necessary to address the challenges of this changing epidemiologic picture. A low-cost, multivalent conjugate vaccine covering NmA and the emergent and prevalent meningococcal capsular groups C, W, and X in the meningitis belt is a pressing need.

  2. Verification of uncertainty budgets

    Heydorn, Kaj; Madsen, B.S.

    2005-01-01

    ... and therefore it is essential that the applicability of the overall uncertainty budget to actual measurement results be verified on the basis of current experimental data. This should be carried out by replicate analysis of samples taken in accordance with the definition of the measurand, but representing ... the full range of matrices and concentrations for which the budget is assumed to be valid. In this way the assumptions made in the uncertainty budget can be experimentally verified, both as regards sources of variability that are assumed negligible, and dominant uncertainty components. Agreement between ...

  4. Hospital admission planning to optimize major resources utilization under uncertainty

    Dellaert, N.P.; Jeunet, J.

    2010-01-01

    Admission policies for elective inpatient services mainly result in the management of a single resource: the operating theatre as it is commonly considered as the most critical and expensive resource in a hospital. However, other bottleneck resources may lead to surgery cancellations, such as bed capacity and nursing staff in Intensive Care (IC) units and bed occupancy in wards or medium care (MC) services. Our incentive is therefore to determine a master schedule of a given number of patient...

  5. Evaluating prediction uncertainty

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
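
    The machinery described above, stratified (Latin hypercube) sampling of the inputs plus variance ratios as importance indicators, can be sketched in a few lines. The model and input distributions are illustrative assumptions; this is a crude binned variance ratio, not the paper's full replicated methodology:

```python
import random

def latin_hypercube(n, dims, rng):
    # One stratified draw per interval [k/n, (k+1)/n) in each dimension,
    # with the strata shuffled independently per dimension.
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(k + rng.random()) / n for k in perm])
    return list(zip(*cols))  # n points in [0, 1)^dims

def model(x1, x2, x3):
    # Toy prediction: x1 dominates the output variance by construction.
    return 10.0 * x1 + 2.0 * x2 + 0.1 * x3

def variance(vals):
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

rng = random.Random(0)
pts = latin_hypercube(500, 3, rng)
y = [model(*p) for p in pts]
v_total = variance(y)

# Importance indicator: variance of the conditional mean of y given each
# input, estimated by binning the stratified sample, relative to total
# variance (a simple variance ratio).
ratios = []
for i in range(3):
    bins = [[] for _ in range(10)]
    for p, out in zip(pts, y):
        bins[min(int(p[i] * 10), 9)].append(out)
    cond_means = [sum(b) / len(b) for b in bins if b]
    ratios.append(variance(cond_means) / v_total)
    print(f"x{i+1} variance ratio ~ {ratios[-1]:.2f}")
```

    The dominant input shows a ratio near one while the others stay small, and no linearity assumption was needed to obtain the ranking.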

  6. Uncertainty in oil projects

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two other explanatory variables, the interest rate and Gross Domestic Product, improves the situation. 36 refs., 18 figs., 6 tabs

  7. Uncertainties and climatic change

    De Gier, A.M.; Opschoor, J.B.; Van de Donk, W.B.H.J.; Hooimeijer, P.; Jepma, J.; Lelieveld, J.; Oerlemans, J.; Petersen, A.

    2008-01-01

    Which processes in the climate system are misunderstood? How are scientists dealing with uncertainty about climate change? What will be done with the conclusions of the recently published synthesis report of the IPCC? These and other questions were answered during the meeting 'Uncertainties and climate change' that was held on Monday 26 November 2007 at the KNAW in Amsterdam. This report is a compilation of all the presentations and provides some conclusions resulting from the discussions during this meeting.

  8. Mechanics and uncertainty

    Lemaire, Maurice

    2014-01-01

    Science is a quest for certainty, but lack of certainty is the driving force behind all of its endeavors. This book, specifically, examines the uncertainty of technological and industrial science. Uncertainty and Mechanics studies the concepts of mechanical design in an uncertain setting and explains engineering techniques for inventing cost-effective products. Though it references practical applications, this is a book about ideas and potential advances in mechanical science.

  9. Uncertainty: lotteries and risk

    Ávalos, Eloy

    2011-01-01

    In this paper we develop the theory of uncertainty in a context where the risks assumed by the individual are measurable and manageable. We primarily use the definition of a lottery to formulate the axioms of the individual's preferences, and its representation through the von Neumann-Morgenstern utility function. We study the expected utility theorem and its properties, the paradoxes of choice under uncertainty and, finally, the measures of risk aversion with monetary lotteries.
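
    The expected-utility machinery described above is easy to make concrete: under a concave (risk-averse) von Neumann-Morgenstern utility, a lottery's certainty equivalent lies below its expected value, and the gap is the risk premium. A minimal sketch with an assumed log (CRRA) utility and an illustrative monetary lottery:

```python
import math

def expected_value(lottery):
    # lottery is a list of (probability, outcome) pairs.
    return sum(p * x for p, x in lottery)

def expected_utility(lottery, u):
    return sum(p * u(x) for p, x in lottery)

def certainty_equivalent(lottery, u, u_inv):
    # The sure amount whose utility equals the lottery's expected utility.
    return u_inv(expected_utility(lottery, u))

# Lottery: 50% chance of 100, 50% chance of 25 (illustrative outcomes).
lottery = [(0.5, 100.0), (0.5, 25.0)]

u, u_inv = math.log, math.exp   # log utility: concave, hence risk-averse
ev = expected_value(lottery)                   # 62.5
ce = certainty_equivalent(lottery, u, u_inv)   # geometric mean: 50
risk_premium = ev - ce                         # 12.5
print(ev, round(ce, 2), round(risk_premium, 2))
```

    A risk-neutral (linear) utility would make the certainty equivalent coincide with the expected value and the risk premium vanish, which is the boundary case between the aversion measures studied in the paper.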

  10. Uncertainty calculations made easier

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  11. Communicating uncertainty media coverage of new and controversial science

    Dunwoody, Sharon; Rogers, Carol L

    1997-01-01

    This work, by the editors of "Scientists and Journalists: Reporting Science as News", explores scientific uncertainty and media coverage of it in such major public issues as AIDS, biotechnology, dioxin, global warming, and nature vs. nurture. It examines the interrelations of the major actors in constructing and explaining uncertainty: scientists, journalists, scholars, and the larger public. Part 1 examines participants in the scientific uncertainty arena and how the major actors react to, cope with and manage uncertain issues. It also describes how scientists and journalists vie for control over uncertain science. The panel discussion at the end of this section is a spirited discourse on how they handle scientific uncertainty. Part 2 explores instances of scientific uncertainty in the public arena, highlighting studies involving uncertainty and biotechnology, dioxin, human resources for science, and human behaviour. The panel discussion concluding this section reacts to several of these specific issues and ...

  12. Industry remains stuck in a transitional mode

    Garb, F.A.

    1991-01-01

    The near future for industry remains foggy for several obvious reasons. The shake-up of the Soviet Union and how the pieces will reform remains unclear. How successful efforts to privatize government oil company operations around the world will be has yet to be determined. A long sought peace in the Middle East seems to be inching closer, but will this continue? If it does continue, what impact will it have on world energy policy? Will American companies, which are now transferring their attention to foreign E and P, also maintain an interest in domestic activities? Is the U.S. economy really on the upswing? We are told that the worst of the recession is over, but try telling this to thousands of workers in the oil patch who are being released monthly by the big players in domestic operations. This paper reports that 1992 should be a better year than 1991, if measured in opportunity. There are more exploration and acquisition options available, both domestically and internationally, than there have been in years. Probably more opportunities exist than there are players - certainly more than can be funded with current financial resources

  13. US GAAP vs. IFRS – A COMPARISON OF REMAINING DIFFERENCES

    Mihelčić, Eva

    2008-01-01

    In spite of the ongoing harmonization process, there are still some differences between US GAAP and IFRS. Currently, companies listed on the New York Stock Exchange that report according to IFRS must still prepare a reconciliation to US GAAP, to show financial statements compliant with US GAAP as well. This article presents an overview of the remaining major differences between US GAAP and IFRS, in descriptive as well as tabular form. First, the standards compared are shortly intr...

  14. Evacuation decision-making: process and uncertainty

    Mileti, D.; Sorensen, J.; Bogard, W.

    1985-09-01

    The purpose was to describe the process of evacuation decision-making, identify and document uncertainties in that process, and discuss implications for federal assumption of liability for precautionary evacuations at nuclear facilities under the Price-Anderson Act. Four major categories of uncertainty are identified, concerning the interpretation of hazard, communication problems, perceived impacts of evacuation decisions, and exogenous influences. Over 40 historical accounts are reviewed and cases of these uncertainties are documented. The major findings are, first, that all levels of government, including federal agencies, experience uncertainties in some evacuation situations. Second, private sector organizations are subject to uncertainties at a variety of decision points. Third, uncertainties documented in the historical record have provided grounds for liability, although few legal actions have ensued. Finally, it is concluded that if liability for evacuations is assumed by the federal government, the concept of a "precautionary" evacuation is not useful in establishing criteria for that assumption. 55 refs., 1 fig., 4 tabs.

  15. Uncertainty and global climate change research

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics, and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the most simple and important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments using decision analytic techniques as a foundation are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  16. Major Links.

    Henderson, Tona

    1995-01-01

    Provides electronic mail addresses for resources and discussion groups related to the following academic majors: art, biology, business, chemistry, computer science, economics, health sciences, history, literature, math, music, philosophy, political science, psychology, sociology, and theater. (AEF)

  17. Major Roads

    Minnesota Department of Natural Resources — This data set contains roadway centerlines for major roads (interstates and trunk highways) found on the USGS 1:24,000 mapping series. These roadways are current...

  18. Quantifying uncertainty in LCA-modelling of waste management systems

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has seen major progress in recent years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  19. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Elert, M.

    1996-09-01

    deterministic case, and the uncertainty bands did not always overlap. This suggests that there are considerable model uncertainties present which were not considered in this study. Concerning possible constraints on the application domain of different models, the results of this exercise suggest that if only the evolution of the root zone concentration is to be predicted, all of the studied models give comparable results. However, if the flux to the groundwater is also to be predicted, then a considerably greater amount of detail is needed in the model and the parameterization. This applies to the hydrological as well as the transport modelling. The difference in model predictions and the magnitude of uncertainty was quite small for some of the end-points predicted, while for others it could span many orders of magnitude. Of special importance were end-points where delay in the soil was involved, e.g. release to the groundwater. In such cases the influence of radioactive decay gave rise to strongly non-linear effects. The work in the subgroup has provided many valuable insights into the effects of model simplifications, e.g. discretization in the model, averaging of the time-varying input parameters and the assignment of uncertainties to parameters. The conclusions that have been drawn concerning these are primarily valid for the studied scenario. However, we believe that they are to a large extent also generally applicable. The subgroup has had many opportunities to study the pitfalls involved in model comparison. The intention was to provide a well-defined scenario for the subgroup, but despite several iterations misunderstandings and ambiguities remained. The participants have been forced to scrutinize their models to try to explain differences in the predictions, and most, if not all, of the participants have improved their models as a result of this

  20. Systematic uncertainties in direct reaction theories

    Lovell, A E; Nunes, F M

    2015-01-01

    Nuclear reactions are common probes to study nuclei, in particular nuclei at the limits of stability. The data from reaction measurements depend strongly on theory for a reliable interpretation. Even when using state-of-the-art reaction theories, there are a number of sources of systematic uncertainty. These uncertainties are often unquantified or estimated in a very crude manner. It is clear that for theory to be useful, a quantitative understanding of the uncertainties is critical. Here, we discuss major sources of uncertainty in a variety of reaction theories used to analyze (d,p) nuclear reactions in the energy range E_d = 10-20 MeV, and we provide a critical view on how these have been handled in the past and how estimates can be improved. (paper)

  1. Shotgun microbial profiling of fossil remains

    Der Sarkissian, Clio; Ermini, Luca; Jónsson, Hákon

    2014-01-01

    the specimen of interest, but instead reflect environmental organisms that colonized the specimen after death. Here, we characterize the microbial diversity recovered from seven c. 200- to 13 000-year-old horse bones collected from northern Siberia. We use a robust, taxonomy-based assignment approach...... to identify the microorganisms present in ancient DNA extracts and quantify their relative abundance. Our results suggest that molecular preservation niches exist within ancient samples that can potentially be used to characterize the environments from which the remains are recovered. In addition, microbial...... community profiling of the seven specimens revealed site-specific environmental signatures. These microbial communities appear to comprise mainly organisms that colonized the fossils recently. Our approach significantly extends the amount of useful data that can be recovered from ancient specimens using...

  2. Some remaining problems in HCDA analysis

    Chang, Y.W.

    1981-01-01

    The safety assessment and licensing of liquid-metal fast breeder reactors (LMFBRs) requires an analysis of the capability of the reactor primary system to sustain the consequences of a hypothetical core-disruptive accident (HCDA). Although computational methods and computer programs developed for HCDA analyses can predict reasonably well the response of the primary containment system, and follow the phenomena of an HCDA from the start of the excursion to the time of dynamic equilibrium in the system, there remain areas in HCDA analysis that merit further analytical and experimental study. These are the analysis of fluid impact on the reactor cover, three-dimensional analysis, the treatment of perforated plates, material properties under high strain rates and high temperatures, the treatment of multifield flows, and the treatment of prestressed concrete reactor vessels. The purpose of this paper is to discuss the structural mechanics of HCDA analysis in these areas where improvements are needed.

  3. Political, energy events will remain interwoven

    Jones, D.P.

    1991-01-01

    This paper reports that it is possible to discuss the significance of political and energy events separately, but, in truth, they are intricately interwoven. Furthermore, there are those who will argue that since the two are inseparable, the future is not predictable; so why bother in the endeavor? It is possible that the central point of the exercise may have been missed - yes, the future is unpredictable! However, the objective of prediction is secondary. The objective of understanding the dynamic forces of change is primary! With this view of recent history, it is perhaps appropriate to pause and think about the future of the petroleum industry. The future, as shaped by political, energy, economic, environmental and technological forces, will direct our lives and markets during this decade. Most importantly, what direction will successful businesses take to remain competitive in a global environment? These are interesting issues worthy of provocative thoughts and innovative ideas

  4. Nuclear remains an economic and ecologic asset

    Le Ngoc, Boris

    2015-01-01

    The author outlines the several benefits of nuclear energy and the nuclear industry for France. He first notes that France's electricity is 97 per cent carbon-free thanks to nuclear energy (77 pc) and renewable energies (20 pc, mainly hydraulic), and that renewable energies must be developed in the building and transport sectors in order to move away from environmentally and financially costly fossil energies. He points out that reactor maintenance and the nuclear fuel cycle industry are fields of technological leadership for the French nuclear industry, which is, after the motor and aircraft industries, the third-largest industrial sector in France. He indicates that nuclear electricity is to remain the most competitive, and that nuclear energy and renewable energies must not be opposed but considered complementary in the struggle against climate change, i.e. to reduce greenhouse gas emissions and to end the prevalence of fossil energies

  5. Population cycles: generalities, exceptions and remaining mysteries

    2018-01-01

    Population cycles are one of nature's great mysteries. For almost a hundred years, innumerable studies have probed the causes of cyclic dynamics in snowshoe hares, voles and lemmings, forest Lepidoptera and grouse. Even though cyclic species have very different life histories, similarities in mechanisms related to their dynamics are apparent. In addition to high reproductive rates and density-related mortality from predators, pathogens or parasitoids, other characteristics include transgenerational reduced reproduction and dispersal with increasing-peak densities, and genetic similarity among populations. Experiments to stop cyclic dynamics and comparisons of cyclic and noncyclic populations provide some understanding but both reproduction and mortality must be considered. What determines variation in amplitude and periodicity of population outbreaks remains a mystery. PMID:29563267

  6. The Human Remains from HMS Pandora

    D.P. Steptoe

    2002-04-01

    Full Text Available In 1977 the wreck of HMS Pandora (the ship that was sent to re-capture the Bounty mutineers) was discovered off the north coast of Queensland. Since 1983, the Queensland Museum Maritime Archaeology section has carried out systematic excavation of the wreck. During the years 1986 and 1995-1998, more than 200 human bones and bone fragments were recovered. Osteological investigation revealed that this material represented three males. Their ages were estimated at approximately 17 +/-2 years, 22 +/-3 years and 28 +/-4 years, with statures of 168 +/-4 cm, 167 +/-4 cm, and 166 +/-3 cm respectively. All three individuals were probably Caucasian, although precise determination of ethnicity was not possible. In addition to poor dental hygiene, signs of chronic diseases suggestive of rickets and syphilis were observed. Evidence of spina bifida was seen on one of the skeletons, as were other skeletal anomalies. Various taphonomic processes affecting the remains were also observed and described. Compact bone was observed under the scanning electron microscope and found to be structurally coherent. Profiles of the three skeletons were compared with historical information about the 35 men lost with the ship, but no precise identification could be made. The investigation did not reveal the cause of death. Further research, such as DNA analysis, is being carried out at the time of publication.

  7. SMART POINT CLOUD: DEFINITION AND REMAINING CHALLENGES

    F. Poux

    2016-10-01

    Dealing with coloured point clouds acquired from terrestrial laser scanners, this paper identifies remaining challenges for a new data structure: the smart point cloud. This concept arises from the observation that massive, discretized spatial information from active remote sensing technology is often underused due to data mining limitations. The generalisation of point cloud data, associated with the heterogeneity and temporality of such datasets, is the main issue regarding structure, segmentation, classification, and interaction for an immediate understanding. We propose to use both point cloud properties and human knowledge, through machine learning, to rapidly extract pertinent information, using user-centered information (smart data) rather than raw data. A review of feature detection, machine learning frameworks and database systems indexed both for mining queries and data visualisation is presented. Based on existing approaches, we propose a new flexible 3-block framework around device expertise, analytic expertise and domain-based reflection. This contribution serves as the first step towards the realisation of a comprehensive smart point cloud data structure.

  8. Efforts to standardize wildlife toxicity values remain unrealized.

    Mayfield, David B; Fairbrother, Anne

    2013-01-01

    Wildlife toxicity reference values (TRVs) are routinely used during screening-level and baseline ecological risk assessments (ERAs). Risk assessment professionals often adopt TRVs from published sources to expedite risk analyses. The US Environmental Protection Agency (USEPA) developed ecological soil screening levels (Eco-SSLs) to provide a source of TRVs that would improve consistency among risk assessments. We surveyed and evaluated more than 50 publicly available, large-scale ERAs published in the last decade to assess whether USEPA's goal of uniformity in the use of wildlife TRVs has been met. In addition, these ERAs were reviewed to understand current practices for wildlife TRV use and development within the risk assessment community. The use of no-observed and lowest-observed adverse effect levels culled from published compendia was common practice in the majority of ERAs reviewed. We found increasing use over time of TRVs established in the Eco-SSL documents; however, Eco-SSL TRVs were not used in the majority of recent ERAs, and there continues to be wide variation in TRVs for commonly studied contaminants (e.g., metals, pesticides, PAHs, and PCBs). Variability in the toxicity values was driven by differences in the key studies selected, dose estimation methods, and use of uncertainty factors. These differences result in TRVs that span multiple orders of magnitude for many of the chemicals examined. This lack of consistency in TRV development leads to highly variable results in ecological risk assessments conducted throughout the United States. Copyright © 2012 SETAC.

  9. Uncertainty modeling process for semantic technology

    Rommel N. Carvalho

    2016-08-01

    The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature on formalisms for representing uncertainty in ontologies has been growing, there remains little guidance in the knowledge engineering literature on how to design probabilistic ontologies. To address this gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  10. Dealing with exploration uncertainties

    Capen, E.

    1992-01-01

    Exploration for oil and gas should satisfy even the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors use a language that may be unfamiliar to some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side
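The probabilistic framing the authors describe - tying chance estimates and reserves to economics - is commonly illustrated with an expected monetary value (EMV) calculation. The sketch below is not from the paper; the prospect numbers are invented for illustration:

```python
def expected_monetary_value(p_success, npv_success, dry_hole_cost):
    """Expected monetary value (EMV) of a single exploration prospect.

    p_success     -- geologic chance of success, between 0 and 1
    npv_success   -- net present value if the well succeeds ($MM)
    dry_hole_cost -- cost written off if the well is dry ($MM)
    """
    return p_success * npv_success - (1 - p_success) * dry_hole_cost

# Hypothetical prospect: a 1-in-5 chance of a $50MM discovery
# against a $6MM dry-hole cost.
emv = expected_monetary_value(0.20, 50.0, 6.0)
print(f"EMV = ${emv:.1f}MM")  # prints "EMV = $5.2MM"
```

A portfolio of many such prospects tends to show a narrower relative spread of outcomes than any single well, which is the portfolio effect the abstract alludes to.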

  11. A review of uncertainty research in impact assessment

    Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.

    2015-01-01

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  13. What remains of the Arrow oil?

    Sergy, G.; Owens, E.

    1993-01-01

    In February 1970, the tanker Arrow became grounded 6.5 km off the north shore of Chedabucto Bay, Nova Scotia, and nearly 72,000 bbl of Bunker C fuel oil were released from the vessel during its subsequent breakup and sinking. The oil was washed ashore in various degrees over an estimated 305 km of the bay's 604-km shoreline, of which only 48 km were cleaned. In addition, the tanker Kurdistan broke in two in pack ice in March 1979 in the Cabot Strait area, spilling ca 54,000 bbl of Bunker C, some of which was later found at 16 locations along the northeast and east shorelines of Chedabucto Bay. In summer 1992, a systematic ground survey of the bay's shorelines was conducted using Environment Canada Shoreline Cleanup Assessment Team (SCAT) procedures. Standard observations were made of oil distribution and width, thickness, and character of the oil residues in 419 coastal segments. Results from the survey are summarized. Oil was found to be present on 13.3 km of the shoreline, with heavy oiling restricted to 1.3 km primarily in the areas of Black Duck Cove and Lennox Passage. Some of this residual oil was identified as coming from the Arrow. Natural weathering processes account for removal of most of the spilled oil from the bay. Oil remaining on the shore was found in areas outside of the zone of physical wave action, in areas of nearshore mixing where fine sediments are not present to weather the oil through biophysical processes, or in crusts formed by oil weathered on the surface. The systematic description of oiled shorelines using the SCAT methodology proved very successful, even for such an old spill. 6 refs

  14. Ghost Remains After Black Hole Eruption

    2009-05-01

    NASA's Chandra X-ray Observatory has found a cosmic "ghost" lurking around a distant supermassive black hole. This is the first detection of such a high-energy apparition, and scientists think it is evidence of a huge eruption produced by the black hole. This discovery presents astronomers with a valuable opportunity to observe phenomena that occurred when the Universe was very young. The X-ray ghost, so called because a diffuse X-ray source has remained after other radiation from the outburst has died away, is in the Chandra Deep Field-North, one of the deepest X-ray images ever taken. The source, a.k.a. HDF 130, is over 10 billion light years away and existed at a time 3 billion years after the Big Bang, when galaxies and black holes were forming at a high rate. "We'd seen this fuzzy object a few years ago, but didn't realize until now that we were seeing a ghost", said Andy Fabian of Cambridge University in the United Kingdom. "It's not out there to haunt us; rather, it's telling us something - in this case what was happening in this galaxy billions of years ago." Fabian and colleagues think the X-ray glow from HDF 130 is evidence of a powerful outburst from its central black hole in the form of jets of energetic particles traveling at almost the speed of light. When the eruption was ongoing, it produced prodigious amounts of radio and X-radiation, but after several million years, the radio signal faded from view as the electrons radiated away their energy. [Image: Chandra X-ray image of HDF 130] However, less energetic electrons can still produce X-rays by interacting with the pervasive sea of photons remaining from the Big Bang - the cosmic background radiation. Collisions between these electrons and the background photons can impart enough energy to the photons to boost them into the X-ray energy band. This process produces an extended X-ray source that lasts for another 30 million years or so. "This ghost tells us about the black hole's eruption long after

  15. Capital flight and the uncertainty of government policies

    Hermes, C.L.M.; Lensink, B.W.

    This paper shows that policy uncertainty, measured by the uncertainty of budget deficits, tax payments, government consumption and the inflation rate, has a statistically significant positive impact on capital flight. This result remains robust after stability tests have been applied. (C) 2001

  16. Uncertainty in artificial intelligence

    Levitt, TS; Lemmer, JF; Shachter, RD

    1990-01-01

    Clearly illustrated in this volume is the current relationship between Uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to provide intelligent behavior? How can the behavior be explained? In this volume, all of these questions are addressed. From the perspective of the relationship of uncertainty to the basic questions of AI, the book divides naturally i...

  17. Sensitivity and uncertainty analysis

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  18. Integrating uncertainties for climate change mitigation

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping the global average temperature increase below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge of how the climate system reacts to GHG emissions (geophysical uncertainties), of how society will develop (social uncertainties and choices), of which technologies will be available (technological uncertainty and choices), of when we choose to start acting globally on climate change (political choices), and of how much money we are or are not willing to spend to achieve climate change mitigation. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by

  19. Uncertainty quantification for environmental models

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
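    The screening methods listed above can be sketched in a few lines. Below is a minimal, illustrative one-at-a-time screening in the spirit of the Morris elementary-effects method; the model function, parameter ranges, and trajectory count are hypothetical stand-ins, not taken from the cited studies.

```python
import random

def model(x):
    # Hypothetical environmental-model stand-in: a cheap nonlinear function.
    return x[0] ** 2 + 2.0 * x[0] * x[1] + 0.1 * x[2]

def elementary_effects(model, n_params, n_trajectories=50, delta=0.1):
    """Morris-style one-at-a-time screening: perturb each parameter in turn
    from random base points and record the size of the output change."""
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        base = [random.random() for _ in range(n_params)]
        f0 = model(base)
        for i in range(n_params):
            pert = list(base)
            pert[i] += delta
            effects[i].append(abs(model(pert) - f0) / delta)
    # mu* (mean absolute elementary effect) ranks parameter importance.
    return [sum(e) / len(e) for e in effects]

random.seed(0)
mu_star = elementary_effects(model, 3)
# Expect x0 and x1 to screen as influential and x2 as nearly inert,
# since the model's sensitivity to x2 is a constant 0.1.
```

Screening like this is typically used to discard unimportant parameters before spending expensive model runs on variance-based methods such as Sobol' indices.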

  20. Uncertainties of the ultrasonic thickness gauging (UTTG)

    Mohamad Pauzi Ismail; Yassir Yassen; Amry Amin Abas

    2009-04-01

    The reliability of UTTG was questioned by a senior staff member from DOSH in a paper presented during the third NDT and Corrosion Management Asia Conference and Exhibition, 4-5 September 2007 at the Istana Hotel, Kuala Lumpur. The term 'thickness growth' is an issue that needs to be solved by the NDT community. The technique used by many practitioners gives rise to serious shortcomings in both probability of detection and accuracy of remaining-wall assessment. This paper explains and discusses uncertainty measurement, based on the ISO Guide to the Expression of Uncertainty in Measurement (GUM) (1), of real UTTG data obtained from the chemical industry. (author)

  1. Uncertainty in project phases: A framework for organisational change management

    Kreye, Melanie; Balangalibun, Sarah

    2015-01-01

    Uncertainty is an integral challenge when managing organisational change projects (OCPs). Current literature highlights the importance of uncertainty; however, it falls short of giving insights into the nature of uncertainty and suggestions for managing it. Specifically, no insights exist on how uncertainty develops over the different phases of OCPs. This paper presents case-based evidence on different sources of uncertainty in OCPs and how these develop over the different project phases. The results showed some surprising findings, as the majority of the uncertainty did not manifest itself in the early stage of the change project but was delayed until later phases. Furthermore, the sources of uncertainty were found to be predominantly within the organisation that initiated the change project and connected to the project scope. Based on these findings, propositions for future research are defined.

  2. Uncertainty Analyses and Strategy

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways, using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  3. Predicting intolerance of uncertainty in individuals with eating disorder symptoms

    Sternheim, Lot C; Fisher, Martin; Harrison, Amy; Watling, Rosamond

    2017-01-01

    BACKGROUND: Intolerance of Uncertainty (IU) is recognized for its contribution to various psychopathologies, in particular anxiety and depression. Studies highlight the relevance of IU for Eating Disorders (EDs); however, potential factors contributing to IU in EDs remain unstudied. METHODS: Three

  4. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 3: Temperature uncertainty budget

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. 
Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is
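    The combination step described above, where independently propagated components are merged only at the last stage of processing, reduces for uncorrelated components to a root-sum-square, as in the GUM. A minimal sketch follows; the component names and magnitudes are purely hypothetical, not JPL values.

```python
import math

# Hypothetical per-component standard uncertainties for one altitude bin (K).
components = {
    "detection_noise": 0.8,
    "saturation_correction": 0.3,
    "background_extraction": 0.2,
    "tie_on_temperature": 0.5,
    "ozone_absorption": 0.1,
}

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent uncertainty components,
    per the GUM for uncorrelated inputs with unit sensitivity coefficients."""
    return math.sqrt(sum(u ** 2 for u in components.values()))

u_c = combined_standard_uncertainty(components)
```

When components are correlated in the vertical, as the abstract notes for everything except detection noise, covariance terms must be added to the sum before taking the square root.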

  5. Uncertainties in repository modeling

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  7. Risks, uncertainty, vagueness

    Haefele, W.; Renn, O.; Erdmann, G.

    1990-01-01

    The notion of 'risk' is discussed in its social and technological contexts, leading to an investigation of the terms factuality, hypotheticality, uncertainty, and vagueness, and to the problems of acceptance and acceptability especially in the context of political decision finding. (DG)

  8. Strategy under uncertainty.

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budgeting processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  9. Tuberculosis remains a challenge despite economic growth in Panama.

    Tarajia, M; Goodridge, A

    2014-03-01

    Tuberculosis (TB) is a disease associated with inequality, and wise investment of economic resources is considered critical to its control. Panama has recently secured its status as an upper-middle-income country with robust economic growth. However, the prioritisation of resources for TB control remains a major challenge. In this article, we highlight areas that urgently require action to effectively reduce TB burden to minimal levels. Our conclusions suggest the need for fund allocation and a multidisciplinary approach to ensure prompt laboratory diagnosis, treatment assurance and workforce reinforcement, complemented by applied and operational research, development and innovation.

  10. Uncertainties in predicting solar panel power output

    Anspaugh, B.

    1974-01-01

    The problem of calculating solar panel power output at launch and during a space mission is considered. The major sources of uncertainty and error in predicting the post launch electrical performance of the panel are considered. A general discussion of error analysis is given. Examples of uncertainty calculations are included. A general method of calculating the effect on the panel of various degrading environments is presented, with references supplied for specific methods. A technique for sizing a solar panel for a required mission power profile is developed.

  11. Political uncertainty and firm risk in China

    Danglun Luo

    2017-12-01

    The political uncertainty surrounding the turnover of government officials has a major impact on local economies and local firms. This paper investigates the relationship between the turnover of prefecture-city officials and the inherent risk faced by local firms in China. Using data from 1999 to 2012, we find that prefecture-city official turnovers significantly increased firm risk. Our results show that the political risk was mitigated when new prefecture-city officials were well connected with their provincial leaders. In addition, the impact of political uncertainty was more pronounced for regulated firms and firms residing in provinces with low market openness.

  12. Analysis of uncertainty in modeling perceived risks

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)
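    The lognormal treatment described above can be sketched directly: a perceived risk modeled as a base technical risk inflated by multiplicative perception factors, each lognormal so as to span potentially large uncertainty ranges. The base risk, the number of factors, and their spreads below are hypothetical illustrations, not the paper's calibration.

```python
import random

def perceived_risk(base_risk, factor_sigmas, rng):
    """Multiply a base (technical) risk estimate by lognormal perception
    factors; each factor has median 1, so the median perceived risk equals
    the base risk while the spread can cover orders of magnitude."""
    r = base_risk
    for sigma in factor_sigmas:
        r *= rng.lognormvariate(0.0, sigma)
    return r

rng = random.Random(3)
# Hypothetical: base risk 1e-6 per year, two perception factors.
samples = sorted(perceived_risk(1e-6, [0.5, 1.0], rng) for _ in range(20000))
median = samples[10000]
p95 = samples[19000]
```

The asymmetry between the median and the upper percentiles is the point: lognormal perception factors leave the typical judgment anchored at the technical estimate while allowing a heavy upper tail of amplified perceived risk.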

  13. On the relationship between aerosol model uncertainty and radiative forcing uncertainty.

    Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S

    2016-05-24

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  14. Uncertainties about climate

    Laval, Katia; Laval, Guy

    2013-01-01

    Like meteorology, climatology is not an exact science: climate change forecasts necessarily include a share of uncertainty. It is precisely this uncertainty which is brandished and exploited by the opponents of the global warming theory to call into question the estimations of its future consequences. Is it legitimate to predict the future using past climate data (well documented up to 100,000 years BP) or the climates of other planets, given the imprecision of the measurements and the intrinsic complexity of the Earth's machinery? How is it possible to model so huge and interwoven a system, for which any exact description has become impossible? Why do water and precipitation play such an important role in local and global forecasts, and how should they be treated? This book, written by two physicists, answers these delicate questions simply, so that anyone can form his or her own opinion about global warming and the need to act rapidly

  15. Functional neuroimaging of belief, disbelief, and uncertainty.

    Harris, Sam; Sheth, Sameer A; Cohen, Mark S

    2008-02-01

    The difference between believing and disbelieving a proposition is one of the most potent regulators of human behavior and emotion. When one accepts a statement as true, it becomes the basis for further thought and action; rejected as false, it remains a string of words. The purpose of this study was to differentiate belief, disbelief, and uncertainty at the level of the brain. We used functional magnetic resonance imaging (fMRI) to study the brains of 14 adults while they judged written statements to be "true" (belief), "false" (disbelief), or "undecidable" (uncertainty). To characterize belief, disbelief, and uncertainty in a content-independent manner, we included statements from a wide range of categories: autobiographical, mathematical, geographical, religious, ethical, semantic, and factual. The states of belief, disbelief, and uncertainty differentially activated distinct regions of the prefrontal and parietal cortices, as well as the basal ganglia. Belief and disbelief differ from uncertainty in that both provide information that can subsequently inform behavior and emotion. The mechanism underlying this difference appears to involve the anterior cingulate cortex and the caudate. Although many areas of higher cognition are likely involved in assessing the truth-value of linguistic propositions, the final acceptance of a statement as "true" or its rejection as "false" appears to rely on more primitive, hedonic processing in the medial prefrontal cortex and the anterior insula. Truth may be beauty, and beauty truth, in more than a metaphorical sense, and false propositions may actually disgust us.

  16. Information theoretic quantification of diagnostic uncertainty.

    Westover, M Brandon; Eiseman, Nathaniel A; Cash, Sydney S; Bianchi, Matt T

    2012-01-01

    Diagnostic test interpretation remains a challenge in clinical practice. Most physicians receive training in the use of Bayes' rule, which specifies how the sensitivity and specificity of a test for a given disease combine with the pre-test probability to quantify the change in disease probability incurred by a new test result. However, multiple studies demonstrate physicians' deficiencies in probabilistic reasoning, especially with unexpected test results. Information theory, a branch of probability theory dealing explicitly with the quantification of uncertainty, has been proposed as an alternative framework for diagnostic test interpretation, but is even less familiar to physicians. We have previously addressed one key challenge in the practical application of Bayes theorem: the handling of uncertainty in the critical first step of estimating the pre-test probability of disease. This essay aims to present the essential concepts of information theory to physicians in an accessible manner, and to extend previous work regarding uncertainty in pre-test probability estimation by placing this type of uncertainty within a principled information theoretic framework. We address several obstacles hindering physicians' application of information theoretic concepts to diagnostic test interpretation. These include issues of terminology (mathematical meanings of certain information theoretic terms differ from clinical or common parlance) as well as the underlying mathematical assumptions. Finally, we illustrate how, in information theoretic terms, one can understand the effect on diagnostic uncertainty of considering ranges instead of simple point estimates of pre-test probability.
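    The Bayesian update and the entropy-based view of diagnostic uncertainty described above can be made concrete. The sketch below uses the generic textbook formulas with hypothetical test characteristics; it is not the authors' implementation.

```python
import math

def post_test_probability(pre, sens, spec, positive=True):
    """Bayes' rule for a binary diagnostic test: update the pre-test
    probability of disease given a positive or negative result."""
    if positive:
        true_pos = sens * pre
        false_pos = (1.0 - spec) * (1.0 - pre)
        return true_pos / (true_pos + false_pos)
    false_neg = (1.0 - sens) * pre
    true_neg = spec * (1.0 - pre)
    return false_neg / (false_neg + true_neg)

def diagnostic_entropy(p):
    """Binary Shannon entropy in bits: residual diagnostic uncertainty."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

# Hypothetical test: sensitivity 0.9, specificity 0.8, pre-test prob 0.10.
p_post = post_test_probability(pre=0.1, sens=0.9, spec=0.8)  # = 1/3
h_pre = diagnostic_entropy(0.1)
h_post = diagnostic_entropy(p_post)
# Note: this positive result moves the probability toward 1/2, so the
# entropy (point uncertainty) rises even though disease became likelier.
```

This is exactly the kind of counterintuitive behavior the essay targets: a "positive" test can increase, not decrease, diagnostic uncertainty in the information-theoretic sense.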

  17. The uncertainty principle

    Martens, Hans.

    1991-01-01

    The subject of this thesis is the uncertainty principle (UP). The UP is one of the most characteristic points of difference between quantum and classical mechanics. The starting point of this thesis is the work of Niels Bohr, which is both discussed and analyzed. For the discussion of the different aspects of the UP, the formalism of Davies and Ludwig is used instead of the more commonly used formalism of von Neumann and Dirac. (author). 214 refs.; 23 figs

  18. Decision Making Under Uncertainty

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those alternatives. ... Decision makers often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to ... which leads to "anchoring" on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions ...

  19. Economic uncertainty principle?

    Alexander Harin

    2006-01-01

    The economic principle of (hidden) uncertainty is presented. New probability formulas are offered. Examples of solutions of three types of fundamental problems are reviewed.

  20. Citizen Candidates Under Uncertainty

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  1. Calibration Under Uncertainty.

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.

  2. Participation under Uncertainty

    Boudourides, Moses A.

    2003-01-01

    This essay reviews a number of theoretical perspectives about uncertainty and participation in the present-day knowledge-based society. After discussing the on-going reconfigurations of science, technology and society, we examine how appropriate for policy studies are various theories of social complexity. Post-normal science is such an example of a complexity-motivated approach, which justifies civic participation as a policy response to an increasing uncertainty. But there are different categories and models of uncertainties implying a variety of configurations of policy processes. A particular role in all of them is played by expertise whose democratization is an often-claimed imperative nowadays. Moreover, we discuss how different participatory arrangements are shaped into instruments of policy-making and framing regulatory processes. As participation necessitates and triggers deliberation, we proceed to examine the role and the barriers of deliberativeness. Finally, we conclude by referring to some critical views about the ultimate assumptions of recent European policy frameworks and the conceptions of civic participation and politicization that they invoke

  3. Uncertainty analysis techniques

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, whose values may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves, determining their convergence and confidence limits. Annual doses, expectation values of the doses, and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its greater robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
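    A Monte Carlo propagation of the kind reviewed above, showing why the report contrasts the arithmetic mean with percentiles such as the 90th, can be sketched as follows. The two-parameter "dose model" and its lognormal input distributions are hypothetical stand-ins, chosen only so the output spans orders of magnitude.

```python
import random
import statistics

def dose_model(k, q):
    # Hypothetical dose-model stand-in: a product of uncertain inputs.
    return k * q

def monte_carlo(n=20000, seed=1):
    """Sample uncertain inputs from their distributions, run the model,
    and summarize the output distribution by mean, median, and p90."""
    rng = random.Random(seed)
    outputs = sorted(
        dose_model(rng.lognormvariate(0.0, 1.0), rng.lognormvariate(0.0, 0.5))
        for _ in range(n)
    )
    return {
        "mean": statistics.fmean(outputs),
        "median": outputs[n // 2],
        "p90": outputs[int(0.9 * n)],
    }

summary = monte_carlo()
# For lognormal-like inputs the arithmetic mean sits well above the median,
# which is why a percentile can be the more robust summary statistic.
```

The skewness visible in such outputs is the practical reason the report favors the 90th percentile over the arithmetic mean when comparing doses with acceptance criteria.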

  4. Deterministic uncertainty analysis

    Worley, B.A.

    1987-12-01

    This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions, compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
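    A first-order, derivative-based propagation of the kind underlying DUA can be sketched with the delta method. The two-parameter "flow" function below is a hypothetical stand-in for the borehole model, and the derivatives are approximated by central finite differences rather than the direct or adjoint techniques the paper assumes.

```python
import math

def flow_rate(k, h):
    # Hypothetical stand-in for the borehole-flow response: nonlinear in k.
    return k ** 0.5 * h

def first_order_variance(f, x0, sigmas, eps=1e-6):
    """Delta-method propagation: var(y) ~ sum_i (df/dx_i)^2 * var(x_i),
    with partial derivatives taken by central finite differences at x0."""
    var = 0.0
    for i, s in enumerate(sigmas):
        hi, lo = list(x0), list(x0)
        hi[i] += eps
        lo[i] -= eps
        dfdx = (f(*hi) - f(*lo)) / (2.0 * eps)
        var += (dfdx * s) ** 2
    return var

# Hypothetical nominal point and input standard deviations.
sigma_y = math.sqrt(first_order_variance(flow_rate, [4.0, 2.0], [0.4, 0.1]))
```

Because each derivative can be obtained from a single adjoint run, this is how DUA trades many statistical samples for a handful of model executions, at the cost of a linearization assumption.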

  5. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Kirchner, G.; Peterson, R.

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. 
A failure to reproduce the time

  6. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Kirchner, G. [Univ. of Bremen (Germany); Peterson, R. [AECL, Chalk River, ON (Canada)] [and others

    1996-11-01

    variation between the best estimate predictions of the group. The assumptions of the users result in more uncertainty in the predictions (taking into account the 95% confidence intervals) than is shown by the confidence interval on the predictions of one user. Mistakes, being examples of incorrect user assumptions, cannot be ignored and must be accepted as contributing to the variability seen in the spread of predictions. The user's confidence in his/her understanding of a scenario description and/or confidence in working with a code does not necessarily mean that the predictions will be more accurate. Choice of parameter values contributed most to user-induced uncertainty followed by scenario interpretation. The contribution due to code implementation was low, but may have been limited due to the decision of the majority of the group not to submit predictions using the most complex of the three codes. Most modelers had difficulty adapting the models for certain expected output. Parameter values for wet and dry deposition, transfer from forage to milk and concentration ratios were mostly taken from the extensive database of Chernobyl fallout radionuclides, no matter what the scenario. Examples provided in the code manuals may influence code users considerably when preparing their own input files. A major problem concerns pasture concentrations given in fresh or dry weight: parameter values in codes have to be based on one or the other and the request for predictions in the scenario description may or may not be the same unit. This is a surprisingly common source of error. Most of the predictions showed order of magnitude discrepancies when best estimates are compared with the observations, although the participants had a highly professional background in radioecology and a good understanding of the importance of the processes modelled. When uncertainties are considered, however, mostly there was overlap between predictions and observations. 
A failure to reproduce the
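The fresh-weight versus dry-weight pitfall described in this record is easy to guard against with an explicit conversion step. A minimal sketch; the function name and the 20 % dry-matter fraction are illustrative assumptions, not values from the record:

```python
def fresh_to_dry(conc_fresh_bq_per_kg, dry_matter_fraction):
    """Convert an activity concentration from Bq/kg fresh weight to Bq/kg dry weight.

    The same activity sits in less mass once the water is removed, so the
    dry-weight concentration is higher by a factor of 1 / dry_matter_fraction.
    """
    if not 0.0 < dry_matter_fraction <= 1.0:
        raise ValueError("dry matter fraction must be in (0, 1]")
    return conc_fresh_bq_per_kg / dry_matter_fraction

# Illustrative: 100 Bq/kg fresh weight at 20 % dry matter is 500 Bq/kg dry
# weight, a factor-of-five difference of exactly the order-of-magnitude kind
# the intercomparison observed when units were mixed up.
print(fresh_to_dry(100.0, 0.2))  # -> 500.0
```

Making the unit explicit in the parameter name is a cheap defence against the surprisingly common error the record describes.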

  7. Uncertainties affecting fund collection, management and final utilisation

    Soederberg, Olof

    2006-01-01

    The paper presents, on a general level, major uncertainties in financing systems that aim at providing secure funding for future decommissioning costs. The perspective chosen is that of a fund collector/manager. The paper also describes how these uncertainties are dealt with within the Swedish financing system, particularly from the perspective of the Board of the Swedish Nuclear Waste Fund. It is concluded that existing uncertainties are a good reason not to postpone decommissioning activities to a distant future. This aspect is important even when countries have in place financing systems that were constructed to be robust against identified uncertainties. (author)

  8. Uncertainty modelling of critical column buckling for reinforced ...

    for columns, having major importance to a building's safety, are considered stability limits. ... Various research works have been carried out for uncertainty analysis in ... need appropriate material models, advanced structural simulation tools.

  9. Methodologies of Uncertainty Propagation Calculation

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and the practical difficulties of uncertainty propagation methodologies, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating variability with probability theory and lack of knowledge with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
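Chojnacki's two-track prescription can be illustrated with a toy calculation: treat the variable quantity with Monte Carlo sampling, and treat the poorly known quantity as a bare interval (the support of a fuzzy number at alpha = 0). The model, the interval bounds, and the distribution below are all invented for illustration:

```python
import random

random.seed(0)

def propagate(n_samples=10_000):
    """Two-tier propagation: Monte Carlo for variability, an interval for ignorance.

    Toy model y = a * x: x varies randomly across a population (variability,
    handled with probability theory), while a is only known to lie in
    [1.5, 2.5] (lack of knowledge, handled as an interval rather than a
    distribution). The answer is an interval of 95th percentiles, not a
    single falsely precise number.
    """
    a_low, a_high = 1.5, 2.5                                  # ignorance
    xs = [random.gauss(10.0, 1.0) for _ in range(n_samples)]  # variability
    idx = int(0.95 * n_samples)
    p95_low = sorted(a_low * x for x in xs)[idx]
    p95_high = sorted(a_high * x for x in xs)[idx]
    return p95_low, p95_high

lo, hi = propagate()
print(f"95th percentile of y lies somewhere in [{lo:.1f}, {hi:.1f}]")
```

Forcing a distribution onto `a` would collapse this interval to one number, which is exactly the illegitimate precision the abstract warns against.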

  10. LOFT uncertainty-analysis methodology

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  12. Uncertainties in Organ Burdens Estimated from PAS

    La Bone, T.R.

    2004-01-01

    To calculate committed effective dose equivalent, one needs to know the quantity of the radionuclide in all significantly irradiated organs (the organ burden) as a function of time following the intake. There are two major sources of uncertainty in an organ burden estimated from personal air sampling (PAS) data: (1) the uncertainty in going from the exposure measured with the PAS to the quantity of aerosol inhaled by the individual, and (2) the uncertainty in going from the intake to the organ burdens at any given time, taking into consideration the biological variability of the biokinetic models from person to person (inter-person variability) and in one person over time (intra-person variability). We have been using biokinetic modeling methods developed by researchers at the University of Florida to explore the impact of inter-person variability on the uncertainty of organ burdens estimated from PAS data. These initial studies suggest that the uncertainties are so large that PAS might be considered a qualitative (rather than quantitative) technique. These results indicate that more studies should be performed to properly classify the reliability and usefulness of PAS monitoring data for estimating organ burdens, organ dose, and ultimately CEDE.

  13. Uncertainty and sampling issues in tank characterization

    Liebetrau, A.M.; Pulsipher, B.A.; Kashporenko, D.M.

    1997-06-01

    A defensible characterization strategy must recognize that uncertainties are inherent in any measurement or estimate of interest and must employ statistical methods for quantifying and managing those uncertainties. Estimates of risk, and therefore key decisions, must incorporate knowledge about uncertainty. This report focuses on statistical methods that should be employed to ensure confident decision making and appropriate management of uncertainty. Sampling is a major source of uncertainty that deserves special consideration in the tank characterization strategy. The question of whether sampling will ever provide the reliable information needed to resolve safety issues is explored. The issue of sample representativeness must be resolved before sample information is reliable. Representativeness is a relative term but can be defined in terms of bias and precision. Currently, precision can be quantified and managed through an effective sampling and statistical analysis program. Quantifying bias is more difficult and is not being addressed under the current sampling strategies. Bias could be bounded by (1) employing new sampling methods that can obtain samples from other areas in the tanks, (2) putting in new risers on some worst-case tanks and comparing the results from existing risers with new risers, or (3) sampling tanks through risers under which no disturbance or activity has previously occurred. With some bound on bias and estimates of precision, various sampling strategies could be determined and shown to be either cost-effective or infeasible.
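The distinction the report draws between precision (quantifiable from the data) and bias (invisible to the data) can be made concrete in a few lines; the measurement values below are made up for illustration:

```python
import math

def sample_precision(values):
    """Quantify sampling precision as the standard error of the mean.

    Precision is visible in the data themselves; bias (a systematic offset
    between the sampled location and the rest of the tank) is not, which is
    why the report argues bias can only be bounded by changing *where*
    samples are taken, not by taking more of them.
    """
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var / n)                       # standard error

# Hypothetical analyte concentrations from repeated samples through one riser.
mean, se = sample_precision([10.2, 9.8, 10.5, 10.1, 9.9, 10.3])
print(f"estimate = {mean:.2f} +/- {se:.2f} (1 sigma)")
```

No matter how small `se` becomes with more samples, a biased riser shifts `mean` by an amount this calculation cannot see.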

  14. From Hiroshima to Chernobyl: epidemiological findings, uncertainties and perceptions

    Burkart, W.; Hendry, J.

    2003-01-01

    The effects of ionizing radiation on persons can be quantified along three major pathways. At present, it is not possible to fully understand the mechanistic principles of the interaction of ionizing radiation with the critical macromolecules (i.e. DNA), cell nuclei, cells, and body tissue, partly because of the stochastic methods that must be employed, the many factors influencing the situation, and the complex radiobiological mechanisms. Studies of the effects of radiation on animals allow, among other things, teratogenic changes and carcinogenicity to be studied as a function of many variables, such as radiation dose and dose rate. However, these findings can be extrapolated to the situation in humans only to a limited extent. Even tighter constraints apply to the extrapolation of possible effects of low doses. On the whole, these studies generate important findings. The most important findings about the health effects of ionizing radiation to this day have arisen from statistical correlations of radiation exposures with the incidence of diseases in exposed groups of the population. The article presents the main results of past and ongoing studies, and cites remaining uncertainties. These uncertainties especially relate to the effects of low radiation levels. In this field, substantial problems exist, among other things, in the interpretation of data against varying environmental factors and the resultant absence of uniform conditions for evaluation and extrapolation to the low-dose range. (orig.) [de]

  15. Uncertainty in ecological risk assessment: A statistician's view

    Smith, E.P.

    1995-01-01

    Uncertainty is a topic that has different meanings to researchers, modelers, managers and policy makers. The perspective of this presentation is the modeling view of uncertainty and its quantitative assessment. The goal is to provide some insight into how a statistician visualizes and addresses the issue of uncertainty in ecological risk assessment problems. In ecological risk assessment, uncertainty arises from many sources and is of different types depending on what is studied, where it is studied and how it is studied. Some major sources and their impact are described. A variety of quantitative approaches to modeling uncertainty are characterized and a general taxonomy is given. Examples of risk assessments of lake acidification, power plant impact assessment and the setting of standards for chemicals are used to discuss approaches to quantitative assessment of uncertainty and some of the potential difficulties.

  16. Do Orthopaedic Surgeons Acknowledge Uncertainty?

    Teunis, Teun; Janssen, Stein; Guitton, Thierry G.; Ring, David; Parisien, Robert

    2016-01-01

    Much of the decision-making in orthopaedics rests on uncertain evidence. Uncertainty is therefore part of our normal daily practice, and yet physician uncertainty regarding treatment could diminish patients' health. It is not known if physician uncertainty is a function of the evidence alone or if

  17. On the proper use of Ensembles for Predictive Uncertainty assessment

    Todini, Ezio; Coccia, Gabriele; Ortiz, Enrique

    2015-04-01

    Probabilistic forecasting has become popular in the last decades. Hydrological probabilistic forecasts have been based either on uncertainty processors (Krzysztofowicz, 1999; Todini, 2004; Todini, 2008) or on ensembles, following traditional meteorological approaches and the establishment of the HEPEX program (http://hepex.irstea.fr). Unfortunately, the direct use of ensembles as a measure of the predictive density is an incorrect practice, because the ensemble measures the spread of the forecast rather than, as the definition of predictive uncertainty requires, the probability of the future outcome conditional on the forecast. Only a few correct approaches are reported in the literature, which use the ensemble to estimate an expected conditional predictive density (Reggiani et al., 2009), similarly to what is done when several predictive models are available, as in the BMA (Raftery et al., 2005) or MCP (Todini, 2008; Coccia and Todini, 2011) approaches. A major problem limiting the correct use of ensembles is in fact the difficulty of defining the time dependence of the ensemble members, due to the lack of a consistent ranking: in other words, when dealing with multiple models, the ith model remains the ith model regardless of the time of forecast, while this does not happen when dealing with ensemble members, since there is no definition of the ith member of an ensemble. Nonetheless, the MCP approach (Todini, 2008; Coccia and Todini, 2011), essentially based on a multiple regression in the Normal space, can be easily extended to use ensembles to represent the locally (in time) smaller or larger conditional predictive uncertainty, as a function of the ensemble spread. This is done by modifying the classical linear regression equations, which imply perfectly observed predictors, into alternative regression equations similar to the Kalman filter ones, allowing for uncertain predictors.
In this way, each prediction in time accounts for both the predictive
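The point that ensemble spread is not the same thing as predictive uncertainty can be demonstrated with a toy hindcast experiment. The sketch below is not the published MCP method; it is a bare ordinary-least-squares stand-in on synthetic data (all numbers invented), showing that the density of the outcome conditional on the forecast can be far narrower than the raw ensemble spread:

```python
import random

random.seed(1)

# Synthetic hindcast archive: 500 past events, each with the true outcome and
# a 20-member ensemble that is biased (+5) and over-dispersive (member sd 8).
n, members = 500, 20
obs, ens_means = [], []
for _ in range(n):
    y = random.gauss(50.0, 10.0)                                # true outcome
    ens = [y + 5.0 + random.gauss(0.0, 8.0) for _ in range(members)]
    obs.append(y)
    ens_means.append(sum(ens) / members)

# Predictive density of the outcome *conditional on* the forecast:
# fit obs = a + b * ens_mean + eps by ordinary least squares.
mx = sum(ens_means) / n
my = sum(obs) / n
b = sum((x - mx) * (y - my) for x, y in zip(ens_means, obs)) / \
    sum((x - mx) ** 2 for x in ens_means)
a = my - b * mx
resid_var = sum((y - (a + b * x)) ** 2 for x, y in zip(ens_means, obs)) / (n - 2)

# The conditional sd (around 2) is far below the raw member spread (8): the
# spread alone would badly overstate the predictive uncertainty and hide the
# bias, which the fitted intercept absorbs instead.
print(f"conditional predictive sd = {resid_var ** 0.5:.2f}, member spread = 8")
```

The fitted slope and intercept calibrate the forecast; the residual variance, not the ensemble spread, is the predictive uncertainty in this simple setting.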

  18. Investment and uncertainty

    Greasley, David; Madsen, Jakob B.

    2006-01-01

    A severe collapse of fixed capital formation distinguished the onset of the Great Depression from other investment downturns between the world wars. Using a model estimated for the years 1890-2000, we show that the expected profitability of capital measured by Tobin's q, and the uncertainty surrounding expected profits indicated by share price volatility, were the chief influences on investment levels, and that heightened share price volatility played the dominant role in the crucial investment collapse in 1930. Investment did not simply follow the downward course of income at the onset...

  19. Optimization under Uncertainty

    Lopez, Rafael H.

    2016-01-06

    The goal of this poster is to present the main approaches to optimization of engineering systems in the presence of uncertainties. We begin by giving an insight into robust optimization. Next, we detail how to deal with probabilistic constraints in optimization, the so-called reliability-based design. Subsequently, we present the risk optimization approach, which includes the expected costs of failure in the objective function. After the basic description of each approach is given, the projects developed by CORE are presented. Finally, the main current topic of research of CORE is described.

  20. Optimizing production under uncertainty

    Rasmussen, Svend

    This Working Paper derives criteria for optimal production under uncertainty based on the state-contingent approach (Chambers and Quiggin, 2000), and discusses potential problems involved in applying the state-contingent approach in a normative context. The analytical approach uses the concept of state-contingent production functions and a definition of inputs including both sort of input, activity and allocation technology. It also analyses production decisions where production is combined with trading in state-contingent claims such as insurance contracts. The final part discusses...

  1. Commonplaces and social uncertainty

    Lassen, Inger

    2008-01-01

    This article explores the concept of uncertainty in four focus group discussions about genetically modified food. In the discussions, members of the general public interact with food biotechnology scientists while negotiating their attitudes towards genetic engineering. Their discussions offer an example of risk discourse in which the use of commonplaces seems to be a central feature (Myers 2004: 81). My analyses support earlier findings that commonplaces serve important interactional purposes (Barton 1999) and that they are used for mitigating disagreement, for closing topics and for facilitating...

  2. Principles of Uncertainty

    Kadane, Joseph B

    2011-01-01

    An intuitive and mathematical introduction to subjective probability and Bayesian statistics. An accessible, comprehensive guide to the theory of Bayesian statistics, Principles of Uncertainty presents the subjective Bayesian approach, which has played a pivotal role in game theory, economics, and the recent boom in Markov Chain Monte Carlo methods. Both rigorous and friendly, the book contains: Introductory chapters examining each new concept or assumption Just-in-time mathematics -- the presentation of ideas just before they are applied Summary and exercises at the end of each chapter Discus

  3. Mathematical Analysis of Uncertainty

    Angel GARRIDO

    2016-01-01

    Classical Logic showed early its insufficiencies for solving AI problems. The introduction of Fuzzy Logic aims at this problem. There has been research in the conventional Rough direction alone or in the Fuzzy direction alone, and more recently, attempts to combine both into Fuzzy Rough Sets or Rough Fuzzy Sets. We analyse some new and powerful tools in the study of uncertainty, such as Probabilistic Graphical Models, Chain Graphs, Bayesian Networks, and Markov Networks, integrating our knowledge of graphs and probability.

  4. Exploring the implication of climate process uncertainties within the Earth System Framework

    Booth, B.; Lambert, F. H.; McNeal, D.; Harris, G.; Sexton, D.; Boulton, C.; Murphy, J.

    2011-12-01

    Uncertainties in the magnitude of future climate change have been a focus of a great deal of research. Much of the work with General Circulation Models has focused on the atmospheric response to changes in atmospheric composition, while other processes remain outside these frameworks. Here we introduce an ensemble of new simulations, based on an Earth System configuration of HadCM3C, designed to explore uncertainties in both physical (atmospheric, oceanic and aerosol physics) and carbon cycle processes, using perturbed parameter approaches previously used to explore atmospheric uncertainty. Framed in the context of the climate response to future changes in emissions, the resultant future projections represent significantly broader uncertainty than existing concentration-driven GCM assessments. The systematic nature of the ensemble design enables interactions between components to be explored. For example, we show how metrics of physical processes (such as climate sensitivity) are also influenced by carbon cycle parameters. The suggestion from this work is that carbon cycle processes contribute as much to uncertainty in future climate projections as the more conventionally explored atmospheric feedbacks. The broad range of climate responses explored within these ensembles, rather than representing a reason for inaction, provides information on lower-likelihood but high-impact changes. For example, while the majority of these simulations suggest that future Amazon forest extent is resilient to the projected climate changes, a small number simulate dramatic forest dieback. This ensemble represents a framework to examine these risks, breaking them down into physical processes (such as ocean temperature drivers of rainfall change) and vegetation processes (where uncertainties point towards requirements for new observational constraints).

  5. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    de Roode, F.A.

    2014-01-01

    Uncertainty surrounding key parameters of financial markets, such as the inflation and equity risk premium, constitutes a major risk for institutional investors with long investment horizons. Hedging the investors' inflation exposure can be challenging due to the lack of domestic inflation-linked

  6. Investment, regulation, and uncertainty

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of uncertainty regarding the regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence, which ones might be expected to see an investment decline. PMID:24499745

  7. Probabilistic Mass Growth Uncertainties

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth-orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  8. Embracing uncertainty in applied ecology.

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  9. Oil price uncertainty in Canada

    Elder, John [Department of Finance and Real Estate, 1272 Campus Delivery, Colorado State University, Fort Collins, CO 80523 (United States); Serletis, Apostolos [Department of Economics, University of Calgary, Calgary, Alberta (Canada)

    2009-11-15

    Bernanke [Bernanke, Ben S. Irreversibility, uncertainty, and cyclical investment. Quarterly Journal of Economics 98 (1983), 85-106.] shows how uncertainty about energy prices may induce optimizing firms to postpone investment decisions, thereby leading to a decline in aggregate output. Elder and Serletis [Elder, John and Serletis, Apostolos. Oil price uncertainty.] find empirical evidence that uncertainty about oil prices has tended to depress investment in the United States. In this paper we assess the robustness of these results by investigating the effects of oil price uncertainty in Canada. Our results are remarkably similar to existing results for the United States, providing additional evidence that uncertainty about oil prices may provide another explanation for why the sharp oil price declines of 1985 failed to produce rapid output growth. Impulse-response analysis suggests that uncertainty about oil prices may tend to reinforce the negative response of output to positive oil shocks. (author)

  10. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.
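Of the alternative representations listed, interval analysis is the simplest to sketch in code. The margin calculation below is a hypothetical illustration (the capacity and load bounds are invented), showing how an interval propagates epistemic uncertainty without assuming any probability distribution:

```python
class Interval:
    """Minimal interval arithmetic for epistemic (lack-of-knowledge) uncertainty.

    An interval carries no probability structure: every value in [lo, hi] is
    merely *possible*. It is the coarsest of the representations discussed
    (interval analysis, possibility theory, evidence theory), but its bounds
    hold whatever the unknown distribution turns out to be.
    """
    def __init__(self, lo, hi):
        if lo > hi:
            raise ValueError("lower bound must not exceed upper bound")
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # Worst cases pair opposite endpoints.
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Hypothetical margin = capacity - load, each known only to within an interval.
capacity = Interval(90.0, 110.0)
load = Interval(60.0, 80.0)
margin = capacity - load
print(margin)             # [10.0, 50.0]
print(margin.lo > 0.0)    # True: a positive margin holds for every possible pair
```

In a QMU-style reading, `margin.lo > 0` is a guarantee over all epistemically possible values, at the price of bounds that are typically wider than a probabilistic treatment would give.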

  11. Methods for handling uncertainty within pharmaceutical funding decisions

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

    This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing, as we review here.

  12. Fossil human remains from Bolomor Cave (Valencia, Spain).

    Arsuaga, Juan Luis; Fernández Peris, Josep; Gracia-Téllez, Ana; Quam, Rolf; Carretero, José Miguel; Barciela González, Virginia; Blasco, Ruth; Cuartero, Felipe; Sañudo, Pablo

    2012-05-01

    Systematic excavations carried out since 1989 at Bolomor Cave have led to the recovery of four Pleistocene human fossil remains, consisting of a fibular fragment, two isolated teeth, and a nearly complete adult parietal bone. All of these specimens date to the late Middle and early Late Pleistocene (MIS 7-5e). The fibular fragment shows thick cortical bone, an archaic feature found in non-modern (i.e. non-Homo sapiens) members of the genus Homo. Among the dental remains, the lack of a midtrigonid crest in the M(1) represents a departure from the morphology reported for the majority of Neandertal specimens, while the large dimensions and pronounced shoveling of the marginal ridges in the C(1) are similar to other European Middle and late Pleistocene fossils. The parietal bone is very thick, with dimensions that generally fall above Neandertal fossils and resemble more closely the Middle Pleistocene Atapuerca (SH) adult specimens. Based on the presence of archaic features, all the fossils from Bolomor are attributed to the Neandertal evolutionary lineage. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Qualitative uncertainty analysis in probabilistic safety assessment context

    Apostol, M.; Constantin, M.; Turcu, I.

    2007-01-01

    In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis, by identifying major uncertainties in the PSA level 1 - level 2 interface and in the other two major procedural steps of a level 2 PSA, i.e. the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the isotope distribution in the CANDU NPP primary circuit and the isotope masses transferred into the containment, computed using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)

  14. Heisenberg's principle of uncertainty and the uncertainty relations

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently a renewed interest has appeared in reinterpreting and reformulating the precise meaning of Heisenberg's principle and finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs

  15. Uncertainty as Certainty

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia, spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  16. Orientation and uncertainties

    Peters, H.P.; Hennen, L.

    1990-01-01

    The authors report on the results of three representative surveys that made a closer inquiry into perceptions and valuations of information and information sources concerning Chernobyl. It turns out that the information sources are generally considered to be of little trustworthiness. This was generally attributable to the interpretation of the events being tied to attitudes on the atomic energy issue. The greatest credit was given to television broadcasting. The authors summarize their discourse as follows: there is good reason to interpret the widespread uncertainty after Chernobyl as proof of the fact that large parts of the population are prepared and willing to assume a critical stance towards information and prefer to draw their information from various sources representing different positions. (orig.) [de]

  17. DOD ELAP Lab Uncertainties

    2012-03-01

    ISO/IEC 17025; Inspection Bodies - ISO/IEC 17020; RMPs - ISO Guide 34 (Reference Material Producers) ... certify to: ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US Automotive), etc. DoD QSM 4.2 standard: ISO/IEC 17025:2005. Each has uncertainty ... IPV6, NLLAP, NEFAP TRAINING Programs; Certification Bodies - ISO/IEC 17021 (Accreditation for Management Systems)

  18. Traceability and Measurement Uncertainty

    Tosello, Guido; De Chiffre, Leonardo

    2004-01-01

    This report is made as a part of the project 'Metro-E-Learn: European e-Learning in Manufacturing Metrology', an EU project under the programme SOCRATES MINERVA (ODL and ICT in Education), Contract No. 101434-CP-1-2002-1-DE-MINERVA, coordinated by Friedrich-Alexander-University Erlangen... The project partnership (composed of 7 partners in 5 countries, thus covering a real European spread in high-tech production technology) aims to develop and implement an advanced e-learning system that integrates contributions from quite different disciplines into a user-centred approach... Later topics in the material include: 8. Machine tool testing; 9. The role of manufacturing metrology for QM; 10. Inspection planning; 11. Quality management of measurements incl. documentation; 12. Advanced manufacturing measurement technology. The present report represents section 2 - Traceability and Measurement Uncertainty - of the e-learning...

  19. Sustainability and uncertainty

    Jensen, Karsten Klint

    2007-01-01

    The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement. Another line (top-down) takes as its starting point an economic interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations. It then measures sustainability at the level of society... Even given a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable...

  20. Propagation of radar rainfall uncertainty in urban flood simulations

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    This work discusses the results of the implementation of a novel probabilistic system designed to improve ensemble sewer flow predictions for the drainage network of a small urban area in the North of England. The probabilistic system has been developed to model the uncertainty associated with radar rainfall estimates and propagate it through radar-based ensemble sewer flow predictions. The assessment of this system aims at outlining the benefits of addressing the uncertainty associated with radar rainfall estimates in a probabilistic framework, to be potentially implemented in the real-time management of the sewer network in the study area. Radar rainfall estimates are affected by uncertainty due to various factors [1-3], and quality control and correction techniques have been developed in order to improve their accuracy. However, the hydrological use of radar rainfall estimates and forecasts remains challenging. A significant effort has been devoted by the international research community to the assessment of uncertainty propagation through probabilistic hydro-meteorological forecast systems [4-5], and various approaches have been implemented for the purpose of characterizing the uncertainty in radar rainfall estimates and forecasts [6-11]. A radar-based ensemble stochastic approach, similar to the one implemented for use in the Southern Alps by the REAL system [6], has been developed for the purpose of this work. An ensemble generator has been calibrated on the basis of the spatial-temporal characteristics of the residual error in radar estimates, assessed with reference to rainfall records from around 200 rain gauges available for the year 2007, previously post-processed and corrected by the UK Met Office [12-13]. Each ensemble member is determined by summing a perturbation field to the unperturbed radar rainfall field. The perturbations are generated by imposing the radar error spatial and temporal correlation structure on purely stochastic fields.
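As a rough illustration of the general idea of imposing a spatial correlation structure on stochastic fields to build an ensemble, the sketch below perturbs a toy 1-D field; the grid, the exponential correlation model, the correlation length and the error magnitude are all illustrative assumptions, not parameters of the system described above.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy 1-D "radar field" of rainfall rates (mm/h); values are illustrative.
unperturbed = np.array([1.2, 1.5, 2.0, 2.4, 2.0, 1.1, 0.8, 0.5])
n = unperturbed.size

# Assumed exponential spatial correlation of the radar error,
# rho(d) = exp(-d / L), with correlation length L in grid cells.
L = 2.0
d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
corr = np.exp(-d / L)

# Impose the correlation structure on white noise via Cholesky factorization.
chol = np.linalg.cholesky(corr)
sigma = 0.3  # assumed error standard deviation (mm/h)

def make_ensemble(n_members):
    """Each member = unperturbed field + spatially correlated perturbation."""
    white = rng.standard_normal((n_members, n))
    perturbations = sigma * white @ chol.T
    return unperturbed + perturbations

ensemble = make_ensemble(500)
print(ensemble.shape)          # (500, 8)
print(ensemble.mean(axis=0))   # close to the unperturbed field
```

The same construction extends to 2-D fields and to temporal correlation by enlarging the correlation matrix accordingly.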

  1. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRA lacks formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  2. Perceptual uncertainty supports design reasoning

    Tseng, Winger S. W.

    2018-06-01

    The unstructured, ambiguous figures used as design cues in the experiment were classified as being at high, moderate, and low ambiguity. Participants were required to use the ideas suggested by the visual cues to design a novel table. Results showed that different levels of ambiguity within the cues significantly influenced the quantity of idea development of expert designers, but not novice designers, whose idea generation remained relatively low across all levels of ambiguity. For experts, as the level of ambiguity in the cue increased, so did the number of design ideas that were generated. Most design interpretations created by both experts and novices were affected by geometric contours within the figures. In addition, when viewing cues of high ambiguity, experts produced more interpretative transformations than when viewing cues of moderate or low ambiguity. Furthermore, experts produced significantly more new functions or meanings than novices. We claim that increased ambiguity within presented visual cues engenders uncertainty in designers that facilitates flexible transformations and interpretations and prevents premature commitment to uncreative solutions. Such results could be applied in design learning and education, focused on differences between experts and novices, to generalize the principles and strategies of interpretation that experts use during concept sketching and so train novices in facing design problems, and to the development of CACD tools to support designers.

  3. Essays on model uncertainty in financial models

    Li, Jing

    2018-01-01

    This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the

  4. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    Sparks, R. S.

    2009-12-01

    There are many sources of uncertainty in forecasting the areas that volcanic activity will affect and the severity of the effects. Uncertainties arise from: natural variability, inadequate data, biased data, incomplete data, lack of understanding of the processes, limitations of predictive models, ambiguity, and unknown unknowns. The description of volcanic hazards is thus necessarily probabilistic and requires assessment of the attendant uncertainties. Several issues arise from the probabilistic nature of volcanic hazards and the intrinsic uncertainties. Although zonation maps require well-defined boundaries for administrative pragmatism, such boundaries cannot divide areas that are completely safe from those that are unsafe. Levels of danger or safety need to be defined to decide on and justify boundaries through the concepts of vulnerability and risk. More data, better observations and improved models may reduce uncertainties, but they can also increase uncertainties and may lead to re-appraisal of zone boundaries. Probabilities inferred by statistical techniques are hard to communicate. Expert elicitation is an emerging methodology for risk assessment and uncertainty evaluation. The method has been applied at one major volcanic crisis (Soufrière Hills Volcano, Montserrat), and is being applied in planning for volcanic crises at Vesuvius.

  5. A new uncertainty importance measure

    Borgonovo, E.

    2007-01-01

    Uncertainty in parameters is present in many risk assessment problems and leads to uncertainty in model predictions. In this work, we introduce a global sensitivity indicator which looks at the influence of input uncertainty on the entire output distribution without reference to a specific moment of the output (moment independence) and which can be defined also in the presence of correlations among the parameters. We discuss its mathematical properties and highlight the differences between the present indicator, variance-based uncertainty importance measures, and a moment-independent sensitivity indicator previously introduced in the literature. Numerical results are discussed with application to the probabilistic risk assessment model on which Iman [A matrix-based approach to uncertainty and sensitivity analysis for fault trees. Risk Anal 1987;7(1):22-33] first introduced uncertainty importance measures.
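A moment-independent indicator of this flavour can be approximated by Monte Carlo with histogram density estimates. The sketch below is a rough illustration on a toy model, not the paper's own estimator; the slicing and binning choices are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def delta_estimate(x, y, n_slices=20, n_bins=30):
    """Rough histogram estimator of a moment-independent (delta-type)
    importance measure: 0.5 * E_X[ integral |f_Y - f_{Y|X}| dy ]."""
    edges = np.histogram_bin_edges(y, bins=n_bins)
    f_y, _ = np.histogram(y, bins=edges, density=True)
    widths = np.diff(edges)

    # Partition samples into equal-size slices of x; approximate the
    # conditional output density within each slice by a histogram.
    order = np.argsort(x)
    shift = 0.0
    for idx in np.array_split(order, n_slices):
        f_cond, _ = np.histogram(y[idx], bins=edges, density=True)
        shift += (idx.size / x.size) * np.sum(np.abs(f_y - f_cond) * widths)
    return 0.5 * shift

# Toy model: the output depends strongly on x1 and only weakly on x2.
n = 20_000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y = x1 + 0.1 * x2

d1, d2 = delta_estimate(x1, y), delta_estimate(x2, y)
print(d1, d2)  # d1 comes out much larger than d2
```

Because the estimator compares whole distributions rather than variances, it remains meaningful when the output is skewed or multimodal.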

  6. Uncertainty Management and Sensitivity Analysis

    Rosenbaum, Ralph K.; Georgiadis, Stylianos; Fantke, Peter

    2018-01-01

    Uncertainty is always there, and LCA is no exception. The presence of uncertainties of different types and from numerous sources in LCA results is a fact, but managing them allows us to quantify and improve the precision of a study and the robustness of its conclusions. LCA practice sometimes suffers from an imbalanced perception of uncertainties, justifying modelling choices and omissions. Identifying prevalent misconceptions around uncertainties in LCA is a central goal of this chapter, aiming to establish a positive approach focusing on the advantages of uncertainty management. The main objectives of this chapter are to learn how to deal with uncertainty in the context of LCA, how to quantify it, interpret and use it, and how to communicate it. The subject is approached more holistically than just focusing on relevant statistical methods or purely mathematical aspects. This chapter...

  7. Additivity of entropic uncertainty relations

    René Schwonnek

    2018-03-01

    We consider the uncertainty between two pairs of local projective measurements performed on a multipartite system. We show that the optimal bound in any linear uncertainty relation, formulated in terms of the Shannon entropy, is additive. This directly implies, against naive intuition, that the minimal entropic uncertainty can always be realized by fully separable states. Hence, in contradiction to proposals by other authors, no entanglement witness can be constructed solely by comparing the attainable uncertainties of entangled and separable states. However, our result gives rise to a huge simplification for computing global uncertainty bounds as they now can be deduced from local ones. Furthermore, we provide the natural generalization of the Maassen and Uffink inequality for linear uncertainty relations with arbitrary positive coefficients.
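The Maassen and Uffink inequality mentioned above is easy to check numerically. A minimal sketch for a qubit measured in two mutually unbiased bases (a standard textbook case, not an example from the paper):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Qubit state |0>, measured in the computational basis {|0>, |1>}
# and in the Hadamard basis {|+>, |->}: outcome probabilities |<b_j|psi>|^2.
p_computational = [1.0, 0.0]
p_hadamard = [0.5, 0.5]

h_sum = shannon_entropy(p_computational) + shannon_entropy(p_hadamard)

# Maassen-Uffink: H(A) + H(B) >= -2*log2(c), with basis overlap
# c = max_ij |<a_i|b_j>| = 1/sqrt(2) for these two bases.
bound = -2.0 * math.log2(1.0 / math.sqrt(2.0))

print(h_sum, bound)  # the 1-bit bound is saturated by this state
```

Saturation by the product state |0> illustrates the paper's point that minimal entropic uncertainty does not require entanglement.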

  8. Tolerance for uncertainty in elderly people

    KHRYSTYNA KACHMARYK

    2014-09-01

    Aim of the study. The aim of the paper is to compare tolerance for uncertainty in two groups of elderly people: students of the University of the Third Age (UTA) and older people who are not enrolled but help to educate their grandchildren. The relation to uncertainty has been shown to influence the decision-making strategies of the elderly, which indicates the importance of such research. Methods. To meet the objectives of the paper the following methods were used: 1) the Personal Change Readiness Survey (PCRS), adapted by Nickolay Bazhanov and Galina Bardiyer; 2) the Tolerance of Ambiguity Scale (TAS), adapted by Galina Soldatova; 3) the Freiburg Personality Inventory (FPI); and 4) the questionnaire of self-relation by Vladimir Stolin and Sergej Panteleev. 40 socially involved elderly people were investigated with the above methods, 20 from the UTA and 20 who were not studying and served as a control group. Results. Relations of tolerance to uncertainty in the group of UTA students were shown to differ substantially from those in the group of older people who do not study. The majority of UTA students have an inherently low tolerance for uncertainty, which is associated with more strongly expressed personality traits and characteristics in self-relation. The group of the elderly who are not enrolled shows greater tolerance of uncertainty, a focus on social and trusting relationships to meet communication needs, and a better ability to manage their own emotions and desires than the group of UTA students. Conclusions. The peculiarities of tolerance for uncertainty among students of the University of the Third Age were outlined. It was found that decision making in ambiguous situations concerning social interaction is well developed in the elderly who do not study. The students of the University of the Third Age have greater needs in...

  9. Uncertainty Assessments in Fast Neutron Activation Analysis

    W. D. James; R. Zeisler

    2000-01-01

    Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties of measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility.

  10. Neural Correlates of Intolerance of Uncertainty in Clinical Disorders.

    Wever, Mirjam; Smeets, Paul; Sternheim, Lot

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the neural correlates of intolerance of uncertainty. In conclusion, studies focusing on the neural correlates of this construct are sparse, and findings are inconsistent across disorders. Future research should identify neural correlates of intolerance of uncertainty in more detail. This may unravel the neurobiology of a wide variety of clinical disorders and pave the way for novel therapeutic targets.

  11. Decommissioning funding: ethics, implementation, uncertainties

    2006-01-01

    This status report on Decommissioning Funding: Ethics, Implementation, Uncertainties also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). The report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems. (authors)

  12. Chemical model reduction under uncertainty

    Najm, Habib; Galassi, R. Malpica; Valorani, M.

    2016-01-01

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  14. The Uncertainty of Measurement Results

    Ambrus, A. [Hungarian Food Safety Office, Budapest (Hungary)

    2009-07-15

    Factors affecting the uncertainty of measurement are explained, basic statistical formulae given, and the theoretical concept explained in the context of pesticide formulation analysis. Practical guidance is provided on how to determine individual uncertainty components within an analytical procedure. An extended and comprehensive table containing the relevant mathematical/statistical expressions elucidates the relevant underlying principles. Appendix I provides a practical elaborated example on measurement uncertainty estimation, above all utilizing experimental repeatability and reproducibility laboratory data. (author)

  15. Uncertainty analysis of environmental models

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.

  16. Generalized uncertainty principle and quantum gravity phenomenology

    Bosso, Pasquale

    The fundamental physical description of Nature is based on two mutually incompatible theories: Quantum Mechanics and General Relativity. Their unification in a theory of Quantum Gravity (QG) remains one of the main challenges of theoretical physics. Quantum Gravity Phenomenology (QGP) studies QG effects in low-energy systems. The basis of one such phenomenological model is the Generalized Uncertainty Principle (GUP), which is a modified Heisenberg uncertainty relation and predicts a deformed canonical commutator. In this thesis, we compute Planck-scale corrections to angular momentum eigenvalues, the hydrogen atom spectrum, the Stern-Gerlach experiment, and the Clebsch-Gordan coefficients. We then rigorously analyze the GUP-perturbed harmonic oscillator and study new coherent and squeezed states. Furthermore, we introduce a scheme for increasing the sensitivity of optomechanical experiments for testing QG effects. Finally, we suggest future projects that may potentially test QG effects in the laboratory.

  17. Uncertainty quantification in resonance absorption

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232 Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
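As a toy illustration of propagating beta-distributed input uncertainties through a nonlinear function by simulation, in the spirit of the study above: the functional form, the constant k, and the width ranges below are invented for illustration and are not the resonance model or data used in the paper.

```python
import math
import random
import statistics

random.seed(1)

def escape_probability(gamma_n, gamma_g, k=5.0):
    """Hypothetical toy stand-in for a resonance escape probability;
    the exponential form and k are illustrative only."""
    return math.exp(-k * gamma_n / (gamma_n + gamma_g))

def sample_width(mean, half_range, a=2.0, b=2.0):
    """Line width drawn from a beta(a, b) distribution
    rescaled to the interval mean +/- half_range."""
    return mean - half_range + 2.0 * half_range * random.betavariate(a, b)

samples = [
    escape_probability(
        sample_width(2.0e-3, 0.2e-3),    # neutron width (eV), illustrative
        sample_width(25.0e-3, 2.0e-3),   # radiation width (eV), illustrative
    )
    for _ in range(50_000)
]

mean_p = statistics.fmean(samples)
std_p = statistics.stdev(samples)
print(mean_p, std_p)  # spread in the widths maps to a spread in p_esc
```

The resulting sample set also yields the full output pdf, which is how per-resonance distributions like those shown in the paper can be obtained.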

  18. Reliability analysis under epistemic uncertainty

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
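The aleatory core of such a Monte Carlo reliability estimate can be sketched for a toy linear limit state, where an exact answer is available for comparison; the variables and numbers below are illustrative assumptions, not the paper's examples.

```python
import math
import random

random.seed(0)

# Limit state g = R - S (capacity minus demand); failure when g < 0.
# Illustrative aleatory variables: R ~ N(5, 1), S ~ N(3, 1), independent.
N = 200_000
failures = sum(
    1 for _ in range(N)
    if random.gauss(5.0, 1.0) - random.gauss(3.0, 1.0) < 0.0
)
pf_mc = failures / N

# Exact reference for this linear Gaussian case: g ~ N(2, sqrt(2)),
# so pf = Phi(-beta) with reliability index beta = 2 / sqrt(2).
beta = 2.0 / math.sqrt(2.0)
pf_exact = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))

print(pf_mc, pf_exact)  # both near 0.079
```

In the single-loop scheme described above, the epistemic variables would simply be sampled alongside R and S inside the same loop via the auxiliary-variable transform.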

  19. Simplified propagation of standard uncertainties

    Shull, A.H.

    1997-01-01

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software, and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined, or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing them into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Organization for Standardization (ISO) concepts for combining systematic and random uncertainties as published in its Guide to the Expression of Uncertainty in Measurement. Details of the simplified methods and examples of their use are included in the paper.
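The subgroup idea (absolute standard uncertainties combined in quadrature for sums and differences, relative ones for products and quotients) can be sketched in a few lines; the prepared-standard scenario and all numbers below are hypothetical, not taken from the paper.

```python
import math

def combine_absolute(*u):
    """Quadrature combination of absolute standard uncertainties
    for quantities that are added or subtracted."""
    return math.sqrt(sum(x * x for x in u))

def combine_relative(*u_rel):
    """Quadrature combination of relative standard uncertainties
    for quantities that are multiplied or divided."""
    return math.sqrt(sum(x * x for x in u_rel))

# Hypothetical prepared standard: net mass = gross - tare (absolute subgroup),
# concentration = net mass / volume (relative subgroup).
u_mass = combine_absolute(0.10, 0.10)             # mg, gross and tare weighings
net_mass = 250.0                                  # mg
u_conc_rel = combine_relative(u_mass / net_mass,  # mass contribution, relative
                              0.002)              # 0.2 % volume calibration
print(u_mass, u_conc_rel)
```

The two helper functions are all a spreadsheet needs to reproduce: one quadrature sum per subgroup, with the absolute result converted to relative form before joining the multiplicative subgroup.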

  20. Uncertainty Evaluation of Best Estimate Calculation Results

    Glaeser, H.

    2006-01-01

    Efforts are underway in Germany to perform analyses using best estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) recently issued a recommendation to perform uncertainty analysis in loss of coolant accident (LOCA) safety analyses. A more general requirement is included in a draft revision of the German Nuclear Regulation, an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the RSK recommendation, the following deterministic requirements still have to be applied when performing safety analyses for LOCA in licensing: most unfavourable single failure; unavailability due to preventive maintenance; break location; break size and break type; double ended break, 100 percent through 200 percent; large, medium and small break; loss of off-site power; core power (at accident initiation the most unfavourable conditions and values have to be assumed which may occur under normal operation, taking into account the set-points of integral power and power density control; measurement and calibration errors can be considered statistically); time of fuel cycle. Analysis using best estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant and fuel parameters and decay heat. This is especially the case when approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding the availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analysis have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  1. Estimating the measurement uncertainty in forensic blood alcohol analysis.

    Gullberg, Rod G

    2012-04-01

    For many reasons, forensic toxicologists are being asked to determine and report their measurement uncertainty in blood alcohol analysis. While understood conceptually, the elements and computations involved in determining measurement uncertainty are generally foreign to most forensic toxicologists. Several established and well-documented methods are available to determine and report the uncertainty in blood alcohol measurement. A straightforward bottom-up approach is presented that includes: (1) specifying the measurand, (2) identifying the major components of uncertainty, (3) quantifying the components, (4) statistically combining the components and (5) reporting the results. A hypothetical example is presented that employs reasonable estimates for forensic blood alcohol analysis assuming headspace gas chromatography. These computations are easily employed in spreadsheet programs as well. Determining and reporting measurement uncertainty is an important element in establishing fitness-for-purpose. Indeed, the demand for such computations and information from the forensic toxicologist will continue to increase.
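Steps (3)-(5) above, quantifying components, combining them statistically, and reporting the result, can be sketched as a quadrature combination with an expanded uncertainty at an assumed coverage factor; the component names and values below are hypothetical, not taken from the paper.

```python
import math

# Illustrative relative standard-uncertainty components for a headspace
# GC blood alcohol measurement (hypothetical values):
components = {
    "calibration": 0.010,
    "repeatability": 0.015,
    "recovery_matrix": 0.008,
}

# Combined relative standard uncertainty: root-sum-of-squares of components.
u_combined_rel = math.sqrt(sum(u * u for u in components.values()))

bac = 0.080   # g/100 mL, measured value (illustrative)
k = 2.0       # coverage factor for approximately 95 % coverage
expanded = k * u_combined_rel * bac

print(f"BAC = {bac:.3f} +/- {expanded:.4f} g/100 mL (k = {k})")
```

As the abstract notes, exactly this arithmetic is easily carried in a spreadsheet, one column per component and one cell for the quadrature sum.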

  2. The time course of attention modulation elicited by spatial uncertainty.

    Huang, Dan; Liang, Huilou; Xue, Linyan; Wang, Meijian; Hu, Qiyi; Chen, Yao

    2017-09-01

    Uncertainty regarding the target location is an influential factor for spatial attention. Modulation of spatial uncertainty can lead to adjustments in attention scope and variations in attention effects. Hence, investigating spatial uncertainty modulation is important for understanding the underlying mechanism of spatial attention. However, the temporal dynamics of this modulation remains unclear. To evaluate the time course of spatial uncertainty modulation, we adopted a Posner-like attention orienting paradigm with central or peripheral cues. Different numbers of cues were used to indicate the potential locations of the target and thereby manipulate the spatial uncertainty level. The time interval between the onsets of the cue and the target (stimulus onset asynchrony, SOA) varied from 50 to 2000 ms. We found that under central cueing, the effect of spatial uncertainty modulation could be detected from 200 to 2000 ms after the presence of the cues. Under peripheral cueing, the effect of spatial uncertainty modulation was observed from 50 to 2000 ms after cueing. Our results demonstrate that spatial uncertainty modulation produces robust and sustained effects on target detection speed. The time course of this modulation is influenced by the cueing method, which suggests that discrepant processing procedures are involved under different cueing conditions.

  3. Communicating uncertainty in hydrological forecasts: mission impossible?

    Ramos, Maria-Helena; Mathevet, Thibault; Thielen, Jutta; Pappenberger, Florian

    2010-05-01

    scenarios, is essential. We believe that the efficient communication of uncertainty in hydro-meteorological forecasts is not a mission impossible. Questions remaining unanswered in probabilistic hydrological forecasting should not neutralize the goal of such a mission, and the suspense kept should instead act as a catalyst for overcoming the remaining challenges.

  4. Sketching Uncertainty into Simulations.

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively setup complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  5. Uncertainty vs. Information (Invited)

    Nearing, Grey

    2017-04-01

    Information theory is the branch of logic that describes how rational epistemic states evolve in the presence of empirical data (Knuth, 2005), and any logic of science is incomplete without such a theory. Developing a formal philosophy of science that recognizes this fact results in essentially trivial solutions to several longstanding problems that are generally considered intractable, including: • Alleviating the need for any likelihood function or error model. • Derivation of purely logical falsification criteria for hypothesis testing. • Specification of a general quantitative method for process-level model diagnostics. More generally, I make the following arguments: 1. Model evaluation should not proceed by quantifying and/or reducing error or uncertainty, and instead should be approached as a problem of ensuring that our models contain as much information as our experimental data. I propose that the latter is the only question a scientist actually has the ability to ask. 2. Instead of building geophysical models as solutions to differential equations that represent conservation laws, we should build models as maximum entropy distributions constrained by conservation symmetries. This will allow us to derive predictive probabilities directly from first principles. Knuth, K. H. (2005) 'Lattice duality: The origin of probability and entropy', Neurocomputing, 67, pp. 245-274.

  6. Big data uncertainties.

    Maugis, Pierre-André G

    2018-07-01

    Big data-the idea that an ever-larger volume of information is being constantly recorded-suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusion obtained by statistical methods is increased when used on big data, either because of a systematic error (bias), or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but to develop tools and methods able to rigorously articulate the new risks therein.
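The second pitfall the abstract describes, weak but pervasive dependence inflating the variance of estimates, can be made concrete with a short sketch. The equicorrelation model and all numbers below are illustrative assumptions, not taken from the paper:

```python
# Sketch (not from the paper): how weak but pervasive dependence inflates
# the uncertainty of a sample mean even as the dataset grows "big".
# For n equicorrelated observations with variance sigma^2 and pairwise
# correlation rho, Var(mean) = sigma^2/n * (1 + (n - 1)*rho).

def var_of_mean(sigma2, n, rho):
    """Variance of the sample mean under equicorrelation rho."""
    return sigma2 / n * (1.0 + (n - 1) * rho)

n = 1_000_000
sigma2 = 1.0

iid = var_of_mean(sigma2, n, rho=0.0)         # vanishes as 1/n
dependent = var_of_mean(sigma2, n, rho=0.01)  # floor of roughly rho*sigma2

print(iid)        # 1e-06
print(dependent)  # ~0.01: a million samples behave like ~100 independent ones
```

Under this toy model, a dataset of a million weakly dependent observations carries no more information about the mean than about a hundred independent ones, which is exactly the increased-variance risk the abstract raises.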

  7. Uncertainty enabled Sensor Observation Services

    Cornford, Dan; Williams, Matthew; Bastin, Lucy

    2010-05-01

    Almost all observations of reality are contaminated with errors, which introduce uncertainties into the actual observation result. Such uncertainty is often held to be a data quality issue, and quantification of this uncertainty is essential for the principled exploitation of the observations. Many existing systems treat data quality in a relatively ad hoc manner; however, if the observation uncertainty is a reliable estimate of the error on the observation with respect to reality, then knowledge of this uncertainty enables optimal exploitation of the observations in further processes, or decision making. We would argue that the most natural formalism for expressing uncertainty is Bayesian probability theory. In this work we show how the Open Geospatial Consortium Sensor Observation Service can be implemented to enable the support of explicit uncertainty about observations. We show how the UncertML candidate standard is used to provide a rich and flexible representation of uncertainty in this context. We illustrate this on a data set of user contributed weather data where the INTAMAP interpolation Web Processing Service is used to help estimate the uncertainty on the observations of unknown quality, using observations with known uncertainty properties. We then go on to discuss the implications of uncertainty for a range of existing Open Geospatial Consortium standards including SWE common and Observations and Measurements. We discuss the difficult decisions in the design of the UncertML schema and its relation to and usage within existing standards, and show various options. We conclude with some indications of the likely future directions for UncertML in the context of Open Geospatial Consortium services.

  8. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)]

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce the significant conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations, so it is very important to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from the reference documents of a computer code. In this respect, more mathematical methods are needed to determine the uncertainty ranges reasonably. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is then chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered the user’s effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitation of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses such as the cladding temperature or pressure drop are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  9. Uncertainty analysis of power monitoring transit time ultrasonic flow meters

    Orosz, A.; Miller, D. W.; Christensen, R. N.; Arndt, S.

    2006-01-01

    A general uncertainty analysis is applied to chordal, transit time ultrasonic flow meters that are used in nuclear power plant feedwater loops. This investigation focuses on relationships between the major parameters of the flow measurement. For this study, mass flow rate is divided into three components: profile factor, density, and a form of volumetric flow rate. All system parameters are used to calculate values for these three components. Uncertainty is analyzed using a perturbation method. Sensitivity coefficients for major system parameters are shown, and these coefficients are applicable to a range of ultrasonic flow meters used in similar applications. Also shown is the uncertainty to be expected for density along with its relationship to other system uncertainties. One other conclusion is that pipe diameter sensitivity coefficients may be a function of the calibration technique used. (authors)
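The perturbation approach described in the abstract can be sketched in a few lines. The decomposition of mass flow rate into a profile factor, density, and volumetric flow rate follows the abstract, but the functional form and the numerical values are illustrative assumptions:

```python
# Hedged sketch of a perturbation-based sensitivity analysis. The model
# mdot = PF * rho * Q and all numbers are illustrative, not from the paper.

def mdot(pf, rho, q):
    """Mass flow rate as profile factor * density * volumetric flow rate."""
    return pf * rho * q

def perturb_sensitivity(f, args, i, eps=1e-6):
    """Dimensionless sensitivity d(ln f)/d(ln x_i) by small perturbation."""
    base = f(*args)
    bumped = list(args)
    bumped[i] *= (1.0 + eps)
    return (f(*bumped) - base) / base / eps

args = (0.97, 740.0, 1.2)        # profile factor, density [kg/m^3], Q [m^3/s]
rel_u = (0.002, 0.001, 0.003)    # assumed relative standard uncertainties

coeffs = [perturb_sensitivity(mdot, args, i) for i in range(3)]
u_rel = sum((c * u) ** 2 for c, u in zip(coeffs, rel_u)) ** 0.5
# For a pure product all sensitivity coefficients are ~1, so the relative
# uncertainties combine in root-sum-of-squares: u_rel ~ sqrt(sum u_i^2).
print(coeffs, u_rel)
```

A real chordal-meter analysis would include many more parameters (path lengths, angles, transit times, pipe diameter), but the propagation mechanics are the same.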

  10. New Evidence Links Stellar Remains to Oldest Recorded Supernova

    2006-09-01

    Recent observations have uncovered evidence that helps to confirm the identification of the remains of one of the earliest stellar explosions recorded by humans. The new study shows that the supernova remnant RCW 86 is much younger than previously thought. As such, the formation of the remnant appears to coincide with a supernova observed by Chinese astronomers in 185 A.D. The study used data from NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton Observatory. "There have been previous suggestions that RCW 86 is the remains of the supernova from 185 A.D.," said Jacco Vink of University of Utrecht, the Netherlands, and lead author of the study. "These new X-ray data greatly strengthen the case." When a massive star runs out of fuel, it collapses on itself, creating a supernova that can outshine an entire galaxy. The intense explosion hurls the outer layers of the star into space and produces powerful shock waves. The remains of the star and the material it encounters are heated to millions of degrees and can emit intense X-ray radiation for thousands of years. In their stellar forensic work, Vink and colleagues studied the debris in RCW 86 to estimate when its progenitor star originally exploded. They calculated how quickly the shocked, or energized, shell is moving in RCW 86, by studying one part of the remnant. They combined this expansion velocity with the size of the remnant and a basic understanding of how supernovas expand to estimate the age of RCW 86. "Our new calculations tell us the remnant is about 2,000 years old," said Aya Bamba, a coauthor from the Institute of Physical and Chemical Research (RIKEN), Japan. "Previously astronomers had estimated an age of 10,000 years." The younger age for RCW 86 may explain an astronomical event observed almost 2,000 years ago. In 185 A.D., Chinese astronomers (and possibly the Romans) recorded the appearance of a new

  11. Scientific uncertainties and climate risks

    Petit, M.

    2005-01-01

    Human activities have induced a significant change in the Earth's atmospheric composition and, most likely, this trend will increase throughout the coming decades. During the last decades, the mean temperature has actually increased by the expected amount. Moreover, the geographical distribution of the warming, and day-to-night temperature variation have evolved as predicted. The magnitude of those changes is relatively small for the time being, but is expected to increase alarmingly during the coming decades. Greenhouse warming is a representative example of the problems of sustainable development: long-term risks can be estimated on a rational basis from scientific laws alone, but the non-specialist is generally not prepared to understand the steps required. However, even the non-specialist has obviously the right to decide about his way of life and the inheritance that he would like to leave for his children, but it is preferable that he is fully informed before making his decisions. Dialog, mutual understanding and confidence must prevail between Science and Society to avoid irrational actions. Controversy among experts is quite frequent. In the case of greenhouse warming, a commendable collective expertise has drastically reduced possible confusion. The Intergovernmental Panel on Climate Change was created jointly by the World Meteorology Organization (WMO) and the UN Program for the Environment (UNEP). Its reports evaluate the state of knowledge on past and future global climate changes, their impact, and the possibility of controlling anthropogenic emissions. The main targeted readers are, nevertheless, non-specialists, who should be made aware of results deduced from approaches that they may not be able to follow step by step. Moreover, these results, in particular, future projections, are, and will remain, subject to some uncertainty, which a fair description of the state of knowledge must include. Many misunderstandings between writers and readers can

  12. A commentary on model uncertainty

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed

  13. Mama Software Features: Uncertainty Testing

    Ruggiero, Christy E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]; Porter, Reid B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  14. Designing for Uncertainty: Three Approaches

    Bennett, Scott

    2007-01-01

    Higher education wishes to get long life and good returns on its investment in learning spaces. Doing this has become difficult because rapid changes in information technology have created fundamental uncertainties about the future in which capital investments must deliver value. Three approaches to designing for this uncertainty are described…

  15. Realising the Uncertainty Enabled Model Web

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  16. The Uncertainty Test for the MAAP Computer Code

    Park, S. H.; Song, Y. M.; Park, S. Y.; Ahn, K. I.; Kim, K. R.; Lee, Y. J.

    2008-01-01

    After the Three Mile Island Unit 2 (TMI-2) and Chernobyl accidents, severe-accident safety issues have been treated in various aspects. Major issues in our research include the level 2 PSA. The difficulty in expanding the level 2 PSA as a risk-informed activity is the uncertainty. Past effort has emphasized improving the quality of the internal-events PSA, but that effort is insufficient to reduce the phenomenological uncertainty in the level 2 PSA. In our country, the degree of uncertainty in level 2 PSA models is high, and it is necessary to secure a model that reduces this uncertainty. We do not yet have experience with uncertainty assessment technology, and the assessment systems themselves depend on advanced nations. In advanced nations, severe accident simulators are implemented at the hardware level, whereas in our case only basic functions at the software level can be implemented. Under these circumstances, similar instances at home and abroad, such as UQM and MELCOR, were surveyed. Referring to these instances, the SAUNA (Severe Accident UNcertainty Analysis) system is being developed in our project to assess and reduce the uncertainty in the level 2 PSA. The MAAP code was selected to analyze the uncertainty in a severe accident.

  17. One Approach to the Fire PSA Uncertainty Analysis

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Experienced practical events and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has high relative importance for nuclear power plant safety. Fire PSA is a very challenging phenomenon and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct uncertainty analysis. At the top level, uncertainty of the fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second uncertainty segment is connected to the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to the core damage or other analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, usage of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation to the final result is presented. (author)
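The three-segment decomposition the paper describes can be illustrated with a minimal Monte Carlo propagation sketch. The lognormal distributions and every parameter value below are invented for illustration and are not taken from the study:

```python
import math
import random

# Illustrative propagation of the three uncertainty segments named in the
# abstract: fire frequency, fire-induced damage, and the PSA model that
# carries damage through to core damage. All distributions are invented.

random.seed(0)
cdf_samples = []
for _ in range(50_000):
    freq = random.lognormvariate(math.log(1e-2), 0.5)                 # fires / year
    p_damage = min(1.0, random.lognormvariate(math.log(0.05), 0.7))   # damage given fire
    p_cd_given_damage = min(1.0, random.lognormvariate(math.log(1e-3), 0.9))
    cdf_samples.append(freq * p_damage * p_cd_given_damage)           # core damage freq.

cdf_samples.sort()
median = cdf_samples[len(cdf_samples) // 2]
p95 = cdf_samples[int(0.95 * len(cdf_samples))]
# The spread between the median and the 95th percentile shows how the three
# uncertainty segments compound in the final risk estimate.
print(median, p95)
```

With these assumed parameters the 95th percentile sits several times above the median, which is the kind of spread an uncertainty analysis makes visible and a point estimate hides.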

  18. Method and apparatus to predict the remaining service life of an operating system

    Greitzer, Frank L.; Kangas, Lars J.; Terrones, Kristine M.; Maynard, Melody A.; Pawlowski, Ronald A.; Ferryman, Thomas A.; Skorpik, James R.; Wilson, Bary W.

    2008-11-25

    A method and computer-based apparatus for monitoring the degradation of, predicting the remaining service life of, and/or planning maintenance for, an operating system are disclosed. Diagnostic information on degradation of the operating system is obtained through measurement of one or more performance characteristics by one or more sensors onboard and/or proximate the operating system. Though not required, it is preferred that the sensor data are validated to improve the accuracy and reliability of the service life predictions. The condition or degree of degradation of the operating system is presented to a user by way of one or more calculated, numeric degradation figures of merit that are trended against one or more independent variables using one or more mathematical techniques. Furthermore, more than one trendline and uncertainty interval may be generated for a given degradation figure of merit/independent variable data set. The trendline(s) and uncertainty interval(s) are subsequently compared to one or more degradation figure of merit thresholds to predict the remaining service life of the operating system. The present invention enables multiple mathematical approaches in determining which trendline(s) to use to provide the best estimate of the remaining service life.
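The trending-to-threshold idea in this patent can be sketched in a few lines. The degradation data, the choice of a linear trendline, and the threshold below are all hypothetical, and the uncertainty intervals around the trendline that the patent describes are omitted for brevity:

```python
import numpy as np

# Illustrative sketch: trend a degradation figure of merit against operating
# time and extrapolate to a failure threshold. Data and threshold invented.

hours = np.array([0., 100., 200., 300., 400., 500.])
fom = np.array([1.00, 0.96, 0.93, 0.89, 0.85, 0.82])  # degradation figure of merit
threshold = 0.70                                      # assumed end-of-life criterion

slope, intercept = np.polyfit(hours, fom, 1)          # linear trendline
t_fail = (threshold - intercept) / slope              # trend crosses threshold here
remaining = t_fail - hours[-1]                        # remaining service life

print(remaining)   # hours of service life remaining under a linear trend
```

The patent's "multiple mathematical approaches" would correspond to fitting several candidate trendlines (linear, exponential, etc.), attaching uncertainty intervals to each, and selecting the one giving the best estimate.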

  19. Measurement uncertainty: Friend or foe?

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself.
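Governing the uncertainty contributed by each step of a traceability chain can be sketched with a GUM-style root-sum-of-squares combination of independent contributions. The step names and uncertainty values here are hypothetical:

```python
import math

# Minimal sketch of combining the standard uncertainty contributed by each
# step of a traceability chain (GUM-style, independent contributions).
# Step names and values are hypothetical, e.g. expressed in %.

chain = {
    "primary reference material": 0.8,
    "reference measurement procedure": 1.0,
    "manufacturer working calibrator": 1.2,
    "end-user measuring system": 1.5,
}

u_combined = math.sqrt(sum(u ** 2 for u in chain.values()))
U_expanded = 2.0 * u_combined   # coverage factor k = 2, ~95 % coverage

print(round(u_combined, 2), round(U_expanded, 2))
```

The point the abstract makes falls out of the arithmetic: because contributions add in quadrature, the largest single term dominates, so reducing uncertainty at the top of the chain or in the commercial system is where the leverage is.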

  20. Model uncertainty in safety assessment

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  1. Model uncertainty in safety assessment

    Pulkkinen, U; Huovinen, T [VTT Automation, Espoo (Finland). Industrial Automation]

    1996-01-01

    The uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)

  2. Global impact of uncertainties in China’s gas market

    Xunpeng, Shi; Variam, Hari Malamakkavu Padinjare; Tao, Jacqueline

    2017-01-01

    This paper examines the uncertainties in Chinese gas markets, analyzes the reasons, and quantifies their impact on the world gas market. A literature review found significant variability among the outlooks on China's gas sector. Further assessment found that uncertainties in economic growth, structural change in markets, environmental regulations, price and institutional changes contribute to the uncertainties. The analysis of China’s demand and supply uncertainties with a world gas-trading model found significant changes in global production, trade patterns and spot prices, with pipeline exporters being most affected. China's domestic production and pipeline imports from Central Asia are the major buffers that can offset much of the uncertainties. The study finds an asymmetry: pipeline imports respond to China's uncertainties in both low and high demand scenarios, while LNG imports respond only to the high demand scenario. The major reasons are higher TOP levels and the current practice of importing only up to the minimum TOP levels for LNG, as well as a lack of liberalized gas markets. The study shows that it is necessary to create LNG markets that can respond to market dynamics, through either a reduction of TOP levels or a change of pricing mechanisms to hub indexation. - Highlights: • Economic growth, regulations, reforms and shale gas cause the uncertainties. • Pipeline exporters to China and Southeast Asian and Australian LNG exporters are affected the most. • China’s domestic production and pipe imports offset much of the uncertainties. • Pipeline imports respond to China’s uncertainties in both low and high demand. • LNG imports respond only to the high demand scenario.

  3. Model uncertainty: Probabilities for models?

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising
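The "probabilities for models" idea can be illustrated with a minimal Bayesian update over a set of mutually exclusive, exhaustive models. The priors, likelihoods, and per-model risk estimates are invented for the example:

```python
# Toy illustration of assigning probabilities to models: start from prior
# model probabilities, update with the likelihood each model gives the
# observed data, then model-average a risk estimate. All numbers invented.

priors = {"model_A": 0.5, "model_B": 0.3, "model_C": 0.2}
likelihoods = {"model_A": 0.10, "model_B": 0.40, "model_C": 0.05}  # p(data | model)

evidence = sum(priors[m] * likelihoods[m] for m in priors)
posteriors = {m: priors[m] * likelihoods[m] / evidence for m in priors}

# Model-averaged prediction: weight each model's risk estimate by its posterior
risk = {"model_A": 1e-5, "model_B": 3e-5, "model_C": 8e-6}   # per-model estimates
averaged = sum(posteriors[m] * risk[m] for m in priors)
print(posteriors, averaged)
```

The interpretation question the commentary raises is exactly what these posteriors mean when the model set is not truly exhaustive, i.e. when only one reference model is available.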

  4. Pragmatic aspects of uncertainty propagation: A conceptual review

    Thacker, W. Carlisle; Iskandarani, Mohamad; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar

    2015-01-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.

  5. Pragmatic aspects of uncertainty propagation: A conceptual review

    Thacker, W. Carlisle

    2015-09-11

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
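The surrogate strategy these abstracts review can be sketched as follows, with a cheap stand-in for the costly model and a polynomial interpolant (Gaussian process interpolation would swap in at the same point). Everything here is illustrative, not from the paper:

```python
import numpy as np

# Sketch of the surrogate idea: run a few "expensive" simulations at chosen
# nodes, interpolate the response, then sample the cheap interpolant to
# estimate the response's variability. The model below is a toy stand-in.

def expensive_model(x):                    # placeholder for a costly simulation
    return np.sin(x) + 0.1 * x ** 2

nodes = np.linspace(-1.0, 1.0, 5)          # only 5 simulations are actually run
runs = expensive_model(nodes)
coeffs = np.polyfit(nodes, runs, 4)        # degree-4 polynomial interpolant

rng = np.random.default_rng(0)
inputs = rng.normal(0.0, 0.3, 10_000)      # assumed input uncertainty
surrogate = np.polyval(coeffs, inputs)     # thousands of nearly free evaluations

print(surrogate.mean(), surrogate.std())   # cheap estimate of response spread
```

The choice of the 5 nodes is exactly the question the abstract raises: it should reflect both the interpolation scheme and what is known about the input distribution (here, nodes far outside ±3 standard deviations would be wasted runs).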

  6. Decision-making under great uncertainty

    Hansson, S.O.

    1992-01-01

    Five types of decision-uncertainty are distinguished: uncertainty of consequences, of values, of demarcation, of reliance, and of co-ordination. Strategies are proposed for each type of uncertainty. The general conclusion is that it is meaningful for decision theory to treat cases with greater uncertainty than the textbook case of 'decision-making under uncertainty'. (au)

  7. To be certain about the uncertainty: Bayesian statistics for 13C metabolic flux analysis.

    Theorell, Axel; Leweke, Samuel; Wiechert, Wolfgang; Nöh, Katharina

    2017-11-01

    13C Metabolic Flux Analysis (13C MFA) remains the most powerful approach to determine intracellular metabolic reaction rates. Decisions on strain engineering and experimentation heavily rely upon the certainty with which these fluxes are estimated. For uncertainty quantification, the vast majority of 13C MFA studies relies on confidence intervals from the paradigm of Frequentist statistics. However, it is well known that the confidence intervals for a given experimental outcome are not uniquely defined. As a result, confidence intervals produced by different methods can be different, but nevertheless equally valid. This is of high relevance to 13C MFA, since practitioners regularly use three different approximate approaches for calculating confidence intervals. By means of a computational study with a realistic model of the central carbon metabolism of E. coli, we provide strong evidence that confidence intervals used in the field depend strongly on the technique with which they were calculated and, thus, their use leads to misinterpretation of the flux uncertainty. In order to provide a better alternative to confidence intervals in 13C MFA, we demonstrate that credible intervals from the paradigm of Bayesian statistics give more reliable flux uncertainty quantifications which can be readily computed with high accuracy using Markov chain Monte Carlo. In addition, the widely applied chi-square test, as a means of testing whether the model reproduces the data, is examined more closely.
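The contrast the paper draws can be illustrated with a toy Metropolis sampler computing a credible interval for a single "flux" with Gaussian measurement error. The data, prior, and tuning constants are invented and vastly simpler than a real 13C MFA model:

```python
import math
import random

# Toy illustration (not the paper's E. coli model): a Bayesian credible
# interval for one "flux" computed by random-walk Metropolis MCMC.

data = [2.1, 1.9, 2.3, 2.0, 2.2]    # invented labeling measurements
sigma = 0.2                          # assumed measurement error

def log_post(flux):                  # flat prior on flux > 0
    if flux <= 0:
        return -math.inf
    return -sum((y - flux) ** 2 for y in data) / (2 * sigma ** 2)

random.seed(1)
flux, samples = 1.0, []
for _ in range(20_000):
    prop = flux + random.gauss(0.0, 0.1)              # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(flux):
        flux = prop                                    # accept
    samples.append(flux)

samples = sorted(samples[2_000:])                      # drop burn-in
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
print(f"95% credible interval: [{lo:.2f}, {hi:.2f}]")
```

Unlike the three approximate frequentist constructions the paper compares, the credible interval has a single definition for a given prior and likelihood; the practical question is only whether the chain has been run long enough to resolve it accurately.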

  8. Risk in technical and scientific studies: general introduction to uncertainty management and the concept of risk

    Apostolakis, G.E.

    2004-01-01

    George Apostolakis (MIT) presented an introduction to the concept of risk and uncertainty management and their use in technical and scientific studies. He noted that Quantitative Risk Assessment (QRA) provides support to the overall treatment of a system as an integrated socio-technical system. Specifically, QRA aims to answer the questions: - What can go wrong (e.g., accident sequences or scenarios)? - How likely are these sequences or scenarios? - What are the consequences of these sequences or scenarios? The Quantitative Risk Assessment deals with two major types of uncertainty. An assessment requires a 'model of the world', and this preferably would be a deterministic model based on underlying processes. In practice, there are uncertainties in this model of the world relating to variability or randomness that cannot be accounted for directly in a deterministic model and that may require a probabilistic or aleatory model. Both deterministic and aleatory models of the world have assumptions and parameters, and there are 'state-of-knowledge' or epistemic uncertainties associated with these. Sensitivity studies or eliciting expert opinion can be used to address the uncertainties in assumptions, and the level of confidence in parameter values can be characterised using probability distributions (pdfs). Overall, the distinction between aleatory and epistemic uncertainties is not always clear, and both can be treated mathematically in the same way. Lessons on safety assessments that can be learnt from experience at nuclear power plants are that beliefs about what is important can be wrong if a risk assessment is not performed. Also, precautionary approaches are not always conservative if failure modes are not identified. 
Nevertheless, it is important to recognize that uncertainties will remain despite a quantitative risk assessment: e.g., is the scenario list complete, are the models accepted as reasonable, and are parameter probability distributions representative of
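The aleatory/epistemic split described in this record is often implemented as a two-level Monte Carlo: sample the state of knowledge, then evaluate the aleatory model for each sample. A minimal sketch with hypothetical numbers (the lognormal failure rate, its parameters and the mission time are illustrative assumptions):

```python
import math
import random
import statistics

rng = random.Random(0)
T = 10.0  # mission time (years), hypothetical

# Epistemic layer: state-of-knowledge pdf for an uncertain failure rate lam,
# here a lognormal centred on 0.1 per year.
epistemic = [math.exp(rng.gauss(math.log(0.1), 0.5)) for _ in range(5000)]

# Aleatory layer: for each sampled lam, probability of at least one failure
# in time T under a Poisson failure process.
p_fail = sorted(1.0 - math.exp(-lam * T) for lam in epistemic)

mean_p = statistics.fmean(p_fail)
p05 = p_fail[int(0.05 * len(p_fail))]
p95 = p_fail[int(0.95 * len(p_fail))]
# The band (p05, p95) expresses epistemic uncertainty about the aleatory risk
```

The output is not a single risk number but a distribution over risk, which is exactly how the two uncertainty types can "be treated mathematically in the same way" while remaining conceptually distinct.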

  9. Uncertainties in risk assessment at USDOE facilities

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms 'risk assessment' and 'risk management' are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties..." in an assessment. Significant data and uncertainties are "...those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  10. Managing Uncertainty for an Integrated Fishery

    MB Hasan

    2012-06-01

    This paper investigates ways to deal with the uncertainties in fishing trawler scheduling and production planning in a quota-based integrated commercial fishery. A commercial fishery faces uncertainty mainly from variation in catch rate, which may be due to weather and other environmental factors. The firm tries to manage this uncertainty through planned co-ordination of fishing trawler scheduling, catch quota, processing and labour allocation, and inventory control. Scheduling must necessarily be done over some finite planning horizon, and the trawler schedule itself introduces man-made variability, which in turn induces inventory in the processing plant. This induced inventory must be managed, complicated by the inability to plan easily beyond the current planning horizon. We develop a surprisingly simple innovation in inventory management, which we have not seen in other papers on production management: that of requiring beginning inventory to equal ending inventory. This tool gives management a way to calculate a profit-maximizing safety stock that counteracts the man-made variability due to the trawler scheduling. We found that, with inventory, the variability of catch rate had virtually no effect on profitability. We report numerical results for several planning horizon models, based on data for a major New Zealand fishery.
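The "beginning inventory equals ending inventory" device can be sketched in a few lines. The weekly catches below are hypothetical; the processing rate is set to the mean catch, so ending inventory equals beginning inventory by construction, and the safety stock is the deepest cumulative shortfall:

```python
# Hypothetical weekly catches (tonnes) from a trawler schedule
catches = [60, 80, 150, 120, 60, 110, 90, 130]
rate = sum(catches) / len(catches)  # constant processing rate, 100 t/week

def min_safety_stock(catches, rate):
    inv, worst = 0.0, 0.0
    for c in catches:
        inv += c - rate          # net change in inventory this week
        worst = min(worst, inv)  # deepest cumulative shortfall so far
    return -worst                # beginning stock needed to never run out

safety_stock = min_safety_stock(catches, rate)
```

A profit-maximizing model would trade this holding cost against stockout cost; the sketch only shows how the schedule's man-made variability translates into a required stock level.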

  12. Nanomedicine: Governing uncertainties

    Trisolino, Antonella

    Nanomedicine is a promising and revolutionary field for improving medical diagnosis and therapy, leading to a higher quality of life for everybody. Huge benefits are expected from nanomedicine applications in the diagnostic and therapeutic fields. However, nanomedicine poses several issues concerning risks to human health. This thesis aims to defend a perspective of risk governance that sustains the scientific knowledge process by developing guidelines and providing the minimum safety standards acceptable to protect human health. Although nanomedicine is at an early stage of its discovery, some cautious measures are required to provide regulatory mechanisms able to respond to the unique set of challenges associated with nanomedicine. Nanotechnology offers a unique opportunity to intensify a major interplay between different disciplines such as science and law. This multidisciplinary approach can positively contribute to finding reliable regulatory choices and responsive normative tools for dealing with the challenges of novel technologies.

  13. The Uncertainties of Risk Management

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged… for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself…

  14. Climate Projections and Uncertainty Communication.

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  15. Relational uncertainty in service dyads

    Kreye, Melanie

    2017-01-01

    …in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected… Findings: …resolving the relational uncertainty increased the functional quality while resolving the partner’s organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.

  16. Advanced LOCA code uncertainty assessment

    Wickett, A.J.; Neill, A.P.

    1990-11-01

    This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A 'dials' version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)

  17. How to live with uncertainties?

    Michel, R.

    2012-01-01

    In a short introduction, the problem of uncertainty as a general consequence of incomplete information is addressed, as well as the approach to quantifying uncertainty in metrology. A short history of the more than 30 years of the working group AK SIGMA is followed by an appraisal of its achievements to date. The potential future of the AK SIGMA is then discussed on the basis of its current tasks, open scientific questions and future topics. (orig.)

  18. Some remarks on modeling uncertainties

    Ronen, Y.

    1983-01-01

    Several topics related to the question of modeling uncertainties are considered. The first topic is the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is the question of the limit to accuracy and how to establish its value. (orig.)

  19. Uncertainty analysis in safety assessment

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses the treatment of uncertainty in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  20. Propagation of dynamic measurement uncertainty

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc.) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously to, but distinct from, the way stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result.
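The 'downward' direction can be sketched by Monte Carlo: push the uncertainty of an identified dynamic model through to the measurand. The first-order sensor model y(t) = k·(1 − exp(−t/τ)) for a unit step and all parameter values are illustrative assumptions, not the paper's experiment:

```python
import math
import random
import statistics

rng = random.Random(1)

# Identified model parameters with (hypothetical) standard uncertainties
k_mean, k_u = 1.00, 0.01      # static gain
tau_mean, tau_u = 0.20, 0.02  # time constant, seconds
t = 0.5                       # time at which the measurand is evaluated

draws = []
for _ in range(10000):
    k = rng.gauss(k_mean, k_u)
    tau = rng.gauss(tau_mean, tau_u)
    draws.append(k * (1.0 - math.exp(-t / tau)))  # step response at time t

y_mean = statistics.fmean(draws)
y_u = statistics.stdev(draws)  # standard uncertainty of the dynamic measurand
```

The 'upward' step, estimating (k, τ) and their uncertainties from a calibration record, is the system-identification problem the abstract refers to; this sketch only covers what happens once that model is in hand.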

  1. Optimal Taxation under Income Uncertainty

    Xianhua Dai

    2011-01-01

    Optimal taxation under income uncertainty has been extensively developed in expected utility theory, but it remains open for utility functions in which income and effort are inseparable. As an alternative for decision-making under uncertainty, prospect theory (Kahneman and Tversky (1979), Tversky and Kahneman (1992)) has obtained empirical support, for example in Kahneman and Tversky (1979) and Camerer and Lowenstein (2003). This paper begins to explore optimal taxation in the context of prospect...

  2. New Perspectives on Policy Uncertainty

    Hlatshwayo, Sandile

    2017-01-01

    In recent years, the ubiquitous and intensifying nature of economic policy uncertainty has made it a popular explanation for weak economic performance in developed and developing markets alike. The primary channel for this effect is decreased and delayed investment as firms adopt a "wait and see" approach to irreversible investments (Bernanke, 1983; Dixit and Pindyck, 1994). Deep empirical examination of policy uncertainty's impact is rare because of the difficulty associated with measuring i...

  3. High-temperature uncertainty

    Timusk, T.

    2005-01-01

    Recent experiments reveal that the mechanism responsible for the superconducting properties of cuprate materials is even more mysterious than we thought. Two decades ago, Georg Bednorz and Alex Mueller of IBM's research laboratory in Zurich rocked the world of physics when they discovered a material that lost all resistance to electrical current at the record temperature of 36 K. Until then, superconductivity was thought to be a strictly low-temperature phenomenon that required costly refrigeration. Moreover, the IBM discovery - for which Bednorz and Mueller were awarded the 1987 Nobel Prize for Physics - was made in a ceramic copper-oxide material that nobody expected to be particularly special. Proposed applications for these 'cuprates' abounded. High-temperature superconductivity, particularly if it could be extended to room temperature, offered the promise of levitating trains, ultra-efficient power cables, and even supercomputers based on superconducting quantum interference devices. But these applications have been slow to materialize. Moreover, almost 20 years on, the physics behind this strange state of matter remains a mystery. (U.K.)

  4. Toward best practice framing of uncertainty in scientific publications: A review of Water Resources Research abstracts

    Guillaume, Joseph H. A.; Helgeson, Casey; Elsawah, Sondoss; Jakeman, Anthony J.; Kummu, Matti

    2017-08-01

    Uncertainty is recognized as a key issue in water resources research, among other sciences. Discussions of uncertainty typically focus on tools and techniques applied within an analysis, e.g., uncertainty quantification and model validation. But uncertainty is also addressed outside the analysis, in writing scientific publications. The language that authors use conveys their perspective of the role of uncertainty when interpreting a claim—what we call here "framing" the uncertainty. This article promotes awareness of uncertainty framing in four ways. (1) It proposes a typology of eighteen uncertainty frames, addressing five questions about uncertainty. (2) It describes the context in which uncertainty framing occurs. This is an interdisciplinary topic, involving philosophy of science, science studies, linguistics, rhetoric, and argumentation. (3) We analyze the use of uncertainty frames in a sample of 177 abstracts from the Water Resources Research journal in 2015. This helped develop and tentatively verify the typology, and provides a snapshot of current practice. (4) We make provocative recommendations to achieve a more influential, dynamic science. Current practice in uncertainty framing might be described as carefully considered incremental science. In addition to uncertainty quantification and degree of belief (present in ~5% of abstracts), uncertainty is addressed by a combination of limiting scope, deferring to further work (~25%) and indicating evidence is sufficient (~40%)—or uncertainty is completely ignored (~8%). There is a need for public debate within our discipline to decide in what context different uncertainty frames are appropriate. Uncertainty framing cannot remain a hidden practice evaluated only by lone reviewers.

  5. Pharmacological Fingerprints of Contextual Uncertainty.

    Louise Marshall

    2016-11-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses.

  6. Statistically based uncertainty assessments in nuclear risk analysis

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of 'engineering judgment' or 'expert opinion'. All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however.

  7. A Bayesian approach to model uncertainty

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
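The finite-model case lends itself to a short worked example: model uncertainty becomes a discrete probability over the model set, updated by Bayes' theorem. The two coin-bias models and the data below are hypothetical, chosen only to make the mechanics concrete:

```python
import math

# Two candidate models for a coin's bias (a finite model set)
def binom_lik(p, heads, n):
    # Binomial likelihood of observing `heads` in `n` tosses
    return math.comb(n, heads) * p**heads * (1 - p) ** (n - heads)

priors = {"fair (p=0.5)": 0.5, "biased (p=0.7)": 0.5}
heads, n = 8, 10  # observed data

liks = {"fair (p=0.5)": binom_lik(0.5, heads, n),
        "biased (p=0.7)": binom_lik(0.7, heads, n)}
evidence = sum(priors[m] * liks[m] for m in priors)
posterior = {m: priors[m] * liks[m] / evidence for m in priors}
```

The posterior model probabilities are exactly the "parameter uncertainty" the abstract refers to: an unknown discrete parameter indexing which model is correct.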

  8. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  9. Implications of nuclear data uncertainties to reactor design

    Greebler, P.; Hutchins, B.A.; Cowan, C.L.

    1970-01-01

    Uncertainties in nuclear data require significant allowances to be made in the design and the operating conditions of reactor cores and of shielded-reactor-plant and fuel-processing systems. These allowances result in direct cost increases due to overdesign of components and equipment and reduced core and fuel operating performance. Compromising the allowances for data uncertainties has indirect cost implications due to increased risks of failure to meet plant and fuel performance objectives, with warrantees involved in some cases, and to satisfy licensed safety requirements. Fast breeders are the most sensitive power reactors to the uncertainties in nuclear data over the neutron energy range of interest for fission reactors, and this paper focuses on the implications of the data uncertainties to design and operation of fast breeder reactors and fuel-processing systems. The current status of uncertainty in predicted physics parameters due to data uncertainties is reviewed and compared with the situation in 1966 and that projected for within the next two years due to anticipated data improvements. Implications of the uncertainties in the predicted physics parameters to design and operation are discussed for both a near-term prototype or demonstration breeder plant (∼300 MW(e)) and a longer-term large (∼1000 MW(e)) plant. Significant improvements in the nuclear data have been made during the past three years, the most important of these to fast power reactors being the 239Pu alpha below 15 keV. The most important remaining specific data uncertainties are illustrated by their individual contributions to the computational uncertainty of selected physics parameters, and recommended priorities and accuracy requirements for improved data are presented.
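The phrase "individual contributions to the computational uncertainty" usually denotes first-order sensitivity propagation. A hedged sketch of the uncorrelated case (the sensitivity coefficients and relative data uncertainties below are hypothetical numbers, not evaluated values from the paper):

```python
import math

# Hypothetical sensitivities S_i = (dR/R)/(dp_i/p_i) of a computed core
# parameter R (e.g. k-eff) to individual nuclear data p_i, and relative
# data uncertainties u_i.
S = {"Pu239_alpha": 0.30, "U238_capture": 0.20, "Fe_inelastic": 0.05}
u = {"Pu239_alpha": 0.10, "U238_capture": 0.05, "Fe_inelastic": 0.15}

# First-order combination without cross-covariances: each term (S_i * u_i)^2
# is that datum's individual contribution to the variance of R.
u_R = math.sqrt(sum((S[k] * u[k]) ** 2 for k in S))  # relative uncertainty in R
```

In practice evaluated covariance matrices couple the terms, so the full "sandwich rule" with off-diagonal covariances replaces this diagonal sum; the per-term breakdown is what drives the priority ranking the abstract mentions.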

  10. Neural correlates of intolerance of uncertainty in clinical disorders

    Wever, Mirjam; Smeets, Paul; Sternheim, Lot

    2015-01-01

    Intolerance of uncertainty is a key contributor to anxiety-related disorders. Recent studies highlight its importance in other clinical disorders. The link between its clinical presentation and the underlying neural correlates remains unclear. This review summarizes the emerging literature on the

  12. Planning ATES systems under uncertainty

    Jaxa-Rozen, Marc; Kwakkel, Jan; Bloemendal, Martin

    2015-04-01

    Aquifer Thermal Energy Storage (ATES) can contribute to significant reductions in energy use within the built environment, by providing seasonal energy storage in aquifers for the heating and cooling of buildings. ATES systems have experienced a rapid uptake over the last two decades; however, despite successful experiments at the individual level, the overall performance of ATES systems remains below expectations - largely due to suboptimal practices for the planning and operation of systems in urban areas. The interaction between ATES systems and underground aquifers can be interpreted as a common-pool resource problem, in which thermal imbalances or interference could eventually degrade the storage potential of the subsurface. Current planning approaches for ATES systems thus typically follow the precautionary principle. For instance, the permitting process in the Netherlands is intended to minimize thermal interference between ATES systems. However, as shown in recent studies (Sommer et al., 2015; Bakr et al., 2013), a controlled amount of interference may benefit the collective performance of ATES systems. An overly restrictive approach to permitting is instead likely to create an artificial scarcity of available space, limiting the potential of the technology in urban areas. In response, master plans - which take into account the collective arrangement of multiple systems - have emerged as an increasingly popular alternative. However, permits and master plans both take a static, ex ante view of ATES governance, making it difficult to predict the effect of evolving ATES use or climatic conditions on overall performance. In particular, the adoption of new systems by building operators is likely to be driven by the available subsurface space and by the performance of existing systems; these outcomes are themselves a function of planning parameters. From this perspective, the interactions between planning authorities, ATES operators, and subsurface conditions

  13. Incorporating the effects of socioeconomic uncertainty into priority setting for conservation investment.

    McBride, Marissa F; Wilson, Kerrie A; Bode, Michael; Possingham, Hugh P

    2007-12-01

    Uncertainty in the implementation and outcomes of conservation actions, if not accounted for, leaves conservation plans vulnerable to potential changes in future conditions. We used a decision-theoretic approach to investigate the effects of two types of investment uncertainty on the optimal allocation of global conservation resources for land acquisition in the Mediterranean Basin. We considered uncertainty about (1) whether investment will continue and (2) whether the acquired biodiversity assets are secure, which we termed transaction uncertainty and performance uncertainty, respectively. We also developed and tested the robustness of different rules of thumb for guiding the allocation of conservation resources when these sources of uncertainty exist. In the presence of uncertainty in future investment ability (transaction uncertainty), the optimal strategy was opportunistic, meaning the investment priority should be to act where uncertainty is highest while investment remains possible. When there was a probability that investments would fail (performance uncertainty), the optimal solution became a complex trade-off between the immediate biodiversity benefits of acting in a region and the perceived longevity of the investment. In general, regions were prioritized for investment when they had the greatest performance certainty, even if an alternative region was highly threatened or had higher biodiversity value. The improved performance of rules of thumb when accounting for uncertainty highlights the importance of explicitly incorporating sources of investment uncertainty and evaluating potential conservation investments in the context of their likely long-term success.
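A toy expected-benefit ranking shows how the two uncertainty types defined in this record enter an allocation rule. The regions, benefits and probabilities are hypothetical illustrations, not the study's Mediterranean data:

```python
# q: probability that investment remains possible (transaction uncertainty)
# s: probability an acquired asset stays secure (performance uncertainty)
regions = {
    "A": {"benefit": 10.0, "q": 0.9, "s": 0.6},
    "B": {"benefit": 7.0,  "q": 0.5, "s": 0.9},
    "C": {"benefit": 9.0,  "q": 0.8, "s": 0.8},
}

def expected_benefit(r):
    # Benefit is realised only if the transaction completes AND the asset
    # remains secure afterwards.
    return r["benefit"] * r["q"] * r["s"]

ranking = sorted(regions, key=lambda m: expected_benefit(regions[m]),
                 reverse=True)
```

Note how discounting by both probabilities can overturn a ranking based on raw biodiversity benefit alone, which is the qualitative effect the abstract reports.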

  14. Genetic diversity and connectivity remain high in eelgrass Zostera marina populations in the Wadden Sea, despite major impacts

    Ferber, Steven; Stam, Wytze T.; Olsen, Jeanine L.

    2008-01-01

    Beginning in the 1930s, eelgrass meadows declined throughout the Wadden Sea, leaving populations susceptible to extinction through patchiness, low density and isolation. Additional anthropogenic impacts have altered current regimes, nutrients and turbidity-all of which affect eelgrass. Recent

  15. Ruminations on NDA measurement uncertainty compared to DA uncertainty

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support: criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail suitable for subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations. This is because data interpretation often involves complex algorithms and logic combined in a highly intertwined way. The methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to developing understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives and on-going development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community we hope this article will encourage dialog and sharing of best practice and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  17. Uncertainty in estimating and mitigating industrial related GHG emissions

    El-Fadel, M.; Zeinati, M.; Ghaddar, N.; Mezher, T.

    2001-01-01

    Global climate change has been one of the challenging environmental concerns facing policy makers in the past decade. The characterization of the wide range of greenhouse gas emissions sources and sinks as well as their behavior in the atmosphere remains an on-going activity in many countries. Lebanon, being a signatory to the Framework Convention on Climate Change, is required to submit and regularly update a national inventory of greenhouse gas emissions sources and removals. Accordingly, an inventory of greenhouse gases from various sectors was conducted following the guidelines set by the United Nations Intergovernmental Panel on Climate Change (IPCC). The inventory indicated that the industrial sector contributes about 29% to the total greenhouse gas emissions divided between industrial processes and energy requirements at 12 and 17%, respectively. This paper describes major mitigation scenarios to reduce emissions from this sector based on associated technical, economic, environmental, and social characteristics. Economic ranking of these scenarios was conducted and uncertainty in emission factors used in the estimation process was emphasized. For this purpose, theoretical and experimental emission factors were used as alternatives to default factors recommended by the IPCC and the significance of resulting deviations in emission estimation is presented. (author)
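
    The sensitivity to emission factors described here can be sketched with the simple inventory model E = A × EF, comparing a default factor against hypothetical theoretical and experimental alternatives; the activity level, factor values and ±10% spread below are all invented for illustration:

```python
import random

random.seed(42)

# Illustrative activity data (t product/yr) and three candidate
# emission factors (t CO2 per t product): a default value versus
# hypothetical theoretical and experimental alternatives.
activity = 2.5e6
factors = {"default": 0.51, "theoretical": 0.48, "experimental": 0.54}

def emission_stats(ef, rel_sigma=0.10, n=10_000):
    """Monte Carlo propagation of E = A * EF with +/-10% (1 sigma) on EF."""
    draws = [activity * random.gauss(ef, rel_sigma * ef) for _ in range(n)]
    mean = sum(draws) / n
    sd = (sum((d - mean) ** 2 for d in draws) / (n - 1)) ** 0.5
    return mean, sd

for name, ef in factors.items():
    mean, sd = emission_stats(ef)
    print(f"{name:12s} {mean/1e6:6.3f} +/- {sd/1e6:5.3f} Mt CO2/yr")
```

    The deviation between the central estimates for the alternative factors can easily be as large as the 1-sigma spread of any single factor, which is the significance issue the abstract raises.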

  18. OPEC Middle East plans for rising world demand amid uncertainty

    Ismail, I.A.H.

    1996-01-01

    The Middle Eastern members of the Organization of Petroleum Exporting Countries must plan for huge increases in oil production capacity yet wonder whether markets for the new output will develop as expected. With worldwide oil consumption rising and non-OPEC output likely to reach its resource limits soon, OPEC member countries face major gains in demand for their crude oil. To meet the demand growth, those with untapped resources will have to invest heavily in production capacity. Most OPEC members with such resources are in the Middle East. But financing the capacity investments remains a challenge. Some OPEC members have opened up to foreign equity participation in production projects, and others may eventually do so as financial pressures grow. That means additions to the opportunities now available to international companies in the Middle East. Uncertainties, however, hamper planning and worry OPEC. Chief among them are taxation and environmental policies of consuming-nation governments. This paper reviews these concerns and provides data on production, pricing, capital investment histories and revenues

  19. Critical loads - assessment of uncertainty

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated at different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data
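
    A toy Monte Carlo sketch of the exceedance statistics discussed above, reading the probability of exceedance and a 95th-percentile EX off an empirical CDF; the lognormal distributions, parameter values and eq/ha/yr units are invented, not the PROFILE model or the paper's data:

```python
import math
import random

random.seed(0)

# Hypothetical single grid cell: critical load CL and deposition D,
# each with lognormal uncertainty (illustrative medians and spreads).
def sample_exceedance():
    cl = random.lognormvariate(math.log(800), 0.3)    # critical load
    dep = random.lognormvariate(math.log(1000), 0.2)  # deposition
    return dep - cl                                   # EX = D - CL

n = 20_000
ex = sorted(sample_exceedance() for _ in range(n))

# Probability of exceedance and the 95th-percentile exceedance,
# both read directly from the sorted Monte Carlo sample.
p_exceed = sum(e > 0 for e in ex) / n
p95 = ex[int(0.95 * n)]
print(f"P(exceedance) ~ {p_exceed:.2f}, 95%-ile EX ~ {p95:.0f} eq/ha/yr")
```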

  20. Uncertainty Quantification in Numerical Aerodynamics

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D.Liu et al '17]. For modeling we used the TAU code, developed at DLR, Germany.
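
    A minimal quasi-Monte Carlo sketch in the spirit of the comparison above, using a stdlib Halton sequence to drive two uncertain inputs through a toy lift surrogate; the surrogate formula and input ranges are invented for illustration (this is not the TAU code or any of the paper's surrogate models):

```python
import math

def halton(i, base):
    """i-th element of the van der Corput/Halton low-discrepancy sequence."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def lift_surrogate(alpha_deg, mach):
    # Toy model: linear in angle of attack with a Prandtl-Glauert-style
    # compressibility correction. Purely illustrative.
    return 0.11 * alpha_deg / math.sqrt(1.0 - mach**2)

n = 4096
total = 0.0
for i in range(1, n + 1):
    u1, u2 = halton(i, 2), halton(i, 3)
    alpha = 1.0 + 3.0 * u1    # uncertain angle of attack, 1..4 deg
    mach = 0.70 + 0.06 * u2   # uncertain Mach number, 0.70..0.76
    total += lift_surrogate(alpha, mach)

print(f"QMC estimate of mean lift coefficient: {total / n:.4f}")
```

    For smooth integrands, such low-discrepancy points converge faster than plain Monte Carlo, which is why quasi-Monte Carlo quadrature appears among the methods compared.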

  1. Uncertainty in spatial planning proceedings

    Aleš Mlakar

    2009-01-01

    Uncertainty is distinctive of spatial planning, as it arises from the necessity to co-ordinate the various interests within an area, from the urgency of adopting spatial planning decisions, from the complexity of the environment, physical space and society, from addressing the uncertainty of the future, and from the uncertainty of actually making the right decision. The response to uncertainty is a series of measures that mitigate the effects of uncertainty itself. These measures are based on two fundamental principles – standardization and optimization. They relate to enhancing knowledge and comprehension of spatial planning, to the legal regulation of changes, to the existence of spatial planning as a means of co-ordinating different interests, to active planning and the constructive resolution of current spatial problems, to the integration of spatial planning and the environmental protection process, to the implementation of analysis as the foundation of spatial planners' activities, to methods of thinking outside the parameters, to forming clear spatial concepts, to creating a transparent spatial management system, and to the enforcement of participatory processes.

  2. Uncertainty modeling and decision support

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function
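
    A small sketch of the kind of valuation described here for a Dempster-Shafer belief structure: each focal element carries a mass and a set of possible payoffs, and a subjective attitude parameter alpha blends optimism (max) and pessimism (min) inside each focal element. The belief structure and payoffs below are hypothetical:

```python
# Hypothetical belief structure for one alternative: each entry is
# (set of possible payoffs under that focal element, its mass).
focal = [
    ({10, 40, 60}, 0.5),
    ({20, 30}, 0.3),
    ({50}, 0.2),
]

def valuation(alpha):
    """Attitude-weighted value: alpha*max + (1-alpha)*min per focal
    element, averaged by the masses."""
    return sum(m * (alpha * max(s) + (1 - alpha) * min(s))
               for s, m in focal)

for alpha in (0.0, 0.5, 1.0):   # pessimist, neutral, optimist
    print(f"alpha={alpha:.1f}: value = {valuation(alpha):.1f}")
```

    The spread of values across alpha makes the subjective nature of the decision maker's attitude, which the abstract stresses, directly visible.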

  3. Uncertainty Assessment: What Good Does it Do? (Invited)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  4. On the uncertainty principle. V

    Halpern, O.

    1976-01-01

    The treatment of ideal experiments connected with the uncertainty principle is continued. The author analyzes successively measurements of momentum and position, and discusses the common reason why the results in all cases differ from the conventional ones. A similar difference exists for the measurement of field strengths. The interpretation given by Weizsaecker, who tried to interpret Bohr's complementarity principle by introducing a multi-valued logic, is analyzed. The treatment of the uncertainty principle ΔE Δt is deferred to a later paper, as is the interpretation of the method of variation of constants. Every ideal experiment discussed shows various lower limits for the value of the uncertainty product, limits which depend on the experimental arrangement and are always (considerably) larger than h. (Auth.)

  5. Davis-Besse uncertainty study

    Davis, C.B.

    1987-08-01

    The uncertainties of calculations of loss-of-feedwater transients at Davis-Besse Unit 1 were determined to address concerns of the US Nuclear Regulatory Commission relative to the effectiveness of feed and bleed cooling. Davis-Besse Unit 1 is a pressurized water reactor of the raised-loop Babcock and Wilcox design. A detailed, quality-assured RELAP5/MOD2 model of Davis-Besse was developed at the Idaho National Engineering Laboratory. The model was used to perform an analysis of the loss-of-feedwater transient that occurred at Davis-Besse on June 9, 1985. A loss-of-feedwater transient followed by feed and bleed cooling was also calculated. The evaluation of uncertainty was based on the comparisons of calculations and data, comparisons of different calculations of the same transient, sensitivity calculations, and the propagation of the estimated uncertainty in initial and boundary conditions to the final calculated results

  6. Decommissioning Funding: Ethics, Implementation, Uncertainties

    2007-01-01

    This status report on decommissioning funding: ethics, implementation, uncertainties is based on a review of recent literature and materials presented at NEA meetings in 2003 and 2004, and particularly at a topical session organised in November 2004 on funding issues associated with the decommissioning of nuclear power facilities. The report also draws on the experience of the NEA Working Party on Decommissioning and Dismantling (WPDD). This report offers, in a concise form, an overview of relevant considerations on decommissioning funding mechanisms with regard to ethics, implementation and uncertainties. Underlying ethical principles found in international agreements are identified, and factors influencing the accumulation and management of funds for decommissioning nuclear facilities are discussed together with the main sources of uncertainties of funding systems

  7. Correlated uncertainties in integral data

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed however that such reductions are likely to be important in a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that in the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations

  8. Uncertainty and Sensitivity Analyses Plan

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project

  9. Geological-structural models used in SR 97. Uncertainty analysis

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained and an interdisciplinary approach to interpretation has been taken. The Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km{sup 3}. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. 
This means that


  11. A Bayesian Framework for Remaining Useful Life Estimation

    National Aeronautics and Space Administration — The estimation of remaining useful life (RUL) of a faulty component is at the center of system prognostics and health management. It gives operators a potent tool in...

  12. Audit of the global carbon budget: estimate errors and their impact on uptake uncertainty

    Ballantyne, A. P.; Andres, R.; Houghton, R.; Stocker, B. D.; Wanninkhof, R.; Anderegg, W.; Cooper, L. A.; DeGrandpre, M.; Tans, P. P.; Miller, J. B.; Alden, C.; White, J. W. C.

    2015-04-01

    Over the last 5 decades monitoring systems have been developed to detect changes in the accumulation of carbon (C) in the atmosphere and ocean; however, our ability to detect changes in the behavior of the global C cycle is still hindered by measurement and estimate errors. Here we present a rigorous and flexible framework for assessing the temporal and spatial components of estimate errors and their impact on uncertainty in net C uptake by the biosphere. We present a novel approach for incorporating temporally correlated random error into the error structure of emission estimates. Based on this approach, we conclude that the 2σ uncertainties of the atmospheric growth rate have decreased from 1.2 Pg C yr-1 in the 1960s to 0.3 Pg C yr-1 in the 2000s due to an expansion of the atmospheric observation network. The 2σ uncertainties in fossil fuel emissions have increased from 0.3 Pg C yr-1 in the 1960s to almost 1.0 Pg C yr-1 during the 2000s due to differences in national reporting errors and differences in energy inventories. Lastly, while land use emissions have remained fairly constant, their errors remain high, and thus their contribution to the uncertainty of global C uptake is not trivial. Currently, the absolute errors in fossil fuel emissions rival the total emissions from land use, highlighting the extent to which fossil fuels dominate the global C budget. Because errors in the atmospheric growth rate have decreased faster than errors in total emissions have increased, a ~20% reduction in the overall uncertainty of net global C uptake has occurred. Given all the major sources of error in the global C budget that we could identify, we are 93% confident that terrestrial C uptake has increased and 97% confident that ocean C uptake has increased over the last 5 decades. Thus, it is clear that arguably one of the most vital ecosystem services currently provided by the biosphere is the continued removal of approximately half of atmospheric CO2 emissions from the atmosphere
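
    The effect of temporally correlated random error can be sketched with a simple AR(1) process: when annual errors are strongly correlated, they barely average out over a decade, so the uncertainty of a decadal mean stays close to the single-year uncertainty. The rho and sigma values below are invented, not the paper's fitted parameters:

```python
import math
import random

random.seed(1)

# Illustrative AR(1) model of temporally correlated errors in annual
# emission estimates (Pg C/yr). Stationary sd is sigma.
def ar1_series(n, rho=0.95, sigma=0.3):
    e = [random.gauss(0, sigma)]
    for _ in range(n - 1):
        e.append(rho * e[-1] + random.gauss(0, sigma * math.sqrt(1 - rho**2)))
    return e

n_sim, horizon = 5000, 10
decadal_means = [sum(ar1_series(horizon)) / horizon for _ in range(n_sim)]
# Errors have zero mean, so the RMS of the simulated decadal means is
# their standard deviation.
sd = (sum(m**2 for m in decadal_means) / n_sim) ** 0.5
print(f"sd of 10-yr mean error: {sd:.2f} (vs 0.30 for a single year)")
```

    With independent errors the decadal sd would shrink by a factor of sqrt(10); with rho near 1 it hardly shrinks at all, which is why correlated reporting errors matter for budget-closure uncertainty.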

  13. Robotics to Enable Older Adults to Remain Living at Home

    Pearce, Alan J.; Adair, Brooke; Miller, Kimberly; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E.

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effec...

  14. Summary of existing uncertainty methods

    Glaeser, Horst

    2013-01-01

    A summary of existing and most used uncertainty methods is presented, and the main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potential uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena, which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and the follow-up method 'Code with the Capability of Internal Assessment of Uncertainty (CIAU)' developed by the University of Pisa. Unlike the statistical approaches, the CIAU does compare experimental data with calculation results. It does not consider uncertain input parameters. Therefore, the CIAU is highly dependent on the experimental database. 
    The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions
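
    The order statistics method can be illustrated directly: for a one-sided tolerance bound, first-order Wilks' formula asks for the smallest number of code runs n such that 1 − γⁿ ≥ β, where γ is the coverage (e.g. 95th percentile) and β the confidence:

```python
def wilks_runs(gamma=0.95, beta=0.95):
    """Smallest n with 1 - gamma**n >= beta (one-sided, first order):
    the maximum of n sampled outputs then bounds the gamma-quantile
    with confidence beta."""
    n = 1
    while 1 - gamma**n < beta:
        n += 1
    return n

print(wilks_runs())   # classic one-sided 95/95 answer: 59
```

    This is why the method needs the same modest number of runs (59 for 95/95) regardless of how many uncertain input parameters are sampled.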

  15. Uncertainty analysis in safety assessment

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions fill safety assessment projections with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  16. Awe, uncertainty, and agency detection.

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4)-two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  17. Linear Programming Problems for Generalized Uncertainty

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  18. Future Agribusiness Challenges: Strategic Uncertainty, Innovation and Structural Change

    Boehlje, M.; Roucan-Kane, M.; Bröring, S.

    2011-01-01

    The global food and agribusiness industry is in the midst of major changes, and the pace of change seems to be increasing. These changes suggest three fundamental critical future issues for the sector: 1) decisions must be made in an environment of increasing risk and uncertainty, 2) developing and

  19. Updated Estimates of the Remaining Market Potential of the U.S. ESCO Industry

    Larsen, Peter H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.; Carvallo Bodelon, Juan Pablo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.; Goldman, Charles A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.; Murphy, Sean [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.; Stuart, Elizabeth [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.

    2017-04-01

    The energy service company (ESCO) industry has a well-established track record of delivering energy and economic savings in the public and institutional buildings sector, primarily through the use of performance-based contracts. The ESCO industry often provides (or helps arrange) private sector financing to complete public infrastructure projects with little or no up-front cost to taxpayers. In 2014, total U.S. ESCO industry revenue was estimated at $5.3 billion. ESCOs expect total industry revenue to grow to $7.6 billion in 2017—a 13% annual growth rate from 2015-2017. Researchers at Lawrence Berkeley National Laboratory (LBNL) were asked by the U.S. Department of Energy Federal Energy Management Program (FEMP) to update and expand our estimates of the remaining market potential of the U.S. ESCO industry. We define remaining market potential as the aggregate amount of project investment by ESCOs that is technically possible based on the types of projects that ESCOs have historically implemented in the institutional, commercial, and industrial sectors using ESCO estimates of current market penetration in those sectors. In this analysis, we report U.S. ESCO industry remaining market potential under two scenarios: (1) a base case and (2) a case “unfettered” by market, bureaucratic, and regulatory barriers. We find that there is significant remaining market potential for the U.S. ESCO industry under both the base and unfettered cases. For the base case, we estimate a remaining market potential of $92-$201 billion ($2016). We estimate a remaining market potential of $190-$333 billion for the unfettered case. It is important to note, however, that there is considerable uncertainty surrounding the estimates for both the base and unfettered cases.

  20. Uncertainty, probability and information-gaps

    Ben-Haim, Yakov

    2004-01-01

    This paper discusses two main ideas. First, we focus on info-gap uncertainty, as distinct from probability. Info-gap theory is especially suited for modelling and managing uncertainty in system models: we invest all our knowledge in formulating the best possible model; this leaves the modeller with very faulty and fragmentary information about the variation of reality around that optimal model. Second, we examine the interdependence between uncertainty modelling and decision-making. Good uncertainty modelling requires contact with the end-use, namely, with the decision-making application of the uncertainty model. The most important avenue of uncertainty-propagation is from initial data- and model-uncertainties into uncertainty in the decision-domain. Two questions arise. Is the decision robust to the initial uncertainties? Is the decision prone to opportune windfall success? We apply info-gap robustness and opportunity functions to the analysis of representation and propagation of uncertainty in several of the Sandia Challenge Problems
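
    A minimal info-gap robustness sketch under a fractional-error uncertainty model: the robustness h-hat is the largest horizon of uncertainty h whose worst case still satisfies the performance requirement. The profit model and all numbers below are hypothetical, not from the paper:

```python
# Uncertain demand u with nominal u0 and fractional-error info-gap model
#   U(h) = { u : |u - u0| <= h * u0 },  h >= 0.
# Profit R(q, u) = p*u*q - c*q must meet a requirement R >= r_crit.
u0, p, c = 100.0, 2.0, 120.0

def robustness(q, r_crit, steps=10_000):
    """Largest h (scanned on a grid) whose worst case in U(h) still
    satisfies the requirement; worst case here is u = u0 * (1 - h)."""
    h_hat = 0.0
    for i in range(steps + 1):
        h = i / steps
        worst_u = u0 * (1 - h)
        if p * worst_u * q - c * q >= r_crit:
            h_hat = h
        else:
            break
    return h_hat

print(f"robustness at q=1, r_crit=20: h_hat ~ {robustness(1.0, 20.0):.3f}")
```

    Demanding more performance (a higher r_crit) shrinks h_hat, which is the robustness/performance trade-off at the heart of info-gap analysis.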

  1. Uncertainties in predicting rice yield by current crop models under a wide range of climatic conditions

    Li, T.; Hasegawa, T.; Yin, X.; Zhu, Y.; Boote, K.; Adam, M.; Bregaglio, S.; Buis, S.; Confalonieri, R.; Fumoto, T.; Gaydon, D.; Marcaida III, M.; Nakagawa, H.; Oriol, P.; Ruane, A.C.; Ruget, F.; Singh, B.; Singh, U.; Tang, L.; Yoshida, H.; Zhang, Z.; Bouman, B.

    2015-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We

  2. GENERAL RISKS AND UNCERTAINTIES OF REPORTING AND MANAGEMENT REPORTING RISKS

    CAMELIA I. LUNGU

    2011-04-01

    Purpose: Highlighting risks and uncertainties reporting based on a literature review research. Objectives: The delimitation of risk management models and uncertainties in fundamental research. Research method: A fundamental research study directed at identifying the relevant risk models presented in entities' financial statements. Uncertainty is one of the fundamental coordinates of our world. As J.K. Galbraith (1978) showed, the world now lives in the age of uncertainty. Moreover, we can say that contemporary society develops by taking decisions under uncertainty, and thus under risk. Growing concern for the study of uncertainty, its effects and precautions has led to the rather recent emergence of a new science, the science of hazards (les cindyniques, Fr.) (Kenvern, 1991). Current analyses of risk are dominated by Beck's (1992) notion that a risk society now exists whereby we have become more concerned about our impact upon nature than the impact of nature upon us. Clearly, risk permeates most aspects of corporate as well as everyday decision-making, and few can predict the future with any precision. Risk is almost always a major variable in real-world corporate decision-making, and managers who ignore it are in real peril. In these circumstances, a possible answer is assuming financial discipline with an appropriate system of incentives.

  3. Compilation of information on uncertainties involved in deposition modeling

    Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.

    1985-04-01

The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to see whether more valid parameterizations are available and what level of uncertainty is involved in either these simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved, both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying process. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate analyses have been performed for each part and are contained in the EDB.

  4. Uncertainty in Simulating Wheat Yields Under Climate Change

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O' Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models owing to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas, and improved quantification of uncertainty through multi-model ensembles, are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts, in order to develop adaptation strategies and aid policymaking.
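The partition of impact uncertainty into crop-model and climate-model components described above can be illustrated with a toy variance decomposition over a matrix of simulated yield impacts (rows = crop models, columns = GCMs). All numbers below are synthetic, not from the study:

```python
import numpy as np

# Synthetic matrix of simulated yield impacts (% change):
# rows = 5 crop models, columns = 3 GCMs.
impacts = np.array([
    [-4.0,  -6.0,  -5.0],
    [-9.0,  -11.0, -10.0],
    [-2.0,  -4.0,  -3.0],
    [-12.0, -14.0, -13.0],
    [-6.0,  -8.0,  -7.0],
])

# Crop-model component: variance of the per-model means (spread across rows).
crop_model_var = impacts.mean(axis=1).var(ddof=0)
# Climate-model component: variance of the per-GCM means (spread across columns).
climate_model_var = impacts.mean(axis=0).var(ddof=0)

print(crop_model_var, climate_model_var)
```

In this synthetic example the spread across crop models dominates the spread across GCMs, mirroring the paper's qualitative finding; real studies use more careful ANOVA-style decompositions with interaction terms.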

  5. Diagnostic uncertainty, guilt, mood, and disability in back pain.

    Serbic, Danijela; Pincus, Tamar; Fife-Schaw, Chris; Dawson, Helen

    2016-01-01

In the majority of patients a definitive cause for low back pain (LBP) cannot be established, and many patients report feeling uncertain about their diagnosis, accompanied by guilt. The relationship between diagnostic uncertainty, guilt, mood, and disability is currently unknown. This study tested 3 theoretical models to explore possible pathways between these factors. In Model 1, diagnostic uncertainty was hypothesized to correlate with pain-related guilt, which in turn would positively correlate with depression, anxiety and disability. Two alternative models were tested: (a) a path from depression and anxiety to guilt, from guilt to diagnostic uncertainty, and finally to disability; (b) a model in which depression and anxiety, and independently, diagnostic uncertainty, were associated with guilt, which in turn was associated with disability. Structural equation modeling was employed on data from 413 participants with chronic LBP. All 3 models showed a reasonable-to-good fit with the data, with the 2 alternative models providing marginally better fit indices. Guilt, and especially social guilt, was associated with disability in all 3 models. Diagnostic uncertainty was associated with guilt, but only moderately. Low mood was also associated with guilt. Two newly defined factors, pain-related guilt and diagnostic uncertainty, appear to be linked to disability and mood in people with LBP. The causal path of these links cannot be established in this cross-sectional study. However, pain-related guilt especially appears to be important, and future research should examine whether interventions directly targeting guilt improve outcomes. (c) 2015 APA, all rights reserved.

  6. On uncertainty quantification in hydrogeology and hydrogeophysics

    Linde, Niklas; Ginsbourger, David; Irving, James; Nobile, Fabio; Doucet, Arnaud

    2017-12-01

    Recent advances in sensor technologies, field methodologies, numerical modeling, and inversion approaches have contributed to unprecedented imaging of hydrogeological properties and detailed predictions at multiple temporal and spatial scales. Nevertheless, imaging results and predictions will always remain imprecise, which calls for appropriate uncertainty quantification (UQ). In this paper, we outline selected methodological developments together with pioneering UQ applications in hydrogeology and hydrogeophysics. The applied mathematics and statistics literature is not easy to penetrate and this review aims at helping hydrogeologists and hydrogeophysicists to identify suitable approaches for UQ that can be applied and further developed to their specific needs. To bypass the tremendous computational costs associated with forward UQ based on full-physics simulations, we discuss proxy-modeling strategies and multi-resolution (Multi-level Monte Carlo) methods. We consider Bayesian inversion for non-linear and non-Gaussian state-space problems and discuss how Sequential Monte Carlo may become a practical alternative. We also describe strategies to account for forward modeling errors in Bayesian inversion. Finally, we consider hydrogeophysical inversion, where petrophysical uncertainty is often ignored leading to overconfident parameter estimation. The high parameter and data dimensions encountered in hydrogeological and geophysical problems make UQ a complicated and important challenge that has only been partially addressed to date.
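The multi-resolution (Multi-level Monte Carlo) idea mentioned in the record above can be sketched on a toy problem. The example below estimates the expected terminal value of a geometric Brownian motion; the problem is illustrative and not from the paper, but the key MLMC ingredient, coupling coarse and fine simulations through shared Brownian increments so that most samples can be taken on the cheap coarse levels, is shown:

```python
import numpy as np

rng = np.random.default_rng(42)

def mlmc_level(l, n_paths, s0=1.0, mu=0.05, sigma=0.2, T=1.0, M=4):
    """Mean of P_l - P_{l-1} for the GBM terminal value, using coupled
    Brownian increments on the fine (M**l steps) and coarse grids."""
    nf = M ** l
    dtf = T / nf
    dW = np.sqrt(dtf) * rng.standard_normal((n_paths, nf))
    sf = np.full(n_paths, s0)
    for i in range(nf):                      # fine Euler-Maruyama path
        sf = sf * (1 + mu * dtf + sigma * dW[:, i])
    if l == 0:
        return sf.mean()
    nc = M ** (l - 1)
    dtc = T / nc
    dWc = dW.reshape(n_paths, nc, M).sum(axis=2)   # coarse increments
    sc = np.full(n_paths, s0)
    for i in range(nc):                      # coupled coarse path
        sc = sc * (1 + mu * dtc + sigma * dWc[:, i])
    return (sf - sc).mean()

# Telescoping MLMC estimate of E[S_T]: many cheap coarse paths,
# progressively fewer expensive fine ones.
levels = [0, 1, 2, 3]
samples = [200000, 50000, 12000, 3000]
estimate = sum(mlmc_level(l, n) for l, n in zip(levels, samples))
print(estimate)   # exact answer is exp(mu*T) = exp(0.05)
```

Because the level corrections have small variance, the bulk of the computational budget goes to level 0, which is the cost-saving mechanism the record alludes to for full-physics forward UQ.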

  7. TH-C-BRD-07: Minimizing Dose Uncertainty for Spot Scanning Beam Proton Therapy of Moving Tumor with Optimization of Delivery Sequence

    Li, H; Zhang, X; Zhu, X; Li, Y

    2014-01-01

Purpose: Intensity modulated proton therapy (IMPT) has been shown to reduce dose to normal tissue compared to intensity modulated photon radiotherapy (IMRT), and has been implemented for selected lung cancer patients. However, respiratory motion-induced dose uncertainty remains one of the major concerns for the radiotherapy of lung cancer, and the utility of IMPT for lung patients has been limited by the proton dose uncertainty induced by motion. Strategies such as repainting and tumor tracking have been proposed and studied, but repainting can result in unacceptably long delivery times and tracking is not yet clinically available. We propose a novel delivery strategy for spot scanning proton beam therapy. Method: The effective number of deliveries (END) for each spot position in a treatment plan was calculated based on the parameters of the delivery system, including the time required for each spot, spot size and energy. The dose uncertainty was then calculated with an analytical formula. The spot delivery sequence was optimized to maximize END and minimize the dose uncertainty. 2D measurements with a detector array on a 1D moving platform were performed to validate the calculated results. Results: 143 2D measurements on a moving platform were performed for different delivery sequences of a single-layer uniform pattern. The measured dose uncertainty is a strong function of the delivery sequence: the worst delivery sequence results in dose errors of up to 70%, while the optimized delivery sequence results in dose errors of <5%. END vs. measured dose uncertainty follows the analytical formula. Conclusion: With an optimized delivery sequence, it is feasible to minimize the dose uncertainty due to motion in spot scanning proton therapy.
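A minimal simulation can illustrate why spreading a spot's deliveries over the motion period reduces interplay-induced dose uncertainty, which is the intuition behind maximizing the effective number of deliveries. Everything below (motion amplitude, spot width, timings) is an illustrative assumption, not the authors' analytical formula:

```python
import numpy as np

rng = np.random.default_rng(7)

A, T, sigma = 5.0, 4.0, 8.0  # motion amplitude (mm), period (s), spot width (mm)

def dose_uncertainty(times, n_trials=2000):
    """Standard deviation, over random motion phases, of the dose at the
    target centre when one spot is split over the given delivery times
    while the target moves as A*sin(2*pi*t/T + phase). The static
    (motion-free) dose is normalised to 1."""
    times = np.asarray(times)
    doses = np.empty(n_trials)
    for k in range(n_trials):
        phase = rng.uniform(0.0, 2.0 * np.pi)
        x = A * np.sin(2.0 * np.pi * times / T + phase)
        doses[k] = np.mean(np.exp(-x**2 / (2.0 * sigma**2)))
    return doses.std()

# Same 8 repaints of one spot, two delivery sequences:
burst = np.linspace(0.0, 0.2, 8)                 # all within 0.2 s
spread = np.linspace(0.0, T, 8, endpoint=False)  # spread over one period

u_burst, u_spread = dose_uncertainty(burst), dose_uncertainty(spread)
print(u_burst, u_spread)
```

Delivering all repaints in a quick burst leaves the dose at the mercy of the breathing phase, whereas sampling the full period averages the motion out, so the phase-to-phase dose spread collapses.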

  8. WE-B-19A-01: SRT II: Uncertainties in SRT

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-01-01

SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments across a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS and protons). Sources of SRS planning and delivery uncertainty include dose calculation, dose fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainty and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainty. Each of these uncertainties must be quantified and its impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques that make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS. To learn the various uncertainties introduced by image fusion, deformable image registration, and contouring.

  9. Chapter 3: Traceability and uncertainty

    McEwen, Malcolm

    2014-01-01

Chapter 3 presents: an introduction; Traceability (measurement standards, role of the Bureau International des Poids et Mesures, Secondary Standards Laboratories, documentary standards and traceability as process review); Uncertainty (Example 1 - Measurement, M_raw(SSD); Example 2 - Calibration data, N_D,w(60Co) and k_Q; Example 3 - Correction factor, P_TP); and a Conclusion.

  10. Competitive Capacity Investment under Uncertainty

    X. Li (Xishu); R.A. Zuidwijk (Rob); M.B.M. de Koster (René); R. Dekker (Rommert)

    2016-01-01

We consider a long-term capacity investment problem in a competitive market under demand uncertainty. Two firms move sequentially in the competition and a firm’s capacity decision interacts with the other firm’s current and future capacity. Throughout the investment race, a firm can

  11. Uncertainty quantification and error analysis

    Higdon, Dave M [Los Alamos National Laboratory; Anderson, Mark C [Los Alamos National Laboratory; Habib, Salman [Los Alamos National Laboratory; Klein, Richard [Los Alamos National Laboratory; Berliner, Mark [OHIO STATE UNIV.; Covey, Curt [LLNL; Ghattas, Omar [UNIV OF TEXAS; Graziani, Carlo [UNIV OF CHICAGO; Seager, Mark [LLNL; Sefcik, Joseph [LLNL; Stark, Philip [UC/BERKELEY; Stewart, James [SNL

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  12. Numerical modeling of economic uncertainty

    Schjær-Jacobsen, Hans

    2007-01-01

Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons are made between alternative modeling methods, and characteristics of the methods are discussed.
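A minimal sketch of triple-estimate propagation through a discounted cash flow: because NPV is monotone increasing in each cash flow, the pessimistic, most-likely and optimistic components of each triple can be discounted separately to give a triple estimate of the NPV. All figures are illustrative, not from the paper:

```python
# Triple (pessimistic, most likely, optimistic) estimates for yearly
# cash flows; numbers are purely illustrative.
rate = 0.08
cash_flows = [  # year t: (low, likely, high)
    (-110.0, -100.0, -95.0),   # year 0: investment
    (20.0, 30.0, 40.0),        # year 1
    (25.0, 35.0, 45.0),        # year 2
    (30.0, 40.0, 55.0),        # year 3
]

def npv(flows):
    """Net present value of a list of yearly cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

# NPV is monotone in each cash flow, so endpoints map to endpoints.
low, likely, high = (npv([c[i] for c in cash_flows]) for i in range(3))
print(round(low, 2), round(likely, 2), round(high, 2))
```

The resulting (low, likely, high) NPV triple conveys the spread of outcomes without assuming any probability distribution, which is the appeal of interval and fuzzy representations over purely stochastic ones.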

  13. Uncertainty covariances in robotics applications

    Smith, D.L.

    1984-01-01

    The application of uncertainty covariance matrices in the analysis of robot trajectory errors is explored. First, relevant statistical concepts are reviewed briefly. Then, a simple, hypothetical robot model is considered to illustrate methods for error propagation and performance test data evaluation. The importance of including error correlations is emphasized
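The error-propagation method the record refers to can be sketched with first-order covariance propagation, C_p ≈ J C_q J^T, for a hypothetical planar two-link arm; link lengths, angles and covariance values below are illustrative, not from the report:

```python
import numpy as np

# Forward kinematics of a planar two-link arm: joint angles -> end
# effector position. Joint-angle uncertainty (covariance C_q) maps to
# positional uncertainty via the Jacobian: C_p ~= J @ C_q @ J.T.
l1, l2 = 0.5, 0.3          # link lengths (m), illustrative
q = np.array([0.4, 0.9])   # joint angles (rad), illustrative

def jacobian(q):
    """Partial derivatives of end-effector (x, y) w.r.t. joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

# Correlated joint-angle errors (rad^2); the off-diagonal term is why a
# full covariance matrix, not just variances, is needed.
C_q = np.array([[1e-4, 4e-5],
                [4e-5, 1e-4]])
J = jacobian(q)
C_p = J @ C_q @ J.T
print(C_p)
```

The off-diagonal entries of C_p show how correlated joint errors become correlated positional errors, the point the record makes about including error correlations.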

  14. Regulating renewable resources under uncertainty

    Hansen, Lars Gårn

) that a pro-quota result under uncertainty about prices and marginal costs is unlikely, requiring that the resource growth function is highly concave locally around the optimum and, 3) that quotas are always preferred if uncertainty about underlying structural economic parameters dominates. These results showing that quotas are preferred in a number of situations qualify the pro-fee message dominating prior studies.

  15. Uncertainty in the Real World

Resonance – Journal of Science Education, Volume 4, Issue 2. Uncertainty in the Real World - Fuzzy Sets. Satish Kumar. General Article, February 1999, pp. 37-47. Permanent link: https://www.ias.ac.in/article/fulltext/reso/004/02/0037-0047

  16. Uncertainty of dustfall monitoring results

    Martin A. van Nierop

    2017-06-01

Fugitive dust has the ability to cause a nuisance and pollute the ambient environment, particularly from human activities including construction and industrial sites and mining operations. As such, dustfall monitoring has occurred for many decades in South Africa; little has been published on the repeatability, uncertainty, accuracy and precision of dustfall monitoring. Repeatability assesses the consistency associated with the results of a particular measurement under the same conditions; the consistency of the laboratory is assessed to determine the uncertainty associated with dustfall monitoring conducted by the laboratory. The aim of this study was to improve the understanding of the uncertainty in dustfall monitoring, thereby improving the confidence in dustfall monitoring. Uncertainty of dustfall monitoring was assessed through a 12-month study of 12 sites located on the boundary of the study area. Each site contained a directional dustfall sampler, which was modified by removing the rotating lid, with four buckets (A, B, C and D) installed. Having four buckets on one stand allows each bucket to be exposed to the same conditions for the same period of time; the buckets should therefore have equal amounts of dust deposited in them. The difference in the weight (mg) of the dust recorded from each bucket at each respective site was determined using the American Society for Testing and Materials method D1739 (ASTM D1739). The variability of the dust collected provides the confidence level of dustfall monitoring when reporting to clients.

  17. Knowledge Uncertainty and Composed Classifier

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

Vol. 1, No. 2 (2007), pp. 101-105. ISSN 1998-0140. Institutional research plan: CEZ:AV0Z10750506. Keywords: boosting architecture; contextual modelling; composed classifier; knowledge management; knowledge; uncertainty. Subject RIV: IN - Informatics, Computer Science

  18. Uncertainty propagation in nuclear forensics

    Pommé, S.; Jerome, S.M.; Venchiarutti, C.

    2014-01-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent–daughter pairs and the need for more precise half-life data is examined. - Highlights: • Uncertainty propagation formulae for age dating with nuclear chronometers. • Applied to parent–daughter pairs used in nuclear forensics. • Investigated need for better half-life data
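A sketch of the atom-ratio age equation and first-order uncertainty propagation for the 241Pu/241Am parent-daughter pair. The half-lives, their uncertainties and the measured ratio below are illustrative; consult evaluated nuclear data for authoritative values, and note the paper derives exact analytical propagation formulae rather than the numerical derivatives used here:

```python
import numpy as np

ln2 = np.log(2.0)
# Illustrative half-lives (years) and 1-sigma uncertainties.
T_pu, u_T_pu = 14.325, 0.006      # 241Pu (parent)
T_am, u_T_am = 432.6, 0.6         # 241Am (daughter)

def age(R, T_p=T_pu, T_d=T_am):
    """Age (years) from the 241Am/241Pu atom ratio R, assuming complete
    separation of the daughter at t = 0 (Bateman solution inverted)."""
    lp, ld = ln2 / T_p, ln2 / T_d
    return -np.log(1.0 - R * (ld - lp) / lp) / (ld - lp)

R, u_R = 0.95, 0.01               # measured atom ratio (illustrative)
t_hat = age(R)

def contrib(f, x, u, eps=1e-6):
    """Uncertainty contribution via a central-difference derivative."""
    return (f(x + eps) - f(x - eps)) / (2.0 * eps) * u

u_t = np.sqrt(contrib(lambda r: age(r), R, u_R) ** 2
              + contrib(lambda T: age(R, T_p=T), T_pu, u_T_pu) ** 2
              + contrib(lambda T: age(R, T_d=T), T_am, u_T_am) ** 2)
print(t_hat, u_t)
```

With these numbers the ratio measurement dominates the age uncertainty; the paper's point is that for other chronometers the half-life terms can dominate, motivating more precise half-life data.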

  19. WASH-1400: quantifying the uncertainties

    Erdmann, R.C.; Leverenz, F.L. Jr.; Lellouche, G.S.

    1981-01-01

    The purpose of this paper is to focus on the limitations of the WASH-1400 analysis in estimating the risk from light water reactors (LWRs). This assessment attempts to modify the quantification of the uncertainty in and estimate of risk as presented by the RSS (reactor safety study). 8 refs

  20. Forensic considerations when dealing with incinerated human dental remains.

    Reesu, Gowri Vijay; Augustine, Jeyaseelan; Urs, Aadithya B

    2015-01-01

Establishing the human dental identification process relies upon sufficient post-mortem data being recovered to allow for a meaningful comparison with ante-mortem records of the deceased person. Teeth are the most indestructible components of the human body and are structurally unique in their composition. They possess the highest resistance to most environmental effects like fire, desiccation, decomposition and prolonged immersion. In most natural as well as man-made disasters, teeth may provide the only means of positive identification of an otherwise unrecognizable body. It is imperative that dental evidence should not be destroyed through erroneous handling until appropriate radiographs, photographs, or impressions can be fabricated. Proper methods of physical stabilization of incinerated human dental remains should be followed. The maintenance of integrity of extremely fragile structures is crucial to the successful confirmation of identity. In such situations, the forensic dentist must stabilise these teeth before the fragile remains are transported to the mortuary to ensure preservation of possibly vital identification evidence. Thus, while dealing with any incinerated dental remains, a systematic approach must be followed through each stage of evaluation to prevent the loss of potential dental evidence. This paper presents a composite review of various studies on incinerated human dental remains, discusses their impact on the process of human identification, and suggests a step-by-step approach. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  1. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

Treatment of uncertainty, or in other words, reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem, i.e., how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious ways of reasoning in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. These methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information required by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, recognition and evaluation of uncertainties associated with PA, followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem.
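Of the formalisms named above, the Dempster-Shafer theory of evidence admits a particularly compact sketch: two sources assign mass to subsets of a frame of discernment, and Dempster's rule combines them while renormalising away conflicting mass. The frame and mass assignments below are illustrative, not from the paper:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two basic belief assignments given as
    {frozenset: mass} dicts over the same frame of discernment."""
    raw = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass on disjoint hypotheses
    k = 1.0 - conflict                   # renormalisation constant
    return {s: v / k for s, v in raw.items()}, conflict

# Two sources of evidence about a parameter being 'low', 'mid' or 'high'.
A, B, C = frozenset({'low'}), frozenset({'mid'}), frozenset({'high'})
frame = A | B | C
m1 = {A: 0.6, frame: 0.4}        # source 1: mostly 'low', rest unknown
m2 = {A | B: 0.7, frame: 0.3}    # source 2: 'low or mid', rest unknown
m, conflict = combine(m1, m2)
print(m, conflict)
```

Note how each source may leave mass on the whole frame ("don't know"), which is exactly the kind of incomplete numerical information the record says these methods tolerate, unlike a fully specified probability distribution.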

  2. Estimates of bias and uncertainty in recorded external dose

    Fix, J.J.; Gilbert, E.S.; Baumgartner, W.V.

    1994-10-01

    A study is underway to develop an approach to quantify bias and uncertainty in recorded dose estimates for workers at the Hanford Site based on personnel dosimeter results. This paper focuses on selected experimental studies conducted to better define response characteristics of Hanford dosimeters. The study is more extensive than the experimental studies presented in this paper and includes detailed consideration and evaluation of other sources of bias and uncertainty. Hanford worker dose estimates are used in epidemiologic studies of nuclear workers. A major objective of these studies is to provide a direct assessment of the carcinogenic risk of exposure to ionizing radiation at low doses and dose rates. Considerations of bias and uncertainty in the recorded dose estimates are important in the conduct of this work. The method developed for use with Hanford workers can be considered an elaboration of the approach used to quantify bias and uncertainty in estimated doses for personnel exposed to radiation as a result of atmospheric testing of nuclear weapons between 1945 and 1962. This approach was first developed by a National Research Council (NRC) committee examining uncertainty in recorded film badge doses during atmospheric tests (NRC 1989). It involved quantifying both bias and uncertainty from three sources (i.e., laboratory, radiological, and environmental) and then combining them to obtain an overall assessment. Sources of uncertainty have been evaluated for each of three specific Hanford dosimetry systems (i.e., the Hanford two-element film dosimeter, 1944-1956; the Hanford multi-element film dosimeter, 1957-1971; and the Hanford multi-element TLD, 1972-1993) used to estimate personnel dose throughout the history of Hanford operations. Laboratory, radiological, and environmental sources of bias and uncertainty have been estimated based on historical documentation and, for angular response, on selected laboratory measurements
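One simple way to combine bias and uncertainty from the three sources, in the spirit of the NRC (1989) film-badge approach, is to multiply the net bias factors and add independent relative uncertainties in quadrature. This is a sketch only; the report's actual procedure is more detailed, and all numbers here are illustrative:

```python
import math

# Illustrative bias factors (recorded/true) and relative 1-sigma
# uncertainties for the three source categories.
components = {
    'laboratory':    {'bias': 1.10, 'rel_u': 0.15},
    'radiological':  {'bias': 0.95, 'rel_u': 0.20},
    'environmental': {'bias': 1.00, 'rel_u': 0.10},
}

# Net bias: factors multiply. Net uncertainty: quadrature sum.
total_bias = math.prod(c['bias'] for c in components.values())
total_rel_u = math.sqrt(sum(c['rel_u'] ** 2 for c in components.values()))

recorded_dose = 12.0                    # mSv, illustrative
corrected = recorded_dose / total_bias  # remove the net bias
print(round(corrected, 2), round(total_rel_u, 3))
```

The corrected dose and its combined relative uncertainty are the kind of inputs an epidemiologic dose-response analysis would then use in place of the raw recorded value.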

  3. Uncertainty assessment for accelerator-driven systems

    Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.

    1999-01-01

The concept of a subcritical system driven by an external source of neutrons provided by an accelerator, the ADS (Accelerator Driven System), has recently been revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached in adopting for the subcritical component a fast-spectrum, liquid-metal-cooled configuration. Lead-bismuth eutectic, sodium and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity-induced transients. The potentially large subcriticality margin also should allow for the introduction of very significant quantities of waste products (minor actinides and fission products) which negatively impact the safety characteristics of standard cores. In the U.S. these arguments are the basis for the development of the Accelerator Transmutation of Waste (ATW), which has significant potential for reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., degree of subcriticality) are crucial to demonstrating the validity and feasibility of this concept. In this paper we will consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (like Bi, Pb-207, Pb-208, and also minor actinides and fission products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, and this raises concerns about the neutronic performance of those systems.

  4. Structural Uncertainty in Antarctic sea ice simulations

    Schneider, D. P.

    2016-12-01

The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out of sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) have a strong dependence on the mean state of and changes in the Antarctic sea ice cover.
This problem is not unique to

  5. Development of a remaining lifetime management system for NPPS

    Galvan, J.C.; Regano, M.; Hevia Ruperez, F.

    1994-01-01

The interest evinced by Spanish nuclear power plants in acquiring a tool to support remaining lifetime management led to UNESA's 1992 application to OCIDE, and the latter's approval, for financing of a project to develop a Remaining Lifetime Evaluation System for LWR nuclear power plants. This project is currently being developed under UNESA leadership, with the collaboration of three Spanish engineering companies and a research centre. The paper describes its objectives, activities, current status and prospects. The project is defined in two phases: the first consists of the identification and analysis of the main ageing phenomena and their significant parameters, and specification of the Remaining Lifetime Evaluation System (RLES); the second consists of implementation of a pilot application of the RLES to verify its effectiveness. (Author)

  6. Remaining life assessment of a high pressure turbine rotor

    Nguyen, Ninh; Little, Alfie

    2012-01-01

This paper describes finite element and fracture mechanics based modelling work that provides a useful tool for evaluating the remaining life of a high pressure (HP) steam turbine rotor that had experienced thermal fatigue cracking. An axisymmetric model of an HP rotor was constructed. Steam temperature, pressure and rotor speed data from start-ups and shut-downs were used for the thermal and stress analysis. Operating history and inspection records were used to benchmark the damage experienced by the rotor. Fracture mechanics crack growth analysis was carried out to evaluate the remaining life of the rotor under thermal cyclic loading conditions. The work confirmed that the fracture mechanics approach, in conjunction with finite element modelling, provides a useful tool for assessing the remaining life of high temperature components in power plants.
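The fracture-mechanics remaining-life step can be sketched with the Paris crack-growth law, da/dN = C*(dK)^m, integrated from the current crack depth to a critical depth. The material constants, geometry factor, stress range and crack depths below are illustrative assumptions, not the rotor's actual values:

```python
import numpy as np

# Paris-law remaining-cycle estimate for a surface crack under cyclic
# thermal stress. dK = Y * dS * sqrt(pi * a). Illustrative values only.
C, m = 1.5e-11, 3.0        # Paris constants (da in m/cycle, dK in MPa*sqrt(m))
Y = 1.12                   # geometry factor, assumed constant over growth
dS = 180.0                 # cyclic stress range per start/stop cycle (MPa)
a0, a_crit = 0.002, 0.020  # current and critical crack depths (m)

def remaining_cycles(a_start, a_end, steps=20000):
    """Numerically integrate dN = da / (C * dK**m) from a_start to a_end."""
    a = np.linspace(a_start, a_end, steps)
    dK = Y * dS * np.sqrt(np.pi * a)
    dN_da = 1.0 / (C * dK ** m)            # cycles per metre of crack growth
    da = a[1] - a[0]
    return float(np.sum((dN_da[1:] + dN_da[:-1]) * 0.5) * da)  # trapezoid rule

N = remaining_cycles(a0, a_crit)
print(int(N))   # allowable start/stop cycles before a reaches a_crit
```

Dividing N by the expected number of start/stop cycles per year converts this into a remaining life in years, which is how such an analysis feeds a lifetime management decision.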

  7. On random age and remaining lifetime for populations of items

    Finkelstein, M.; Vaupel, J.

    2015-01-01

We consider items that are incepted into operation already having a random (initial) age and define the corresponding remaining lifetime. We show that these lifetimes are identically distributed when the age distribution is equal to the equilibrium distribution of renewal theory. We then develop the population studies approach to the problem and generalize the setting in terms of stationary and stable populations of items. We obtain new stochastic comparisons for the corresponding population ages and remaining lifetimes that can be useful in applications. Copyright (c) 2014 John Wiley...

  8. Stereo-particle image velocimetry uncertainty quantification

    Bhattacharya, Sayantan; Vlachos, Pavlos P; Charonko, John J

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage.
This stereo PIV uncertainty quantification framework provides the first comprehensive treatment on the subject and potentially lays foundations applicable to volumetric
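    The propagation step can be illustrated with a generic first-order (GUM-style) sketch. The geometry, the reconstruction formula w = (u1 - u2) / (2 tan α), and all numbers below are illustrative assumptions, not the paper's actual equations:

```python
import numpy as np

def w_uncertainty(u1, u2, alpha, sig_u1, sig_u2, sig_alpha):
    """First-order propagation for w = (u1 - u2) / (2 * tan(alpha)).

    u1, u2: planar displacements seen by the two cameras (hypothetical);
    alpha: camera half-angle; sig_*: standard uncertainties of the inputs.
    """
    dw_du = 1.0 / (2.0 * np.tan(alpha))                 # |dw/du1| = |dw/du2|
    dw_dalpha = -(u1 - u2) / (2.0 * np.sin(alpha) ** 2)  # angle sensitivity
    var = (dw_du * sig_u1) ** 2 + (dw_du * sig_u2) ** 2 \
        + (dw_dalpha * sig_alpha) ** 2
    return np.sqrt(var)

# Planar uncertainties dominate here; the angle term only contributes when
# the disparity (u1 - u2) is non-zero, mirroring the sensitivity result above.
u_w = w_uncertainty(1.2, 0.8, np.radians(30.0), 0.05, 0.05, 0.001)
```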

  9. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
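    The flavour of interval statistics can be shown with the mean, whose exact bounds are simply the means of the interval endpoints (a minimal sketch; the report's harder cases, such as tight variance bounds, are not attempted here):

```python
# Each measurement is a [lo, hi] interval; a statistic on the data set then
# becomes an interval of possible values. For the mean the bounds are exact:
# the lowest possible mean takes every lower endpoint, the highest every
# upper endpoint.
def interval_mean(intervals):
    los = [lo for lo, hi in intervals]
    his = [hi for lo, hi in intervals]
    n = len(intervals)
    return sum(los) / n, sum(his) / n

data = [(1.0, 1.2), (0.9, 1.1), (1.1, 1.4)]   # made-up interval measurements
mean_lo, mean_hi = interval_mean(data)
```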

  10. Marketable pollution permits with uncertainty and transaction costs

    Montero, Juan-Pablo

    1998-01-01

    Increasing interest in the use of marketable permits for pollution control has become evident in recent years. Concern regarding their performance still remains because empirical evidence has shown transaction costs and uncertainty to be significant in past and existing marketable permits programs. In this paper we develop theoretical and numerical models that include transaction costs and uncertainty (in trade approval) to show their effects on market performance (i.e., equilibrium price of permits and trading volume) and aggregate control costs. We also show that in the presence of transaction costs and uncertainty the initial allocation of permits may not be neutral in terms of efficiency. Furthermore, using a numerical model for a hypothetical NO x trading program in which participants have discrete control technology choices, we find that aggregate control costs and the equilibrium price of permits are sensitive to the initial allocation of permits, even for constant marginal transaction costs and certainty

  11. The uncertainty processing theory of motivation.

    Anselme, Patrick

    2010-04-02

    Most theories describe motivation using basic terminology (drive, 'wanting', goal, pleasure, etc.) that fails to inform well about the psychological mechanisms controlling its expression. This leads to a conception of motivation as a mere psychological state 'emerging' from neurophysiological substrates. However, the involvement of motivation in a large number of behavioural parameters (triggering, intensity, duration, and directedness) and cognitive abilities (learning, memory, decision, etc.) suggests that it should be viewed as an information processing system. The uncertainty processing theory (UPT) presented here suggests that motivation is the set of cognitive processes allowing organisms to extract information from the environment by reducing uncertainty about the occurrence of psychologically significant events. This processing of information is shown to naturally result in the highlighting of specific stimuli. The UPT attempts to solve three major problems: (i) how motivations can affect behaviour and cognition so widely, (ii) how motivational specificity for objects and events can result from nonspecific neuropharmacological causal factors (such as mesolimbic dopamine), and (iii) how motivational interactions can be conceived in psychological terms, irrespective of their biological correlates. The UPT is in keeping with the conceptual tradition of the incentive salience hypothesis while trying to overcome the shortcomings inherent to this view. Copyright 2009 Elsevier B.V. All rights reserved.

  12. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.]

    1996-09-01

    In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study, using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions are in many cases very similar, e.g. in the predicted evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues have come into focus in this study: there are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
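    As a concrete illustration of the simplest model class in the study, a 2-box leaching-plus-decay model takes only a few lines. The box structure, leaching rate, and time step below are assumptions for illustration, not any participant's actual model:

```python
import numpy as np

def two_box(c1, c2, k, lam, dt, steps):
    """Explicit-Euler 2-box model: box 1 = root zone, box 2 = subsoil.
    k is a first-order leaching rate, lam the radioactive decay constant."""
    for _ in range(steps):
        leach = k * c1 * dt
        c1 += -leach - lam * c1 * dt      # root zone loses by leaching + decay
        c2 += leach - lam * c2 * dt       # subsoil gains leachate, decays
    return c1, c2

# Cs-137: half-life ~30.2 y, so lam = ln(2)/30.2 per year; k is assumed.
lam = np.log(2) / 30.2
c1, c2 = two_box(1.0, 0.0, k=0.05, lam=lam, dt=0.01, steps=1000)  # 10 years
```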

  13. Methodology for Extraction of Remaining Sodium of Used Sodium Containers

    Jung, Minhwan; Kim, Jongman; Cho, Youngil; Jeong, Jiyoung

    2014-01-01

    Sodium, used as a coolant in the SFR (Sodium-cooled Fast Reactor), reacts easily with most elements due to its high reactivity. If sodium at high temperature leaks outside a system boundary and makes contact with oxygen, it starts to burn and toxic aerosols are produced. In addition, it generates flammable hydrogen gas through a reaction with water. Hydrogen gas is explosive in air within the range of about 4-75 vol%. Therefore, sodium should be handled carefully in accordance with standard procedures even when only a small amount of target sodium remains inside the containers and drums used for experiments. After an experiment, all sodium experimental apparatus should be dismantled carefully through a series of draining, residual sodium extraction, and cleaning steps if it is no longer to be reused. In this work, a methodology for the extraction of remaining sodium out of used sodium containers has been developed as one of the sodium facility maintenance activities: a system for the extraction of the remaining sodium of used sodium drums was designed and tested successfully, and an operating procedure for the system has been established. This work will contribute to the establishment of sodium handling technology for the PGSFR (Prototype Gen-IV Sodium-cooled Fast Reactor).

  14. Predicting the Remaining Useful Life of Rolling Element Bearings

    Hooghoudt, Jan Otto; Jantunen, E; Yi, Yang

    2018-01-01

    Condition monitoring of rolling element bearings is of vital importance in order to keep the industrial wheels running. In wind industry this is especially important due to the challenges in practical maintenance. The paper presents an attempt to improve the capability of prediction of remaining...

  15. The experiences of remaining nurse tutors during the transformation ...

    The transformation of public services and education in South Africa is part of the political and socioeconomic transition to democracy. Changes are occurring in every field, including that of the health services. A qualitative study was undertaken to investigate the experiences of the remaining nurse tutors at a school of ...

  16. Remaining childless : Causes and consequences from a life course perspective

    Keizer, R.

    2010-01-01

    Little is known about childless individuals in the Netherlands, although currently one out of every five Dutch individuals remains childless. Who are they? How did they end up being childless? How and to what extent are their life outcomes influenced by their childlessness? By focusing on individual

  17. Molecular genetic identification of skeletal remains of apartheid ...

    The Truth and Reconciliation Commission made significant progress in examining abuses committed during the apartheid era in South Africa. Despite information revealed by the commission, a large number of individuals remained missing when the commission closed its proceedings. This provided the impetus for the ...

  18. Palmar, Patellar, and Pedal Human Remains from Pavlov

    Trinkaus, E.; Wojtal, P.; Wilczyński, J.; Sázelová, Sandra; Svoboda, Jiří

    2017-01-01

    Roč. 2017, June (2017), s. 73-101 ISSN 1545-0031 Institutional support: RVO:68081758 Keywords : Gravettian * human remains * isolated bones * anatomically modern humans * Upper Paleolithic Subject RIV: AC - Archeology, Anthropology, Ethnology OBOR OECD: Archaeology http://paleoanthro.org/media/journal/content/PA20170073.pdf

  19. Robotics to Enable Older Adults to Remain Living at Home

    Alan J. Pearce

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community dwelling older people? Following database searches for relevant literature an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving.

  20. Authentic leadership: becoming and remaining an authentic nurse leader.

    Murphy, Lin G

    2012-11-01

    This article explores how chief nurse executives became and remained authentic leaders. Using narrative inquiry, this qualitative study focused on the life stories of participants. Results demonstrate the importance of reframing, reflection in alignment with values, and the courage needed as nurse leaders progress to authenticity.

  1. Robotics to enable older adults to remain living at home.

    Pearce, Alan J; Adair, Brooke; Miller, Kimberly; Ozanne, Elizabeth; Said, Catherine; Santamaria, Nick; Morris, Meg E

    2012-01-01

    Given the rapidly ageing population, interest is growing in robots to enable older people to remain living at home. We conducted a systematic review and critical evaluation of the scientific literature, from 1990 to the present, on the use of robots in aged care. The key research questions were as follows: (1) what is the range of robotic devices available to enable older people to remain mobile, independent, and safe? and, (2) what is the evidence demonstrating that robotic devices are effective in enabling independent living in community dwelling older people? Following database searches for relevant literature an initial yield of 161 articles was obtained. Titles and abstracts of articles were then reviewed by 2 independent people to determine suitability for inclusion. Forty-two articles met the criteria for question 1. Of these, 4 articles met the criteria for question 2. Results showed that robotics is currently available to assist older healthy people and people with disabilities to remain independent and to monitor their safety and social connectedness. Most studies were conducted in laboratories and hospital clinics. Currently limited evidence demonstrates that robots can be used to enable people to remain living at home, although this is an emerging smart technology that is rapidly evolving.

  2. Dinosaur remains from the type Maastrichtian: An update

    Weishampel, David B.; Mulder, Eric W A; Dortangs, Rudi W.; Jagt, John W M; Jianu, Coralia Maria; Kuypers, Marcel M M; Peeters, Hans H G; Schulp, Anne S.

    1999-01-01

    Isolated cranial and post-cranial remains of hadrosaurid dinosaurs have been collected from various outcrops in the type area of the Maastrichtian stage during the last few years. In the present contribution, dentary and maxillary teeth are recorded from the area for the first time. Post-cranial

  3. Lived Experiences of "Illness Uncertainty" of Iranian Cancer Patients: A Phenomenological Hermeneutic Study.

    Sajjadi, Moosa; Rassouli, Maryam; Abbaszadeh, Abbas; Brant, Jeannine; Majd, Hamid Alavi

    2016-01-01

    For cancer patients, uncertainty is a pervasive experience and a major psychological stressor that affects many aspects of their lives. Uncertainty is a multifaceted concept, and its understanding for patients depends on many factors, including factors associated with various sociocultural contexts. Unfortunately, little is known about the concept of uncertainty in Iranian society and culture. This study aimed to clarify the concept and explain lived experiences of illness uncertainty in Iranian cancer patients. In this hermeneutic phenomenological study, 8 cancer patients participated in semistructured in-depth interviews about their experiences of uncertainty in illness. Interviews continued until data saturation was reached. All interviews were recorded, transcribed, analyzed, and interpreted using 6 stages of the van Manen phenomenological approach. Seven main themes emerged from patients' experiences of illness uncertainty of cancer. Four themes contributed to uncertainty including "Complexity of Cancer," "Confusion About Cancer," "Contradictory Information," and "Unknown Future." Two themes facilitated coping with uncertainty including "Seeking Knowledge" and "Need for Spiritual Peace." One theme, "Knowledge Ambivalence," revealed the struggle between wanting to know and not wanting to know, especially if bad news was delivered. Uncertainty experience for cancer patients in different societies is largely similar. However, some experiences (eg, ambiguity in access to medical resources) seemed unique to Iranian patients. This study provided an outlook of cancer patients' experiences of illness uncertainty in Iran. Cancer patients' coping ability to deal with uncertainty can be improved.

  4. Identifying and Analyzing Uncertainty Structures in the TRMM Microwave Imager Precipitation Product over Tropical Ocean Basins

    Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.

    2016-01-01

    Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50 % of the daily uncertainty variability, with only limited dependence on the regions of interest.

  5. Inherent uncertainties in meteorological parameters for wind turbine design

    Doran, J. C.

    1982-01-01

    Major difficulties associated with meteorological measurements, such as the inability to duplicate the experimental conditions from one day to the next, are discussed. This lack of consistency is compounded by the stochastic nature of many of the meteorological variables of interest. Moreover, simple relationships derived in one location may be significantly altered by topographical or synoptic differences encountered at another. The effect of such factors is a degree of inherent uncertainty if an attempt is made to describe the atmosphere in terms of universal laws. Some of these uncertainties and their causes are examined, examples are presented, and some implications for wind turbine design are suggested.

  6. Applied research in uncertainty modeling and analysis

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  7. Developing scales measuring disorder-specific intolerance of uncertainty (DSIU) : a new perspective on transdiagnostic

    Thibodeau, Michel A; Carleton, R Nicholas; McEvoy, Peter M; Zvolensky, Michael J; Brandt, Charles P; Boelen, Paul A; Mahoney, Alison E J; Deacon, Brett J; Asmundson, Gordon J G

    Intolerance of uncertainty (IU) is a construct of growing prominence in literature on anxiety disorders and major depressive disorder. Existing measures of IU do not define the uncertainty that respondents perceive as distressing. To address this limitation, we developed eight scales measuring

  8. Evaluation of advanced coal gasification combined-cycle systems under uncertainty

    Frey, H.C.; Rubin, E.S.

    1992-01-01

    Advanced integrated gasification combined cycle (IGCC) systems have not been commercially demonstrated, and uncertainties remain regarding their commercial-scale performance and cost. Therefore, a probabilistic evaluation method has been developed and applied to explicitly consider these uncertainties. The insights afforded by this method are illustrated for an IGCC design featuring a fixed-bed gasifier and a hot gas cleanup system. Detailed case studies are conducted to characterize uncertainties in key measures of process performance and cost, evaluate design trade-offs under uncertainty, identify research priorities, evaluate the potential benefits of additional research, compare results for different uncertainty assumptions, and compare the advanced IGCC system to a conventional system under uncertainty. The implications of probabilistic results for research planning and technology selection are discussed in this paper

  9. Uncertainty of the calibration factor

    1995-01-01

    According to present definitions, an error is the difference between a measured value and the ''true'' value. Thus an error has both a numerical value and a sign. In contrast, the uncertainty associated with a measurement is a parameter that characterizes the dispersion of the values ''that could reasonably be attributed to the measurand''. This parameter is normally an estimated standard deviation. An uncertainty, therefore, has no known sign and is usually assumed to be symmetrical. It is a measure of our lack of exact knowledge, after all recognized ''systematic'' effects have been eliminated by applying appropriate corrections. If errors were known exactly, the true value could be determined and there would be no problem left. In reality, errors are estimated in the best possible way and corrections made for them. Therefore, after application of all known corrections, errors need no further consideration (their expectation value being zero) and the only quantities of interest are uncertainties. 3 refs, 2 figs
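    The practical consequence of this definition is that a repeated measurement is reported as a mean together with an estimated standard deviation of that mean. A minimal sketch (the readings are made up):

```python
import statistics

# Five repeated readings of the same quantity, after all known corrections.
readings = [10.02, 9.98, 10.05, 9.99, 10.01]

mean = statistics.fmean(readings)
# Standard uncertainty of the mean: sample standard deviation / sqrt(n).
u = statistics.stdev(readings) / len(readings) ** 0.5
# Reported as: mean with standard uncertainty u (no sign attached).
```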

  10. Uncertainty in hydrological change modelling

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  11. Visualizing Summary Statistics and Uncertainty

    Potter, K.

    2010-08-12

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.
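    The descriptive statistics a canonical box plot encodes can be computed directly; a small sketch (the data and the 1.5 x IQR fence convention are illustrative, not taken from the paper):

```python
import statistics

def box_stats(data):
    """Quartiles, median, and 1.5*IQR outlier fences of a canonical box plot."""
    q1, med, q3 = statistics.quantiles(data, n=4)   # exclusive method (default)
    iqr = q3 - q1
    lo_fence, hi_fence = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    outliers = [x for x in data if x < lo_fence or x > hi_fence]
    return med, q1, q3, outliers

data = [2, 4, 4, 5, 6, 7, 8, 9, 50]
med, q1, q3, outliers = box_stats(data)   # 50 falls outside the upper fence
```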

  12. Visualizing Summary Statistics and Uncertainty

    Potter, K.; Kniss, J.; Riesenfeld, R.; Johnson, C.R.

    2010-01-01

    The graphical depiction of uncertainty information is emerging as a problem of great importance. Scientific data sets are not considered complete without indications of error, accuracy, or levels of confidence. The visual portrayal of this information is a challenging task. This work takes inspiration from graphical data analysis to create visual representations that show not only the data value, but also important characteristics of the data including uncertainty. The canonical box plot is reexamined and a new hybrid summary plot is presented that incorporates a collection of descriptive statistics to highlight salient features of the data. Additionally, we present an extension of the summary plot to two dimensional distributions. Finally, a use-case of these new plots is presented, demonstrating their ability to present high-level overviews as well as detailed insight into the salient features of the underlying data distribution. © 2010 The Eurographics Association and Blackwell Publishing Ltd.

  13. Statistical uncertainties and unrecognized relationships

    Rankin, J.P.

    1985-01-01

    Hidden relationships in specific designs directly contribute to inaccuracies in reliability assessments. Uncertainty factors at the system level may sometimes be applied in attempts to compensate for the impact of such unrecognized relationships. Often uncertainty bands are used to relegate unknowns to a miscellaneous category of low-probability occurrences. However, experience and modern analytical methods indicate that perhaps the dominant, most probable and significant events are sometimes overlooked in statistical reliability assurances. The author discusses the utility of two unique methods of identifying the otherwise often unforeseeable system interdependencies for statistical evaluations. These methods are sneak circuit analysis and a checklist form of common cause failure analysis. Unless these techniques (or a suitable equivalent) are also employed along with the more widely-known assurance tools, high reliability of complex systems may not be adequately assured. This concern is indicated by specific illustrations. 8 references, 5 figures

  14. The uncertainty budget in pharmaceutical industry

    Heydorn, Kaj

    ... of their uncertainty, exactly as described in GUM [2]. Pharmaceutical industry has therefore over the last 5 years shown increasing interest in accreditation according to ISO 17025 [3], and today uncertainty budgets are being developed for all so-called critical measurements. The uncertainty of results obtained...... that the uncertainty of a particular result is independent of the method used for its estimation. Several examples of uncertainty budgets for critical parameters based on the bottom-up procedure will be discussed, and it will be shown how the top-down method is used as a means of verifying uncertainty budgets, based...

  15. Improvement of uncertainty relations for mixed states

    Park, Yong Moon

    2005-01-01

    We study a possible improvement of uncertainty relations. The Heisenberg uncertainty relation employs the commutator of a pair of conjugate observables to set the limit of quantum measurement of the observables. The Schroedinger uncertainty relation improves the Heisenberg uncertainty relation by adding the correlation in terms of the anti-commutator. However, both relations are insensitive to whether the state used is pure or mixed. We improve the uncertainty relations by introducing additional terms which measure the mixedness of the state. For the momentum and position operators as conjugate observables, and for the thermal state of the quantum harmonic oscillator, it turns out that the equalities in the improved uncertainty relations hold.
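    The standard Robertson-Schroedinger bound that such work starts from can be checked numerically for a mixed qubit state (a generic illustration; the paper's improved, mixedness-dependent bound is strictly stronger):

```python
import numpy as np

# Robertson-Schroedinger: Var(A)Var(B) >= |<{A,B}>/2 - <A><B>|^2 + |<[A,B]>/(2i)|^2
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])

# A mixed qubit state: convex mixture of |0><0| and |+><+| (an assumed example).
rho = (0.7 * np.array([[1, 0], [0, 0]])
       + 0.3 * np.array([[0.5, 0.5], [0.5, 0.5]])).astype(complex)

def ev(op):
    """Expectation value Tr(rho @ op)."""
    return np.trace(rho @ op).real

var_x = ev(sx @ sx) - ev(sx) ** 2
var_y = ev(sy @ sy) - ev(sy) ** 2
anti = np.trace(rho @ (sx @ sy + sy @ sx)) / 2 - ev(sx) * ev(sy)
comm = np.trace(rho @ (sx @ sy - sy @ sx)) / 2j
lhs, rhs = var_x * var_y, abs(anti) ** 2 + abs(comm) ** 2
# For this mixed state the inequality is strict (lhs > rhs), illustrating the
# slack a mixedness-sensitive bound can recover.
```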

  16. Adjoint-Based Uncertainty Quantification with MCNP

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)]

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
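    The sensitivity-times-covariance ("sandwich") combination underlying adjoint-based uncertainty quantification can be sketched generically. The coefficients below are hypothetical illustrations, not MCNP6 output:

```python
import numpy as np

# Relative sensitivities S of a figure of merit to a few nuclear data
# parameters (hypothetical values), combined with a relative covariance
# matrix C of those parameters via the sandwich rule: var_rel = S C S^T.
S = np.array([0.8, -0.3, 0.1])            # hypothetical sensitivity coefficients
C = np.diag([0.02, 0.05, 0.01]) ** 2      # hypothetical relative std devs squared

rel_var = S @ C @ S
rel_unc = np.sqrt(rel_var)                # relative uncertainty of the figure of merit
```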

  17. Nuclear Physical Uncertainties in Modeling X-Ray Bursts

    Regis, Eric; Amthor, A. Matthew

    2017-09-01

    Type I x-ray bursts occur when a neutron star accretes material from the surface of another star in a compact binary star system. For certain accretion rates and material compositions, much of the nuclear material is burned in short, explosive bursts. Using a one-dimensional stellar model, Kepler, and a comprehensive nuclear reaction rate library, ReacLib, we have simulated chains of type I x-ray bursts. Unfortunately, large uncertainties remain in the nuclear reaction rates involved, since many of the reacting isotopes are unstable and have not yet been studied experimentally. Some individual reactions, when varied within their estimated uncertainty, alter the light curves dramatically. This limits our ability to understand the structure of the neutron star. Previous studies have looked at the effects of individual reaction rate uncertainties. We have applied a Monte Carlo method, simultaneously varying a set of reaction rates, in order to probe the expected uncertainty in x-ray burst behaviour due to the total uncertainty in all nuclear reaction rates. Furthermore, we aim to discover any nonlinear effects due to the coupling between different reaction rates. Early results show clear nonlinear effects. This research was made possible by NSF-DUE Grant 1317446, BUScholars Program.
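    The simultaneous-variation step can be sketched as follows: each rate is multiplied by a factor drawn log-uniformly within its uncertainty factor f (i.e. between rate/f and rate×f), and the burst model is re-run per sampled set. The reaction names and uncertainty factors below are invented for illustration; the actual study uses Kepler with ReacLib rates.

    ```python
    import math
    import random

    def sample_rates(base_rates, unc_factors, rng):
        """Multiply each base rate by a factor drawn log-uniformly in [1/f, f]."""
        return {name: base_rates[name]
                      * math.exp(rng.uniform(-1.0, 1.0) * math.log(unc_factors[name]))
                for name in base_rates}

    # Hypothetical rates (relative to a reference value) and uncertainty factors
    base = {"15O(a,g)19Ne": 1.0, "18Ne(a,p)21Na": 1.0}
    f    = {"15O(a,g)19Ne": 10.0, "18Ne(a,p)21Na": 3.0}

    rng = random.Random(0)                       # fixed seed for reproducibility
    samples = [sample_rates(base, f, rng) for _ in range(1000)]
    # Each sampled set would be fed to the burst model; the spread of the
    # resulting light curves measures the total rate-induced uncertainty,
    # and joint sampling exposes nonlinear coupling between rates.
    ```
    
    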

  18. Conditional Betas and Investor Uncertainty

    Fernando D. Chague

    2013-01-01

    We derive theoretical expressions for market betas from a rational expectation equilibrium model where the representative investor does not observe if the economy is in a recession or an expansion. Market betas in this economy are time-varying and related to investor uncertainty about the state of the economy. The dynamics of betas will also vary across assets according to the assets' cash-flow structure. In a calibration exercise, we show that value and growth firms have cash-flow structures...

  19. Aggregate Uncertainty, Money and Banking

    Hongfei Sun

    2006-01-01

    This paper studies the problem of monitoring the monitor in a model of money and banking with aggregate uncertainty. It shows that when inside money is required as a means of bank loan repayment, a market of inside money is entailed at the repayment stage and generates information-revealing prices that perfectly discipline the bank. The incentive problem of a bank is costlessly overcome simply by involving inside money in repayment. Inside money distinguishes itself from outside money by its ...

  20. Uncertainty analysis for hot channel

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal hydraulic system calculations. In the case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can also be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations). In the hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of the deterministic and uncertainty hot channel calculations are compared with respect to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel (Authors)
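    The one-sided tolerance limit method of Wilks mentioned above fixes the number of code runs needed: N is the smallest integer with 1 − γ^N ≥ β, so that the largest of N sampled outputs bounds the γ-quantile with confidence β. A minimal sketch (the 95%/95% case is the classic criterion in best-estimate-plus-uncertainty analysis):

    ```python
    import math

    def wilks_sample_size(gamma=0.95, beta=0.95):
        """Smallest number of code runs N such that the maximum of N runs
        bounds the gamma-quantile with confidence beta (first-order,
        one-sided Wilks formula): 1 - gamma**N >= beta."""
        return math.ceil(math.log(1.0 - beta) / math.log(gamma))

    n_runs = wilks_sample_size(0.95, 0.95)   # 59 runs for the 95%/95% criterion
    ```

    In the methodology described, each of these runs draws the uncertain hot channel inputs (kx, mass flow, inlet temperature, burnup, gap size) from their respective distributions before evaluating the response surface.
    
    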

  1. Forecast Accuracy Uncertainty and Momentum

    Bing Han; Dong Hong; Mitch Warachka

    2009-01-01

    We demonstrate that stock price momentum and earnings momentum can result from uncertainty surrounding the accuracy of cash flow forecasts. Our model has multiple information sources issuing cash flow forecasts for a stock. The investor combines these forecasts into an aggregate cash flow estimate that has minimal mean-squared forecast error. This aggregate estimate weights each cash flow forecast by the estimated accuracy of its issuer, which is obtained from their past forecast errors. Mome...

  2. Microeconomic Uncertainty and Macroeconomic Indeterminacy

    Fagnart, Jean-François; Pierrard, Olivier; Sneessens, Henri

    2005-01-01

    The paper proposes a stylized intertemporal macroeconomic model wherein the combination of decentralized trading and microeconomic uncertainty (taking the form of privately observed and uninsured idiosyncratic shocks) creates an information problem between agents and generates indeterminacy of the macroeconomic equilibrium. For a given value of the economic fundamentals, the economy admits a continuum of equilibria that can be indexed by the sales expectations of firms at the time of investme...

  3. LOFT differential pressure uncertainty analysis

    Evans, R.P.; Biladeau, G.L.; Quinn, P.A.

    1977-03-01

    A performance analysis of the LOFT differential pressure (ΔP) measurement is presented. Along with complete descriptions of the test programs and theoretical studies that have been conducted on the ΔP measurement, specific sources of measurement uncertainty are identified, quantified, and combined to provide an assessment of the ability of this measurement to satisfy the SDD 1.4.1C (June 1975) requirement for measurement of differential pressure

  4. Knowledge, decision making, and uncertainty

    Fox, J.

    1986-01-01

    Artificial intelligence (AI) systems depend heavily upon the ability to make decisions. Decisions require knowledge, yet there is no knowledge-based theory of decision making. To the extent that AI uses a theory of decision-making it adopts components of the traditional statistical view in which choices are made by maximizing some function of the probabilities of decision options. A knowledge-based scheme for reasoning about uncertainty is proposed, which extends the traditional framework but is compatible with it

  5. Accommodating Uncertainty in Prior Distributions

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.

  6. Indirect Nitrous Oxide Emissions from Major Rivers in the World: Integration of a Process-based Model with Observational Data

    Zhang, B.; Yao, Y.; Xu, R.; Yang, J.; WANG, Z.; Pan, S.; Tian, H.

    2016-12-01

    The atmospheric concentration of nitrous oxide (N2O), one of the major greenhouse gases, has risen to over 121% of the preindustrial level, and most of the increase arises from anthropogenic activities. Previous studies suggested that indirect emissions from global rivers remain a large source of uncertainty among all N2O sources and restrict the assessment of the N2O budget at both regional and global scales. Here, we have integrated a coupled biogeochemical model (DLEM) with observational data to quantify the magnitude and spatio-temporal variation of riverine N2O emission and to attribute the environmental controls of indirect N2O emission from major rivers in the world. Our preliminary results indicate that the magnitude of indirect N2O emission from rivers is closely associated with stream order. Including N2O emissions from headwater streams is essential for reducing uncertainty in the estimation of indirect N2O emission. By implementing a set of factorial simulations, we have further quantified the relative contributions of climate, nitrogen deposition, nitrogen fertilizer use, and manure application to riverine N2O emission. Finally, this study has identified major knowledge gaps and uncertainties associated with model structure, parameters and input data that need to be improved in future research.

  7. Safety provision for nuclear power plants during remaining running time

    Rossnagel, Alexander; Hentschel, Anja

    2012-01-01

    With the phasing-out of the industrial use of nuclear energy for power generation, the risk posed by nuclear power plants has not been eliminated in principle, only limited in time. The remaining nine nuclear power plants must therefore be operated for the remaining ten years in accordance with the state of science and technology. Regulatory authorities must substantiate the safety requirements for each nuclear power plant and enforce these requirements by means of various regulatory measures. The consequences of Fukushima must be included in the assessment of the safety level of nuclear power plants in Germany. In this respect, the regulatory authorities have the important tasks of investigating and assessing the safety risks as well as developing instructions and orders.

  8. Structural remains at the early mediaeval fort at Raibania, Orissa

    Bratati Sen

    2013-11-01

    The fortifications of mediaeval India occupy an eminent position in the history of military architecture. The present paper deals with a preliminary study of the structural remains at the early mediaeval fort at Raibania in the district of Balasore in Orissa. The fort was built of stone kept together very loosely. The three-walled fortification interspersed with two consecutive moats, a feature evidenced at Raibania, is unparalleled in the history of ancient and mediaeval forts and fortifications in India. Several other structures, like the Jay-Chandi Temple Complex, a huge well, numerous tanks and the remains of an ancient bridge, add to the uniqueness of the fort in the entire eastern region.

  9. Mineral remains of early life on Earth? On Mars?

    Iberall, Robbins E.; Iberall, A.S.

    1991-01-01

    The oldest sedimentary rocks on Earth, the 3.8-Ga Isua Iron-Formation in southwestern Greenland, are metamorphosed past the point where organic-walled fossils would remain. Acid residues and thin sections of these rocks reveal ferric microstructures that have filamentous, hollow rod, and spherical shapes not characteristic of crystalline minerals. Instead, they resemble ferric-coated remains of bacteria. Because there are no earlier sedimentary rocks to study on Earth, it may be necessary to expand the search elsewhere in the solar system for clues to any biotic precursors or other types of early life. A study of morphologies of iron oxide minerals collected in the southern highlands during a Mars sample return mission may therefore help to fill in important gaps in the history of Earth's earliest biosphere. -from Authors

  10. Managing project risks and uncertainties

    Mike Mentis

    2015-01-01

    This article considers threats to a project slipping on budget, schedule and fitness-for-purpose. Threat is used here as the collective term for risks (quantifiable bad things that can happen) and uncertainties (poorly quantifiable or unquantifiable bad possible events). Based on experience with projects in developing countries, this review considers that (a) project slippage is due to uncertainties rather than risks; (b) while the eventuation of some bad things is beyond control, managed execution and oversight are still the primary means of keeping within budget, on time and fit-for-purpose; (c) improving project delivery is less about bigger and more complex and more about coordinated focus, effectiveness and developing thought-out heuristics; and (d) projects take longer and cost more partly because threat identification is inaccurate, the scope of identified threats is too narrow, and the threat assessment product is not integrated into overall project decision-making and execution. Almost by definition, what is poorly known is likely to cause problems. Yet it is not just the unquantifiability and intangibility of uncertainties that cause project slippage; it is that they are insufficiently taken into account in project planning and execution, and this causes budget and time overruns. Improving project performance requires purpose-driven and managed deployment of scarce seasoned professionals. This can be aided by independent oversight from deeply experienced panelists who contribute technical insights and can help ensure that diligence is seen to be done.

  11. Chemical model reduction under uncertainty

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
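    The "probability of inclusion" idea from the abstract can be sketched simply: each Monte Carlo sample of the uncertain Arrhenius parameters yields an importance index per reaction, and a reaction's inclusion probability is the fraction of samples in which that index exceeds the chosen threshold. The importance values below are synthetic stand-ins for the CSP analysis output.

    ```python
    import random

    def inclusion_probability(importances, threshold):
        """Fraction of Monte Carlo samples in which a reaction's importance
        index meets or exceeds the user-chosen threshold."""
        return sum(1 for x in importances if x >= threshold) / len(importances)

    rng = random.Random(42)
    # Synthetic importance index of one reaction across 500 samples of the
    # uncertain rate parameters (stand-in for the CSP output).
    importances = [rng.gauss(0.12, 0.05) for _ in range(500)]
    p_incl = inclusion_probability(importances, threshold=0.10)
    # Reactions with p_incl near 1 are retained in the simplified
    # mechanism; those near 0 are dropped; intermediate values flag
    # reactions whose retention is sensitive to the rate uncertainty.
    ```
    
    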

  12. USING CONDITION MONITORING TO PREDICT REMAINING LIFE OF ELECTRIC CABLES

    LOFARO, R.; SOO, P.; VILLARAN, M.; GROVE, E.

    2001-01-01

    Electric cables are passive components used extensively throughout nuclear power stations to perform numerous safety and non-safety functions. It is known that the polymers commonly used to insulate the conductors on these cables can degrade with time; the rate of degradation being dependent on the severity of the conditions in which the cables operate. Cables do not receive routine maintenance and, since it can be very costly, they are not replaced on a regular basis. Therefore, to ensure their continued functional performance, it would be beneficial if condition monitoring techniques could be used to estimate the remaining useful life of these components. A great deal of research has been performed on various condition monitoring techniques for use on electric cables. In a research program sponsored by the U.S. Nuclear Regulatory Commission, several promising techniques were evaluated and found to provide trendable information on the condition of low-voltage electric cables. These techniques may be useful for predicting remaining life if well defined limiting values for the aging properties being measured can be determined. However, each technique has advantages and limitations that must be addressed in order to use it effectively, and the necessary limiting values are not always easy to obtain. This paper discusses how condition monitoring measurements can be used to predict the remaining useful life of electric cables. The attributes of an appropriate condition monitoring technique are presented, and the process to be used in estimating the remaining useful life of a cable is discussed along with the difficulties that must be addressed
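    One simple way to turn a trendable condition monitoring measurement into a remaining-life estimate, as discussed above, is to fit the measured aging property against time and extrapolate to a limiting value. The data and the 50% elongation-at-break threshold below are illustrative assumptions, not values from the NRC program.

    ```python
    def remaining_life(times, values, limit):
        """Least-squares straight-line fit to a trendable condition
        indicator, extrapolated to the assumed end-of-life limit; returns
        the time remaining after the last measurement."""
        n = len(times)
        t_mean = sum(times) / n
        v_mean = sum(values) / n
        slope = (sum((t - t_mean) * (v - v_mean) for t, v in zip(times, values))
                 / sum((t - t_mean) ** 2 for t in times))
        intercept = v_mean - slope * t_mean
        t_limit = (limit - intercept) / slope    # time at which the limit is reached
        return t_limit - times[-1]

    # Hypothetical elongation-at-break (%) measured every 5 years, with an
    # assumed 50% absolute elongation end-of-life criterion.
    years = [0, 5, 10, 15]
    elongation = [300, 250, 200, 150]
    rul_years = remaining_life(years, elongation, limit=50)   # 10.0 years
    ```

    As the abstract notes, the hard part in practice is not the extrapolation but establishing a well-defined limiting value for the measured property.
    
    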

  13. Study on remain actinides recovery in pyro reprocessing

    Suharto, Bambang

    1996-01-01

    Spent fuel reprocessing by a dry process called pyro reprocessing has been studied. Most of the U, Pu and MA (minor actinides) from the spent fuel will be recovered and fed back to the reactor as new fuel. The accumulated remaining actinides will be separated by an extraction process with a liquid cadmium solvent. The research was conducted by computer simulation to calculate the number of stages required. The calculation results showed that, with a 20-stage extractor, more than 99% of the actinides can be separated. (author)

  15. Neanderthal infant and adult infracranial remains from Marillac (Charente, France).

    Dolores Garralda, María; Maureille, Bruno; Vandermeersch, Bernard

    2014-09-01

    At the site of Marillac, near the Ligonne River in Marillac-le-Franc (Charente, France), a remarkable stratigraphic sequence has yielded a wealth of archaeological information, palaeoenvironmental data, as well as faunal and human remains. Marillac must have been a sinkhole used by Neanderthal groups as a hunting camp during MIS 4 (TL date 57,600 ± 4,600 BP), where Quina Mousterian lithics and fragmented bones of reindeer predominate. This article describes three infracranial skeleton fragments. Two of them are from adults and consist of the incomplete shafts of a right radius (Marillac 24) and a left fibula (Marillac 26). The third fragment is the diaphysis of the right femur of an immature individual (Marillac 25), the size and shape of which resemble those from Teshik-Tash and could be assigned to a child of a similar age. The three fossils have been compared with the remains of other Neanderthals or anatomically Modern Humans (AMH). Furthermore, the comparison of the infantile femora, Marillac 25 and Teshik-Tash, with the remains of several European children from the early Middle Ages clearly demonstrates the robustness and rounded shape of both Neanderthal diaphyses. Evidence of peri-mortem manipulation has been identified on all three bones, with spiral fractures, percussion pits and, in the case of the radius and femur, unquestionable cutmarks made with flint implements, probably during defleshing. Traces of periostosis appear on the fibula fragment and on the immature femoral diaphysis, although their aetiology remains unknown.

  16. Prognostic modelling options for remaining useful life estimation by industry

    Sikorska, J. Z.; Hodkiewicz, M.; Ma, L.

    2011-07-01

    Over recent years a significant amount of research has been undertaken to develop prognostic models that can be used to predict the remaining useful life of engineering assets. Implementations by industry have only had limited success. By design, models are subject to specific assumptions and approximations, some of which are mathematical, while others relate to practical implementation issues such as the amount of data required to validate and verify a proposed model. Therefore, appropriate model selection for successful practical implementation requires not only a mathematical understanding of each model type, but also an appreciation of how a particular business intends to utilise a model and its outputs. This paper discusses business issues that need to be considered when selecting an appropriate modelling approach for trial. It also presents classification tables and process flow diagrams to assist industry and research personnel select appropriate prognostic models for predicting the remaining useful life of engineering assets within their specific business environment. The paper then explores the strengths and weaknesses of the main prognostics model classes to establish what makes them better suited to certain applications than to others and summarises how each have been applied to engineering prognostics. Consequently, this paper should provide a starting point for young researchers first considering options for remaining useful life prediction. The models described in this paper are Knowledge-based (expert and fuzzy), Life expectancy (stochastic and statistical), Artificial Neural Networks, and Physical models.

  17. Remaining useful life estimation based on discriminating shapelet extraction

    Malinowski, Simon; Chebel-Morello, Brigitte; Zerhouni, Noureddine

    2015-01-01

    In the Prognostics and Health Management domain, estimating the remaining useful life (RUL) of critical machinery is a challenging task. Various research topics including data acquisition, fusion, diagnostics and prognostics are involved in this domain. This paper presents an approach, based on shapelet extraction, to estimate the RUL of equipment. This approach extracts, in an offline step, discriminative rul-shapelets from a history of run-to-failure data. These rul-shapelets are patterns that are selected for their correlation with the remaining useful life of the equipment. In other words, every selected rul-shapelet conveys its own information about the RUL of the equipment. In an online step, these rul-shapelets are compared to testing units, and the ones that match these units are used to estimate their RULs. Therefore, RUL estimation is based on patterns that have been selected for their high correlation with the RUL. This approach differs from classical similarity-based approaches that attempt to match complete testing units (or only late instants of testing units) with training ones to estimate the RUL. The performance of our approach is evaluated in a case study on the remaining useful life estimation of turbofan engines, and performance is compared with other similarity-based approaches. - Highlights: • A data-driven RUL estimation technique based on pattern extraction is proposed. • Patterns are extracted for their correlation with the RUL. • The proposed method shows good performance compared to other techniques
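    The online matching step described above reduces to a sliding-window distance: a rul-shapelet matches a testing unit if its smallest Euclidean distance to any window of the unit's signal falls below a threshold, in which case the shapelet contributes the RUL value learned for it offline. A minimal sketch with invented toy values (not data from the turbofan case study):

    ```python
    def shapelet_distance(series, shapelet):
        """Smallest Euclidean distance between the shapelet and any
        equal-length sliding window of the series."""
        m = len(shapelet)
        return min(
            sum((series[i + j] - shapelet[j]) ** 2 for j in range(m)) ** 0.5
            for i in range(len(series) - m + 1)
        )

    # Toy degradation signal from a testing unit and one rul-shapelet.
    signal = [0, 1, 2, 5, 9, 5, 2, 1]
    rul_shapelet = [5, 9, 5]
    d = shapelet_distance(signal, rul_shapelet)   # 0.0: exact match at offset 3
    ```
    
    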

  18. Direct dating of Early Upper Palaeolithic human remains from Mladec.

    Wild, Eva M; Teschler-Nicola, Maria; Kutschera, Walter; Steier, Peter; Trinkaus, Erik; Wanek, Wolfgang

    2005-05-19

    The human fossil assemblage from the Mladec Caves in Moravia (Czech Republic) has been considered to derive from a middle or later phase of the Central European Aurignacian period on the basis of archaeological remains (a few stone artefacts and organic items such as bone points, awls, perforated teeth), despite questions of association between the human fossils and the archaeological materials and concerning the chronological implications of the limited archaeological remains. The morphological variability in the human assemblage, the presence of apparently archaic features in some specimens, and the assumed early date of the remains have made this fossil assemblage pivotal in assessments of modern human emergence within Europe. We present here the first successful direct accelerator mass spectrometry radiocarbon dating of five representative human fossils from the site. We selected sample materials from teeth and from one bone for 14C dating. The four tooth samples yielded uncalibrated ages of approximately 31,000 14C years before present, and the bone sample (an ulna) provided an uncertain more-recent age. These data are sufficient to confirm that the Mladec human assemblage is the oldest cranial, dental and postcranial assemblage of early modern humans in Europe and is therefore central to discussions of modern human emergence in the northwestern Old World and the fate of the Neanderthals.

  19. Remaining life diagnosis method and device for nuclear reactor

    Yamamoto, Michiyoshi.

    1996-01-01

    A neutron flux measuring means is inserted from the outside of the reactor pressure vessel during reactor operation to forecast, based on the measured values, the neutron-induced degradation of the materials of in-core structural components in the vicinity of the portions to be measured, and the remaining life of the reactor is diagnosed from the forecast degraded state. In this case, the neutron fluxes to be measured are desirably fast and/or medium neutron fluxes. As the position where the measuring means is to be inserted, for example, the vicinity of the structural components at the periphery of the fuel assembly is selected. The aging degradation characteristics of the structural components are determined using the aging degradation data for the structural materials. The remaining life is analyzed based on the obtained aging degradation characteristics and the stress evaluation data of the in-core structural components at the portions to be measured. The neutron irradiation amount of structural components at predetermined positions can be recognized accurately, and appropriate countermeasures can be taken depending on the forecast remaining life, thereby improving the reliability of the reactor. (N.H.)

  20. Postmortem Scavenging of Human Remains by Domestic Cats

    Ananya Suntirukpong, M.D.

    2017-11-01

    Objective: Crime scene investigators, forensic medicine doctors and pathologists, and forensic anthropologists frequently encounter postmortem scavenging of human remains by household pets. Case presentation: The authors present a case report of a partially skeletonized adult male found dead after more than three months in his apartment in Thailand. The body was in an advanced stage of decomposition with nearly complete skeletonization of the head, neck, hands, and feet. The presence of maggots and necrophagous (flesh-eating) beetles on the body confirmed that insects had consumed much of the soft tissues. Examination of the hand and foot bones revealed canine tooth puncture marks. Evidence of chewing indicated that one or more of the decedent's three house cats had fed on the body after death. Recognizing and identifying carnivore and rodent activity on the soft flesh and bones of human remains is important in interpreting and reconstructing postmortem damage. Thorough analysis may help explain why skeletal elements are missing, damaged, or out of anatomical position. Conclusion: This report presents a multi-disciplinary approach combining forensic anthropology and forensic medicine in examining and interpreting human remains.

  1. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources in different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km²) in Belgium. We use a surface-subsurface integrated model implemented with the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between the surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observed data. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.

  2. Material aging and degradation detection and remaining life assessment for plant life management

    Ramuhalli, P.; Henager, C.H. Jr.; Griffin, J.W.; Meyer, R.M.; Coble, J.B.; Pitman, S.G.; Bond, L.J.

    2012-01-01

    One of the major factors that may impact long-term operations is structural material degradation. Detecting materials degradation, estimating the remaining useful life (RUL) of the component, and determining approaches to mitigating the degradation are important from the perspective of long-term operations. In this study, multiple nondestructive measurement and monitoring methods were evaluated for their ability to assess the material degradation state. Metrics quantifying the level of damage from these measurements were defined and evaluated for their ability to provide estimates of remaining life of the component. An example of estimating the RUL from nondestructive measurements of material degradation condition is provided. (author)

  3. Remaining lifetime modelling for replacement of power transformer populations

    Schijndel, van A.; Wetzer, J.; Wouters, P.A.A.F.

    2008-01-01

    The age of the majority of power transformers applied in the western electricity network varies between 25 and 50 years. Depending on the load history and time in operation, replacement in the short term is imminent. A technically sound policy concerning the replacement of these assets must be based on

  4. Quantifying chemical uncertainties in simulations of the ISM

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions for which improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  5. High cumulants of conserved charges and their statistical uncertainties

    Li-Zhu, Chen; Ye-Yin, Zhao; Xue, Pan; Zhi-Ming, Li; Yuan-Fang, Wu

    2017-10-01

    We study the influence of measured high cumulants of conserved charges on their associated statistical uncertainties in relativistic heavy-ion collisions. With a given number of events, the measured cumulants fluctuate randomly with an approximately normal distribution, while the estimated statistical uncertainties are found to be correlated with the corresponding values of the obtained cumulants. In general, for a given number of events, the larger the measured cumulants, the larger the estimated statistical uncertainties. The error-weighted averaged cumulants are dependent on statistics. Despite this effect, the three-sigma rule of thumb is found to remain applicable when the number of events exceeds one million. Supported by NSFC (11405088, 11521064, 11647093), Major State Basic Research Development Program of China (2014CB845402) and Ministry of Science and Technology (MoST) (2016YFE0104800)
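The coupling between a measured cumulant and its estimated error can be illustrated with a bootstrap: resample the events and take the spread of the recomputed cumulant as its statistical uncertainty. A toy sketch using Gaussian pseudo-events (not real net-charge data, and not the authors' estimator):

```python
import numpy as np

rng = np.random.default_rng(0)

def c4(x):
    """Fourth-order cumulant of a sample: C4 = m4 - 3*m2^2 (central moments)."""
    d = x - x.mean()
    return np.mean(d**4) - 3 * np.mean(d**2) ** 2

# Hypothetical event-by-event observable, 1e5 events
events = rng.normal(0.0, 2.0, size=100_000)

# Bootstrap estimate of the statistical uncertainty of C4:
# resample events with replacement and look at the spread of C4
boot = np.array([c4(rng.choice(events, size=events.size, replace=True))
                 for _ in range(200)])
c4_val = c4(events)        # should be near 0 for Gaussian input
c4_err = boot.std(ddof=1)  # bootstrap statistical uncertainty
```

For Gaussian input the true C4 is zero, so the measured value should be consistent with zero within a few times `c4_err`.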

  6. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push technical requirements beyond the grasp of conventional engineering techniques. Examples are the ultra-high-precision requirements of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of large 2 m assemblies to tolerances in the 10 µm range. The thesis found several gaps in current knowledge limiting such capability. Among them was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurement metrology. Another major limiting factor was the lack of uncertainty statements in the thermal errors compensatio...

  7. Uncertainty and Risk Assessment in the Design Process for Wind

    Damiani, Rick R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2018-02-09

    This report summarizes the concepts and opinions that emerged from an initial study on the subject of uncertainty in wind design, which included expert elicitation during a workshop held at the National Wind Technology Center at the National Renewable Energy Laboratory, July 12-13, 2016. In this paper, five major categories of uncertainty are identified. The first category is associated with direct impacts on turbine loads (i.e., the inflow including extreme events, aero-hydro-servo-elastic response, soil-structure interaction, and load extrapolation). The second category encompasses material behavior and strength. Site suitability and due-diligence aspects pertain to the third category. Calibration of partial safety factors and optimal reliability levels makes up the fourth. The fifth category is associated with uncertainties in computational modeling. The main sections of this paper follow this organization.

  8. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss-of-mission and loss-of-crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design based on well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as uncertainty-importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete, since changes in sensitivity to uncertainty are not reflected when uncertainty is propagated with Monte Carlo methods. This paper provides a basis for the uncertainty underestimation in complex systems and especially, owing to nuances of launch vehicle logic, in launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an uncertainty-importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  9. Measurement uncertainties for vacuum standards at Korea Research Institute of Standards and Science

    Hong, S. S.; Shin, Y. H.; Chung, K. H.

    2006-01-01

    The Korea Research Institute of Standards and Science has three major vacuum systems: an ultrasonic interferometer manometer (UIM) (Sec. II, Figs. 1 and 2) for low vacuum, a static expansion system (SES) (Sec. III, Figs. 3 and 4) for medium vacuum, and an orifice-type dynamic expansion system (DES) (Sec. IV, Figs. 5 and 6) for high and ultrahigh vacuum. For each system, explicit measurement model equations with multiple variables are given. Following ISO guidelines, all of these system variables' errors were used to calculate the expanded uncertainty (U). For each system the expanded uncertainties (k=2, confidence level=95%) and relative expanded uncertainties (expanded uncertainty/generated pressure) are summarized in Table IV and are estimated as follows. For the UIM, at 2.5-300 Pa generated pressure, the expanded uncertainty is of order 10⁻² Pa and the relative expanded uncertainty of order 10⁻²; at 1-100 kPa generated pressure, the relative expanded uncertainty is of order 10⁻⁵. For the SES, at 3-100 Pa generated pressure, the expanded uncertainty is of order 10⁻¹ Pa and the relative expanded uncertainty of order 10⁻³. For the DES, at 4.6×10⁻³-1.3×10⁻² Pa generated pressure, the expanded uncertainty is of order 10⁻⁴ Pa and the relative expanded uncertainty of order 10⁻³; at 3.0×10⁻⁶-9.0×10⁻⁴ Pa generated pressure, the expanded uncertainty is of order 10⁻⁶ Pa and the relative expanded uncertainty of order 10⁻². Within uncertainty limits, our bilateral and key comparisons [CCM.P-K4 (10 Pa-1 kPa)] are extensive and in good agreement with those of other nations (Fig. 8 and Table V)
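The expanded uncertainty U is obtained by combining the individual standard-uncertainty contributions in quadrature and multiplying by a coverage factor k. A generic GUM-style sketch with hypothetical contribution values (not the paper's actual model equations or budgets):

```python
import math

def expanded_uncertainty(contribs, k=2.0):
    """Combine independent standard-uncertainty contributions in quadrature
    (GUM law of propagation for uncorrelated inputs) and expand with
    coverage factor k (k=2 corresponds to ~95% for a normal distribution)."""
    u_c = math.sqrt(sum((c * u) ** 2 for c, u in contribs))
    return k * u_c

# Hypothetical (sensitivity coefficient, standard uncertainty) pairs, in Pa,
# for a generated pressure:
contribs = [(1.0, 0.010),   # e.g. interferometer length term
            (0.5, 0.004),   # e.g. temperature term
            (1.0, 0.003)]   # e.g. repeatability term
U = expanded_uncertainty(contribs)  # expanded uncertainty, Pa
```

Dividing U by the generated pressure then gives the relative expanded uncertainty quoted above.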

  10. Uncertainty estimation in nuclear power plant probabilistic safety assessment

    Guarro, S.B.; Cummings, G.E.

    1989-01-01

    Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state of the art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as they impact various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top-level risk estimates were developed in the early PRAs. The Seismic Safety Margins Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but also by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently, NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena that determine the behavior of the molten core and of the reactor containment structures.

  11. Validation uncertainty of MATRA code for subchannel void distributions

    Hwang, Dae-Hyun; Kim, S. J.; Kwon, H.; Seo, K. W. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To extend code capability to whole-core subchannel analysis, pre-conditioned Krylov matrix solvers such as BiCGSTAB and GMRES are implemented in the MATRA code, as well as parallel computing algorithms using MPI and OpenMP. The code is written in Fortran 90 and has some user-friendly features such as a graphical user interface. MATRA was approved by the Korean regulatory body for design calculations of the integral-type PWR named SMART. The major role of a subchannel code is to evaluate core thermal margin through hot channel analysis and uncertainty evaluation for CHF predictions. In addition, it is potentially used for best estimation of the core thermal-hydraulic field by incorporation into multiphysics and/or multi-scale code systems. In this study we examined a validation process for the subchannel code MATRA, specifically in the prediction of subchannel void distributions. The primary objective of validation is to estimate a range within which the simulation modeling error lies. The experimental data for subchannel void distributions at steady-state and transient conditions were provided in the framework of the OECD/NEA UAM benchmark program. The validation uncertainty of the MATRA code was evaluated for a specific experimental condition by comparing the simulation result and experimental data. A validation process should be preceded by code and solution verification; however, quantification of verification uncertainty was not addressed in this study. The validation uncertainty of the MATRA code for predicting subchannel void distribution was evaluated for a single data point of void fraction measurement in a 5x5 PWR test bundle in the framework of the OECD UAM benchmark program. The validation standard uncertainties were evaluated as 4.2%, 3.9%, and 2.8% with the Monte-Carlo approach at the axial levels of 2216 mm, 2669 mm, and 3177 mm, respectively. The sensitivity coefficient approach gave similar uncertainties but did not account for the nonlinear effects on the

  12. Uncertainty analysis of the FRAP code

    Peck, S.O.

    1978-01-01

    A user-oriented, automated uncertainty analysis capability has been built into the FRAP code (Fuel Rod Analysis Program) and applied to a PWR fuel rod undergoing a LOCA. The method of uncertainty analysis is the Response Surface Method (RSM). (author)
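The Response Surface Method replaces the expensive code with a cheap fitted surrogate and then propagates input uncertainty through the surrogate by Monte Carlo. A minimal sketch with a hypothetical stand-in for the code (not FRAP itself, and a one-input polynomial surface rather than a full multi-variable design):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an expensive code run; the surrogate only sees (x, y) pairs.
def code_run(x):
    return 300.0 + 40.0 * x + 5.0 * x**2

# 1. Run the code at a small design of experiments
x_design = np.linspace(-2, 2, 9)
y_design = np.array([code_run(x) for x in x_design])

# 2. Fit a quadratic response surface to the runs
surface = np.poly1d(np.polyfit(x_design, y_design, 2))

# 3. Propagate input uncertainty cheaply through the surrogate
x_samples = rng.normal(0.0, 0.5, size=50_000)
y_samples = surface(x_samples)
mean, std = y_samples.mean(), y_samples.std()
```

The key design choice is that step 3 costs nothing compared with rerunning the code 50,000 times; the surrogate's fitting error must of course be checked against additional code runs.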

  13. Two multi-dimensional uncertainty relations

    Skala, L; Kapsa, V

    2008-01-01

    Two multi-dimensional uncertainty relations, one related to the probability density and the other one related to the probability density current, are derived and discussed. Both relations are stronger than the usual uncertainty relations for the coordinates and momentum

  14. Change and uncertainty in quantum systems

    Franson, J.D.

    1996-01-01

    A simple inequality shows that any change in the expectation value of an observable quantity must be associated with some degree of uncertainty. This inequality is often more restrictive than the Heisenberg uncertainty principle. copyright 1996 The American Physical Society

  15. Measure of uncertainty in regional grade variability

    Tutmez, B.; Kaymak, U.; Melin, P.; Castillo, O.; Gomez Ramirez, E.; Kacprzyk, J.; Pedrycz, W.

    2007-01-01

    Because geological events are neither homogeneous nor isotropic, geological investigations are characterized by particularly high uncertainties. This paper presents a hybrid methodology for measuring uncertainty in regional grade variability. In order to evaluate the fuzziness in grade

  16. Climate change and global crop yield: impacts, uncertainties and adaptation

    Deryng, Delphine

    2014-01-01

    As global mean temperature continues to rise steadily, agricultural systems are projected to face unprecedented challenges to cope with climate change. However, understanding of climate change impacts on global crop yield, and of farmers’ adaptive capacity, remains incomplete as previous global assessments: (1) inadequately evaluated the role of extreme weather events; (2) focused on a small subset of the full range of climate change predictions; (3) overlooked uncertainties related to the ch...

  17. Fuel cycle cost uncertainty from nuclear fuel cycle comparison

    Li, J.; McNelis, D.; Yim, M.S.

    2013-01-01

    This paper examined the uncertainty in fuel cycle cost (FCC) calculation by considering both model and parameter uncertainty. Four fuel cycle options were compared in the analysis: the once-through cycle (OT), the DUPIC cycle, the MOX cycle, and a closed fuel cycle with fast reactors (FR). The model uncertainty was addressed by using three different FCC modeling approaches, with and without consideration of the time value of money. The relative ratios of FCC in comparison to OT did not change much across modeling approaches. This observation was consistent with the results of the sensitivity study for the discount rate. Two different sets of data with uncertainty ranges for unit costs were used to address the parameter uncertainty of the FCC calculation. The sensitivity study showed that the dominant contributor to the total variance of FCC is the uranium price. In general, the FCC of OT was found to be the lowest, followed by FR, MOX, and DUPIC; but depending on the uranium price, the FR cycle was found to have a lower FCC than OT. The reprocessing cost was also found to have a major impact on FCC.
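The "with and without time value of money" comparison amounts to levelizing discounted costs over discounted energy. A hedged sketch with hypothetical, front-loaded cash flows (not the paper's unit-cost data or its three specific modeling approaches):

```python
def levelized_cost(costs, energy, rate):
    """Levelized fuel cycle cost: discounted costs divided by discounted
    energy. With rate=0 this reduces to total cost / total energy,
    i.e. the no-time-value-of-money case."""
    num = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
    den = sum(e / (1 + rate) ** t for t, e in enumerate(energy))
    return num / den

# Hypothetical cycle: a large up-front cost, then level yearly costs (M$),
# with constant yearly energy output (GWh)
costs = [200.0] + [30.0] * 9
energy = [800.0] * 10
fcc_no_tvm = levelized_cost(costs, energy, 0.00)
fcc_tvm = levelized_cost(costs, energy, 0.05)
```

With front-loaded costs, discounting raises the levelized cost relative to the undiscounted ratio; for cash flows proportional to energy in every year, the two coincide, which is one way the relative FCC ratios can be insensitive to the discount rate.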

  18. Role of uncertainty in the basalt waste isolation project

    Knepp, A.J.; Dahlem, D.H.

    1989-01-01

    The current national Civilian Radioactive Waste Management (CRWM) Program to select a mined geologic repository will likely require the extensive use of probabilistic techniques to quantify uncertainty in predictions of repository isolation performance. The performance of nonhomogeneous geologic, hydrologic, and chemical systems must be predicted over time frames of thousands of years and therefore will likely contain significant uncertainty. A qualitative assessment of our limited ability to interrogate the site in a nondestructive manner, coupled with the early stage of development of the pertinent geosciences, supports this statement. The success of the approach to incorporate what currently appears to be an appreciable element of uncertainty into the predictions of repository performance will play an important role in acquiring a license to operate and in establishing the level of safety associated with the concept of long-term geologic storage of nuclear waste. This paper presents a brief background on the Hanford Site and the repository program, references the sources that establish the legislative requirement to quantify uncertainties in performance predictions, and summarizes the present and future program at the Hanford Site in this area. The decision to quantify significant sources of uncertainty has had a major impact on the direction of the site characterization program at Hanford. The paper concludes with a number of observations on the impacts of this decision.

  19. Generally Recognized as Safe: Uncertainty Surrounding E-Cigarette Flavoring Safety

    Clara G. Sears

    2017-10-01

    Despite scientific uncertainty regarding the relative safety of inhaling e-cigarette aerosol and flavorings, some consumers regard the U.S. Food and Drug Administration’s “generally recognized as safe” (GRAS) designation as evidence of flavoring safety. In this study, we assessed how college students’ perceptions of e-cigarette flavoring safety are related to understanding of the GRAS designation. During spring 2017, an online questionnaire was administered to college students. Chi-square p-values and multivariable logistic regression were employed to compare perceptions among participants considering e-cigarette flavorings as safe and those considering e-cigarette flavorings to be unsafe. The total sample size was 567 participants. Only 22% knew that GRAS designation meant that a product is safe to ingest, not inhale, inject, or use topically. Of participants who considered flavorings to be GRAS, the majority recognized that the designation meant a product is safe to ingest but also considered it safe to inhale. Although scientific uncertainty on the overall safety of flavorings in e-cigarettes remains, health messaging can educate the public about the GRAS designation and its irrelevance to e-cigarette safety.

  20. Future Simulated Intensification of Precipitation Extremes, CMIP5 Model Uncertainties and Dependencies

    Bador, M.; Donat, M.; Geoffroy, O.; Alexander, L. V.

    2017-12-01

    Precipitation intensity during extreme events is expected to increase with climate change. Throughout the 21st century, CMIP5 climate models project a general increase in annual extreme precipitation in most regions. We investigate how robust this future increase is across different models, regions and seasons. We find that there is strong similarity in extreme precipitation changes between models that share atmospheric physics, reducing the ensemble of 27 models to 14 independent projections. We find that future simulated extreme precipitation increases in most models in the majority of land grid cells located in the dry, intermediate and wet regions according to each model's precipitation climatology. These increases significantly exceed the range of natural variability estimated from long equilibrium control runs. The intensification of extreme precipitation across the entire spectrum of dry to wet regions is particularly robust in the extra-tropics in both wet and dry season, whereas uncertainties are larger in the tropics. The CMIP5 ensemble therefore indicates robust future intensification of annual extreme rainfall in particular in extra-tropical regions. Generally, the CMIP5 robustness is higher during the dry season compared to the wet season and the annual scale, but inter-model uncertainties in the tropics remain important.

  1. Spontaneous recovery of locomotion induced by remaining fibers after spinal cord transection in adult rats.

    You, Si-Wei; Chen, Bing-Yao; Liu, Hui-Ling; Lang, Bing; Xia, Jie-Lai; Jiao, Xi-Ying; Ju, Gong

    2003-01-01

    A major issue in the analysis of experimental results after spinal cord injury is spontaneous functional recovery induced by remaining nerve fibers. The authors investigated the relationship between the degree of locomotor recovery and the percentage and location of fibers spared by spinal cord transection. The spinal cords of 12 adult rats were transected at T9 with a razor blade, which often resulted in sparing of nerve fibers in the ventral spinal cord. The incompletely transected animals were used to study the degree of spontaneous recovery of hindlimb locomotion, evaluated with the BBB rating scale, in correlation with the extent and location of the remaining fibers. Incomplete transection was found in the ventral spinal cord in 42% of the animals. The degree of locomotor recovery was highly correlated with the percentage of remaining fibers in the ventral and ventrolateral funiculi. In one of the rats, 4.82% of remaining fibers in the unilateral ventrolateral funiculus were able to sustain a certain recovery of locomotion. Less than 5% of remaining ventrolateral white matter is thus sufficient for an unequivocal motor recovery after incomplete spinal cord injury. Therefore, for studies with spinal cord transection, the completeness of sectioning should be carefully checked before any conclusion can be reached. The fact that the degree of locomotor recovery is correlated with the percentage of remaining fibers in the ventrolateral spinal cord, exclusive of most of the descending motor tracts, may imply an essential role of propriospinal connections in the initiation of spontaneous locomotor recovery.

  2. Uncertainty and its propagation in dynamics models

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics of uncertainty as it arises in dynamic models, and therefore of the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a "subdynamics" in which the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision.

  3. Some illustrative examples of model uncertainty

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion.

  4. The Uncertainty Multiplier and Business Cycles

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  5. Yellow Fever Remains a Potential Threat to Public Health.

    Vasconcelos, Pedro F C; Monath, Thomas P

    2016-08-01

    Yellow fever (YF) remains a serious public health threat in endemic countries. The recent re-emergence in Africa, beginning in Angola and spreading to the Democratic Republic of the Congo and Uganda, with imported cases in China and Kenya, is of concern. There is such a shortage of YF vaccine in the world that the World Health Organization has proposed the use of reduced (1/5) doses during emergencies. In this short communication, we discuss these and other problems, including the risk of spread of YF to areas free of YF for decades or never before affected by this arbovirus disease.

  6. The Artificial Leaf: Recent Progress and Remaining Challenges

    Mark D Symes

    2016-12-01

    The prospect of a device that uses solar energy to split water into H2 and O2 is highly attractive in terms of producing hydrogen as a carbon-neutral fuel. In this mini review, key research milestones that have been reached in this field over the last two decades will be discussed, with special focus on devices that use earth-abundant materials. Finally, the remaining challenges in the development of such “artificial leaves” will be highlighted.

  7. Leprosy: ancient disease remains a public health problem nowadays.

    Noriega, Leandro Fonseca; Chiacchio, Nilton Di; Noriega, Angélica Fonseca; Pereira, Gilmayara Alves Abreu Maciel; Vieira, Marina Lino

    2016-01-01

    Despite being an ancient disease, leprosy remains a public health problem in several countries -particularly in India, Brazil and Indonesia. The current operational guidelines emphasize the evaluation of disability from the time of diagnosis and stipulate as fundamental principles for disease control: early detection and proper treatment. Continued efforts are needed to establish and improve quality leprosy services. A qualified primary care network that is integrated into specialized service and the development of educational activities are part of the arsenal in the fight against the disease, considered neglected and stigmatizing.

  8. Studies on protozoa in ancient remains - A Review

    Liesbeth Frías

    2013-02-01

    Paleoparasitological research has made important contributions to the understanding of parasite evolution and ecology. Although parasitic protozoa exhibit a worldwide distribution, recovering these organisms from an archaeological context is still exceptional and relies on the availability and distribution of evidence, the ecology of infectious diseases and adequate detection techniques. Here, we present a review of the findings related to protozoa in ancient remains, with an emphasis on their geographical distribution in the past and the methodologies used for their retrieval. The development of more sensitive detection methods has increased the number of identified parasitic species, promising interesting insights from research in the future.

  9. Encephalitozoon cuniculi in Raw Cow's Milk Remains Infectious After Pasteurization.

    Kváč, Martin; Tomanová, Vendula; Samková, Eva; Koubová, Jana; Kotková, Michaela; Hlásková, Lenka; McEvoy, John; Sak, Bohumil

    2016-02-01

    This study describes the prevalence of Encephalitozoon cuniculi in raw cow's milk and evaluates the effect of different milk pasteurization treatments on E. cuniculi infectivity for severe combined immunodeficient (SCID) mice. Using a nested polymerase chain reaction approach, 1 of 50 milking cows was found to repeatedly shed E. cuniculi in its feces and milk. Under experimental conditions, E. cuniculi spores in milk remained infective for SCID mice following pasteurization treatments at 72 °C for 15 s or 85 °C for 5 s. Based on these findings, pasteurized cow's milk should be considered a potential source of E. cuniculi infection in humans.

  10. "Recent" macrofossil remains from the Lomonosov Ridge, central Arctic Ocean

    Le Duc, Cynthia; de Vernal, Anne; Archambault, Philippe; Brice, Camille; Roberge, Philippe

    2016-04-01

    The examination of surface sediment samples collected from 17 sites along the Lomonosov Ridge, at water depths ranging from 737 to 3339 meters, during Polarstern Expedition PS87 in 2014 (Stein, 2015) indicates a rich biogenic content almost exclusively dominated by calcareous remains. Amongst biogenic remains, microfossils (planktic and benthic foraminifers, pteropods, ostracods, etc.) dominate, but millimetric to centimetric macrofossils occurred frequently at the surface of the sediment. The macrofossil remains consist of a large variety of taxa, including gastropods, bivalves, polychaete tubes, scaphopods, echinoderm plates and spines, and fish otoliths. Among the bivalves, the most abundant taxa are Portlandia arctica, Hyalopecten frigidus, Cuspidaria glacilis, Policordia densicostata, Bathyarca spp., and Yoldiella spp. Whereas a few specimens are well preserved and apparently pristine, most mollusk shells displayed extensive alteration features. Moreover, most shells were covered by millimeter-scale tubes of the serpulid polychaete Spirorbis sp., suggesting transport from the low intertidal or subtidal zone. Both the ecological affinity and the known geographic distribution of the bivalves named above support the hypothesis of transport rather than local development. In addition to mollusk shells, more than a hundred fish otoliths were recovered in the surface sediments. The otoliths mostly belong to the Gadidae family. Most of them are well preserved and without serpulid tubes attached to their surface, suggesting a local/regional origin, unlike the shell remains. Although recovered at the surface, the macrofaunal assemblages of the Lomonosov Ridge do not necessarily represent the "modern" environments, as they may result from reworking, and their occurrence at the surface of the sediment may also be due to winnowing of finer particles. Although the shells were not dated, we suspect that their actual ages may range from modern to several thousands of years.

  11. Assessment of major nuclear technologies with decision and risk analysis

    Winterfeldt, D. von

    1995-01-01

    Selecting technologies for major nuclear programs involves several complexities, including multiple stakeholders, multiple conflicting objectives, uncertainties, and risk. In addition, the programmatic risks related to the schedule, cost, and performance of these technologies often become major issues in the selection process. This paper describes a decision analysis approach for addressing these complexities in a logical manner

  12. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, fracture toughness (K_Ic) at various temperatures is necessary. In a best-estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, as a matter of practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is uncertainty that is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. The distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', a single value of the parameters), but the totality of aleatory uncertainties is carried through the calculation as available. In this paper a description of an approach to account for these two types of uncertainties associated with K_Ic is provided. (authors)
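The "snapshot" sampling described above is often implemented as a two-loop Monte Carlo: the outer loop samples the epistemic parameters of the model, and the inner loop carries the aleatory scatter for that snapshot. A sketch with hypothetical toughness numbers (the actual K_Ic distributions and parameters are application-specific and not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)

n_outer, n_inner = 200, 1000
fifth_percentiles = []
for _ in range(n_outer):
    # Epistemic (outer loop): imperfectly known mean and spread of K_Ic,
    # hypothetical values in MPa*sqrt(m)
    mu = rng.normal(150.0, 10.0)
    sigma = rng.uniform(15.0, 25.0)
    # Aleatory (inner loop): material scatter about this model snapshot
    k_ic = rng.normal(mu, sigma, size=n_inner)
    fifth_percentiles.append(np.percentile(k_ic, 5))

# Epistemic band on the aleatory 5th percentile of toughness
band = np.percentile(fifth_percentiles, [5, 95])
```

Keeping the two loops separate is what allows the result to be reported as a distribution over distributions, rather than a single mixed uncertainty.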

  13. Uncertainty in prediction and in inference

    Hilgevoord, J.; Uffink, J.

    1991-01-01

    The concepts of uncertainty in prediction and inference are introduced and illustrated using the diffraction of light as an example. The close relationship between the concepts of uncertainty in inference and resolving power is noted. A general quantitative measure of uncertainty in

  14. Entropic uncertainty relations-a survey

    Wehner, Stephanie; Winter, Andreas

    2010-01-01

    Uncertainty relations play a central role in quantum mechanics. Entropic uncertainty relations in particular have gained significant importance within quantum information, providing the foundation for the security of many quantum cryptographic protocols. Yet, little is known about entropic uncertainty relations with more than two measurement settings. In the present survey, we review known results and open questions.

  15. Flood modelling : Parameterisation and inflow uncertainty

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve

  16. Determination of Remaining Useful Life of Gas Turbine Blade

    Meor Said Mior Azman

    2016-01-01

    The aim of this research is to determine the remaining useful life of gas turbine blades, using service-exposed turbine blades. This task is performed using Stress Rupture Tests (SRT) under accelerated test conditions, where the stress applied to the specimen is between 400 MPa and 600 MPa and the test temperature is 850°C. The study focuses on the creep behaviour of blades with 52,000 hours of service exposure, complemented with creep-rupture modelling using JMatPro software and microstructure examination using an optical microscope. The test specimens, made of the Ni-based superalloy of the first-stage turbine blades, are machined according to International Standard (ISO 24). The SRT results are analyzed using two main equations: the Larson-Miller Parameter and the Life Fraction Rule. Based on the remaining useful life analysis, the 52,000 h service-exposed blade is fit to operate for another 4751 h to 18362 h. The microstructure examination shows traces of carbide precipitation, occurring during the creep process, that deteriorate the grain boundaries. Creep-rupture life modelling using JMatPro software shows good agreement with the accelerated creep rupture tests, with minimal error.
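
    The two equations named above can be sketched as follows; the Larson-Miller constant, LMP value, and temperature are hypothetical placeholders, not the paper's fitted values:

```python
import math

C = 20.0  # Larson-Miller constant, a typical assumed value

def rupture_time_hours(lmp, temp_k):
    """Invert LMP = T * (C + log10(t_r)) for the rupture time t_r (hours)."""
    return 10.0 ** (lmp / temp_k - C)

def remaining_life_hours(lmp, temp_k, service_hours):
    """Life Fraction Rule: life consumed = t_service / t_rupture, so the
    remaining life is the rupture life minus the service hours."""
    return max(rupture_time_hours(lmp, temp_k) - service_hours, 0.0)

# Hypothetical numbers for illustration only (not the paper's data):
lmp = 28_000.0          # assumed LMP from an accelerated SRT fit
temp_k = 850 + 273.15   # assumed temperature in kelvin
print(f"predicted rupture life: {rupture_time_hours(lmp, temp_k):.0f} h")
print(f"remaining after 52000 h: "
      f"{remaining_life_hours(lmp, temp_k, 52_000):.0f} h")
```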

  17. A method for defleshing human remains using household bleach.

    Mann, Robert W; Berryman, Hugh E

    2012-03-01

    Medical examiners and forensic anthropologists are often faced with the difficult task of removing soft tissue from the human skeleton without damaging the bones, teeth and, in some cases, cartilage. While there are a number of acceptable methods that can be used to remove soft tissue, including macerating in water, simmering or boiling, soaking in ammonia, removing with scissors, knife, scalpel or stiff brush, and dermestid beetles, each has its drawbacks in time, safety, or potential to damage bone. Using the chest plate of a stabbing victim, this technical report presents a safe and effective alternative method for removing soft tissue from human remains, in particular the chest plate, following autopsy, without damaging or separating the ribs, sternum, and costal cartilage. This method can be used to reveal subtle blunt force trauma to bone, slicing and stabbing injuries, and other forms of trauma obscured by overlying soft tissue. Despite published cautionary notes, when done properly household bleach (3-6% sodium hypochlorite) is a quick, safe, and effective means of examining cartilage and exposing skeletal trauma by removing soft tissue from human skeletal remains. © 2011 American Academy of Forensic Sciences. Published 2011. This article is a U.S. Government work and is in the public domain in the U.S.A.

  18. Duplex Alu Screening for Degraded DNA of Skeletal Human Remains

    Fabian Haß

    2017-10-01

    The human-specific Alu elements, belonging to the class of Short INterspersed Elements (SINEs), have been shown to be a powerful tool for population genetic studies. An earlier study in this department showed that it was possible to analyze Alu presence/absence in 3000-year-old skeletal human remains from the Bronze Age Lichtenstein cave in Lower Saxony, Germany. We developed duplex Alu screening PCRs with flanking primers for two Alu elements, each combined with a single internal Alu primer. By adding an internal primer, the approximately 400–500 bp presence signals of Alu elements can be detected within a range of less than 200 bp. Thus, our PCR approach is suited for highly fragmented ancient DNA samples, whereas NGS analyses frequently are unable to handle repetitive elements. With this analysis system, we examined remains of 12 individuals from the Lichtenstein cave with different degrees of DNA degradation. The duplex PCRs showed fully informative amplification results for all of the chosen Alu loci in eight of the 12 samples. Our analysis system showed that Alu presence/absence analysis is possible in samples with different degrees of DNA degradation and it reduces the amount of valuable skeletal material needed by a factor of four, as compared with a singleplex approach.

  19. Major Sport Venues

    Department of Homeland Security — The Major Public Venues dataset is composed of facilities that host events for the National Association for Stock Car Auto Racing, Indy Racing League, Major League...

  20. Major Depression Among Adults

    Major depression is one of the most ...

  1. Failure probability under parameter uncertainty.

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
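
    The effect claimed above, that plug-in parameter estimates inflate the realized failure frequency, can be reproduced in a small Monte Carlo sketch. The setup and all numbers are invented for illustration, not the article's derivation: losses are lognormal with unknown parameters, and the threshold is set at the fitted 99th percentile.

```python
import math
import random

random.seed(1)

MU, SIGMA = 0.0, 1.0        # true log-scale parameters (unknown in practice)
N_DATA = 30                 # sample size available for estimation
Z_99 = 2.3263478740408408   # standard normal 99th-percentile quantile
TRIALS = 4000

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

total = 0.0
for _ in range(TRIALS):
    logs = [random.gauss(MU, SIGMA) for _ in range(N_DATA)]
    m = sum(logs) / N_DATA
    s = math.sqrt(sum((x - m) ** 2 for x in logs) / (N_DATA - 1))
    threshold = m + Z_99 * s              # fitted 99th percentile (log scale)
    # True probability that a future loss exceeds the chosen threshold:
    total += 1.0 - phi((threshold - MU) / SIGMA)

print(f"expected failure frequency: {total / TRIALS:.4f} (nominal 0.0100)")
```

    The averaged exceedance frequency comes out noticeably above the nominal 1%, which is the inflation the abstract describes.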

  2. Quantum Uncertainty and Fundamental Interactions

    Tosto S.

    2013-04-01

    The paper proposes a simplified theoretical approach to inferring some essential concepts about the fundamental interactions between charged particles, and their relative strengths at comparable energies, by exploiting quantum uncertainty alone. The worth of the present approach lies in the way of obtaining the results rather than in the results themselves: concepts today acknowledged as fingerprints of the electroweak and strong interactions appear rooted in the same theoretical frame, which also includes the basic principles of special and general relativity along with the gravity force.

  3. Uncertainty analysis in seismic tomography

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field from seismic travel-time tomography depends on several factors, such as regularization, inversion path, and model parameterization. The result also strongly depends on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric; this effect shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code, with data from geo-engineering and industrial-scale investigations collected by our team from IG PAS.

  4. Modelling of Transport Projects Uncertainties

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating … to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario-based graphs which function as risk-related decision support for the appraised transport infrastructure project.

  5. Medical Need, Equality, and Uncertainty.

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality. © 2016 John Wiley & Sons Ltd.

  6. On use of radial evanescence remain term in kinematic hardening

    Geyer, P.

    1995-10-01

    A fine modelling of the material behaviour can be necessary to study the mechanical strength of nuclear power plant components under cyclic loads. Ratchetting is one of the last phenomena for which numerical models need to be improved. In this paper we discuss the use of a radial evanescence remain term in kinematic hardening to improve the description of ratchetting in biaxial loading tests. It is well known that the Chaboche elastoplastic model with two non-linear kinematic hardening variables, initially proposed by Armstrong and Frederick, usually over-predicts the accumulation of ratchetting strain. Burlet and Cailletaud proposed in 1987 a non-linear kinematic rule with a radial evanescence remain term. The two models lead to identical formulations for proportional loadings. In the case of a biaxial loading test (primary + secondary loading), the Burlet and Cailletaud model leads to accommodation, while the Chaboche model leads to ratchetting with a constant strain increment; so we can obtain an under-estimate with the first model and an over-estimate with the second. An easy way to improve the description of ratchetting is to combine the two kinematic rules; such an idea is already used by Delobelle in his model. With analytical results in the case of tension-torsion tests, we show in the first part of the paper the interest of the radial evanescence remain term in the non-linear kinematic rule for describing ratchetting: we give the conditions to obtain adaptation, accommodation or ratchetting, and the value of the strain increment in the last case. In the second part of the paper, we propose to modify the Chaboche elastoplastic model by coupling the two types of hardening by means of two scalar parameters which can be identified independently on biaxial loading tests. Identifying these two parameters amounts to speculating on the directions of strain in order to adjust the ratchetting to experimental observations. We use experimental results on the austenitic steel 316L at room temperature.

  7. Highly efficient DNA extraction method from skeletal remains

    Irena Zupanič Pajnič

    2011-03-01

    Background: This paper describes in detail the DNA extraction method developed to acquire high-quality DNA from Second World War skeletal remains. The same method is also used for molecular genetic identification of unknown decomposed bodies in routine forensic casework where only bones and teeth are suitable for DNA typing. We analysed 109 bones and two teeth from WWII mass graves in Slovenia. Methods: We cleaned the bones and teeth, removed surface contaminants and ground the bones into powder using liquid nitrogen. Prior to isolating the DNA in parallel using the BioRobot EZ1 (Qiagen), the powder was decalcified for three days. The nuclear DNA of the samples was quantified by a real-time PCR method. We acquired autosomal genetic profiles and Y-chromosome haplotypes of the bones and teeth with PCR amplification of microsatellites, and mtDNA haplotypes. For the purpose of traceability in the event of contamination, we prepared elimination databases including the nuclear and mtDNA genetic profiles of all persons who had been in touch with the skeletal remains in any way. Results: We extracted up to 55 ng DNA/g from the teeth, up to 100 ng DNA/g from the femurs, up to 30 ng DNA/g from the tibias and up to 0.5 ng DNA/g from the humeri. The typing of autosomal and Y-STR loci was successful in all of the teeth, in 98 % of the femurs, and in 75 % to 81 % of the tibias and humeri. The typing of mtDNA was successful in all of the teeth, and in 96 % to 98 % of the bones. Conclusions: We managed to obtain nuclear DNA suitable for STR typing from skeletal remains over 60 years old. The method of DNA extraction described here has proved to be highly efficient: we obtained 0.8 to 100 ng DNA/g of teeth or bones, and complete autosomal genetic profiles, Y-STR haplotypes, and mtDNA haplotypes from only 0.5 g bone and teeth samples.

  8. TMI in perspective: reactor containment stands up, difficult decisions remain

    Corey, G.R.

    1979-01-01

    Commonwealth Edison Co. is increasing its commitment to nuclear energy after reviewing the performance of the Three Mile Island reactor containment systems. Both the reactor vessel and the secondary containment remained intact, and no radiation was reported in the soil or water. The public discussion of energy options which followed the accident will benefit both the public and the technical community even if there is a temporary slowdown in nuclear power development. The realities of energy supplies have become evident; i.e., that nuclear and coal are the only available options for the short term. The discussion should also lead to better personnel training, regulatory reforms, risk-sharing insurance, and international standards. The public hysteria triggered by the accident stemmed partly from the combination of unfortunate incidents and the media coverage, which led to hasty conclusions.

  9. Oldest Directly Dated Remains of Sheep in China

    Dodson, John; Dodson, Eoin; Banati, Richard; Li, Xiaoqiang; Atahan, Pia; Hu, Songmei; Middleton, Ryan J.; Zhou, Xinying; Nan, Sun

    2014-11-01

    The origins of domesticated sheep (Ovis sp.) in China remain unknown. Previous workers have speculated that sheep may have been present in China up to 7000 years ago, but many claims are based on associations with archaeological material rather than independent dates on sheep material. Here we present 7 radiocarbon dates on sheep bone from Inner Mongolia, Ningxia and Shaanxi provinces. DNA analysis on one of the bones confirms it is Ovis sp. The oldest ages are about 4700 to 4400 BCE and are thus the oldest objectively dated Ovis material in eastern Asia. The graphitised bone collagen had δ13C values indicating some millet was represented in the diet. This probably indicates the sheep were in a domestic setting where millet was grown. The younger samples had δ13C values indicating that even more millet was in the diet, which was likely related to changes in foddering practices.

  10. On use of radial evanescence remain term in kinematic hardening

    Geyer, P.

    1995-01-01

    This paper presents the interest of a non-linear kinematic hardening rule with a radial evanescence remain term, as proposed for modelling multiaxial ratchetting. From analytical calculations in the case of the tension/torsion test, this ratchetting is compared with that of the rule proposed by Armstrong and Frederick. A modification is then proposed for Chaboche's elastoplastic model with two non-linear kinematic variables, coupling the two types of hardening by means of two scalar parameters. Identifying these two parameters amounts to speculating on the directions of strain in order to adjust the ratchetting to experimental observations. Using biaxial ratchetting tests on 316L stainless steel specimens at ambient temperature, it is shown that satisfactory modelling of multiaxial ratchetting is obtained. (author). 4 refs., 5 figs

  11. Psychotherapy for Borderline Personality Disorder: Progress and Remaining Challenges.

    Links, Paul S; Shah, Ravi; Eynan, Rahel

    2017-03-01

    The main purpose of this review was to critically evaluate the literature on psychotherapies for borderline personality disorder (BPD) published over the past 5 years, to identify progress and remaining challenges, and to determine priority areas for future research. A systematic review of the literature over the last 5 years was undertaken. The review yielded 184 relevant abstracts, and after applying inclusion criteria, 16 articles were fully reviewed based on the articles' implications for future research and/or clinical practice. Our review indicated that patients across a range of severities benefited from psychotherapy; more intensive therapies were not significantly superior to less intensive therapies; enhancing emotion regulation processes and fostering a more coherent self-identity were important mechanisms of change; therapies had been extended to patients with BPD and posttraumatic stress disorder; and more research needed to be directed at functional outcomes.

  12. [Alcohol and work: remaining sober and return to work].

    Vittadini, G; Bandirali, M

    2007-01-01

    One of the most complex alcohol-related problems is job loss and the subsequent attempts to return to a professional activity. To better understand the issue, an epidemiological investigation was carried out on a group of 162 alcoholics hospitalised in a specialised clinic. The outcome shows the importance of remaining sober for keeping, or being returned to, one's job. Unfortunately, the local resources at hand, first of all joining a mutual-help group, are still too little known and thus clearly underemployed. Therefore, an informative action within companies is highly desirable. Alcoholics suffering from serious illnesses, especially mental ones, represent a different issue; for these people a greater involvement of public authorities in creating protected job openings is desirable.

  13. Differential Decomposition Among Pig, Rabbit, and Human Remains.

    Dautartas, Angela; Kenyhercz, Michael W; Vidoli, Giovanna M; Meadows Jantz, Lee; Mundorff, Amy; Steadman, Dawnie Wolfe

    2018-03-30

    While nonhuman animal remains are often utilized in forensic research to develop methods to estimate the postmortem interval, systematic studies that directly validate animals as proxies for human decomposition are lacking. The current project compared decomposition rates among pigs, rabbits, and humans at the University of Tennessee's Anthropology Research Facility across three seasonal trials that spanned nearly 2 years. The Total Body Score (TBS) method was applied to quantify decomposition changes and calculate the postmortem interval (PMI) in accumulated degree days (ADD). Decomposition trajectories were analyzed by comparing the estimated and actual ADD for each seasonal trial and by fuzzy cluster analysis. The cluster analysis demonstrated that the rabbits formed one group while pigs and humans, although more similar to each other than either to rabbits, still showed important differences in decomposition patterns. The decomposition trends show that neither nonhuman model captured the pattern, rate, and variability of human decomposition. © 2018 American Academy of Forensic Sciences.
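
    The ADD value used with the TBS method is simply a thermal sum of daily mean temperatures above a base. A minimal sketch with hypothetical temperatures (the base of 0 °C is a common convention, not necessarily this study's choice):

```python
def accumulated_degree_days(daily_mean_temps_c, base_c=0.0):
    """Sum daily mean temperatures above a base temperature to obtain
    ADD, the thermal clock paired with Total Body Score regressions."""
    return sum(max(t - base_c, 0.0) for t in daily_mean_temps_c)

# Ten hypothetical days of mean temperatures (°C); days below the base
# contribute nothing to the sum.
temps = [12.0, 15.5, 9.0, -2.0, 4.5, 18.0, 21.0, 16.5, 11.0, 7.5]
print(accumulated_degree_days(temps))  # 115.0
```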

  14. The economic implications of carbon cycle uncertainty

    Smith, Steven J.; Edmonds, James A.

    2006-01-01

    This paper examines the implications of uncertainty in the carbon cycle for the cost of stabilizing carbon dioxide concentrations. Using a state-of-the-art integrated assessment model, we find that uncertainty in our understanding of the carbon cycle has significant implications for the costs of a climate stabilization policy, with cost differences denominated in trillions of dollars. Uncertainty in the carbon cycle is equivalent to a change in concentration target of up to 100 ppmv. The impact of carbon cycle uncertainties is smaller than that of climate sensitivity, and broadly comparable to the effect of uncertainty in technology availability.

  15. Uncertainty budget for k0-NAA

    Robouch, P.; Arana, G.; Eguskiza, M.; Etxebarria, N.

    2000-01-01

    The concepts of the Guide to the Expression of Uncertainty in Measurement (GUM) for chemical measurements and the recommendations of the Eurachem document 'Quantifying Uncertainty in Analytical Methods' are applied to set up the uncertainty budget for k0-NAA. The 'universally applicable spreadsheet technique' described by Kragten is applied to the basic k0-NAA equations for the computation of uncertainties. The variance components - the individual standard uncertainties - highlight the contribution and importance of the different parameters to be taken into account. (author)
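
    The Kragten technique approximates each variance component by perturbing one input at a time by its standard uncertainty. A generic sketch follows; the toy model and all numbers merely stand in for the actual k0-NAA equations and are not from the paper:

```python
def kragten_budget(f, x, u):
    """Kragten spreadsheet technique: approximate each contribution to the
    uncertainty of y = f(x) by a one-at-a-time perturbation of the inputs
    by their standard uncertainties. Returns (y, combined standard
    uncertainty, per-input contributions c_i * u_i)."""
    y = f(x)
    components = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += u[i]
        components.append(f(xp) - y)  # finite-difference sensitivity * u_i
    u_c = sum(c ** 2 for c in components) ** 0.5
    return y, u_c, components

# Toy stand-in for a k0-NAA-like equation (illustrative only):
# response proportional to (net peak area) / (sample mass * efficiency).
f = lambda x: x[0] / (x[1] * x[2])
x = [10000.0, 0.100, 0.95]   # net peak area, sample mass (g), efficiency
u = [100.0, 0.001, 0.02]     # assumed standard uncertainties

y, u_c, comps = kragten_budget(f, x, u)
print(f"y = {y:.1f}, combined u_c = {u_c:.1f}")
for name, c in zip(["area", "mass", "efficiency"], comps):
    print(f"  {name:10s} contributes {c:+.1f}")
```

    Listing the individual contributions side by side is exactly what makes the spreadsheet form useful: the dominant parameter is visible at a glance.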

  16. Uncertainty Communication. Issues and good practice

    Kloprogge, P.; Van der Sluijs, J.; Wardekker, A.

    2007-12-01

    In 2003 the Netherlands Environmental Assessment Agency (MNP) published the RIVM/MNP Guidance for Uncertainty Assessment and Communication. The Guidance assists in dealing with uncertainty in environmental assessments. Dealing with uncertainty is essential because assessment results regarding complex environmental issues are of limited value if the uncertainties have not been taken into account adequately. A careful analysis of uncertainties in an environmental assessment is required, but even more important is the effective communication of these uncertainties in the presentation of assessment results. The Guidance yields rich and differentiated insights into uncertainty, but the relevance of this uncertainty information may vary across audiences and uses of assessment results. Therefore, the reporting of uncertainties is one of the six key issues addressed in the Guidance. In practice, users of the Guidance felt a need for more practical assistance in the reporting of uncertainty information. This report explores the issue of uncertainty communication in more detail, and contains more detailed guidance on the communication of uncertainty. In order to make this a 'stand-alone' document, several questions that are mentioned in the detailed Guidance have been repeated here; this document thus has some overlap with the detailed Guidance. Part 1 gives a general introduction to the issue of communicating uncertainty information. It offers guidelines for (fine)tuning the communication to the intended audiences and context of a report, discusses how readers of a report tend to handle uncertainty information, and ends with a list of criteria that uncertainty communication needs to meet to increase its effectiveness. Part 2 helps writers to analyze the context in which communication takes place, and helps to map the audiences and their information needs. It further helps to reflect upon anticipated uses and possible impacts of the uncertainty information on the

  17. Premortal data in the process of skeletal remains identification

    Marinković Nadica

    2012-01-01

    Background/Aim. The basic task of a forensic examiner during the exhumation of mass graves or in mass accidents is to establish the identity of a person. The results obtained through these procedures depend on the extent of postmortem changes and are compared with premortal data obtained from family members of those missing or killed. Experience with exhumations has shown significant differences between the results obtained through exhumation and the premortal data. The aim of the study was to show the existence of differences between premortal data and the results obtained by exhumation for some parameters, as well as to direct premortal data collection towards specific skeletal features. Methods. We performed a comparative analysis of the results of the exhumation of skeletal remains in a mass grave and the premortal data concerning the identified persons. The minimum number of individuals in the mass grave, calculated from the upper parts of the right femur, was 48. A total of 27 persons were identified. Sex was determined by the metrics and morphology of the pelvis. Age at the moment of death was determined from the morphological features of the pubic symphysis, the morphology of the sternal edge of the ribs, and observations of other parts of the skeleton. Height was calculated as the average result of the lengths of the long bones and Rollet coefficients. Results. There was a complete match in terms of sex, and age matched within the interval that could be established based on the skeletal remains. All the other parameters differed, however, which made identification significantly more difficult. Conclusion. Premortal data are an important element of the identification process; they should be obtained by the forensic doctor and directed towards a more detailed examination of the skeletal system.

  18. Reidentification of avian embryonic remains from the Cretaceous of Mongolia.

    Varricchio, David J; Balanoff, Amy M; Norell, Mark A

    2015-01-01

    Embryonic remains within a small (4.75 by 2.23 cm) egg from the Late Cretaceous, Mongolia are here re-described. High-resolution X-ray computed tomography (HRCT) was used to digitally prepare and describe the enclosed embryonic bones. The egg, IGM (Mongolian Institute for Geology, Ulaanbaatar) 100/2010, with a three-part shell microstructure, was originally assigned to Neoceratopsia implying extensive homoplasy among eggshell characters across Dinosauria. Re-examination finds the forelimb significantly longer than the hindlimbs, proportions suggesting an avian identification. Additional, postcranial apomorphies (strut-like coracoid, cranially located humeral condyles, olecranon fossa, slender radius relative to the ulna, trochanteric crest on the femur, and ulna longer than the humerus) identify the embryo as avian. Presence of a dorsal coracoid fossa and a craniocaudally compressed distal humerus with a strongly angled distal margin support a diagnosis of IGM 100/2010 as an enantiornithine. Re-identification eliminates the implied homoplasy of this tri-laminate eggshell structure, and instead associates enantiornithine birds with eggshell microstructure composed of a mammillary, squamatic, and external zones. Posture of the embryo follows that of other theropods with fore- and hindlimbs folded parallel to the vertebral column and the elbow pointing caudally just dorsal to the knees. The size of the egg and embryo of IGM 100/2010 is similar to the two other Mongolian enantiornithine eggs. Well-ossified skeletons, as in this specimen, characterize all known enantiornithine embryos suggesting precocial hatchlings, comparing closely to late stage embryos of modern precocial birds that are both flight- and run-capable upon hatching. Extensive ossification in enantiornithine embryos may contribute to their relatively abundant representation in the fossil record. Neoceratopsian eggs remain unrecognized in the fossil record.

  20. Cost-effective conservation of an endangered frog under uncertainty.

    Rose, Lucy E; Heard, Geoffrey W; Chee, Yung En; Wintle, Brendan A

    2016-04-01

    How should managers choose among conservation options when resources are scarce and there is uncertainty regarding the effectiveness of actions? Well-developed tools exist for prioritizing areas for one-time and binary actions (e.g., protect vs. not protect), but methods for prioritizing incremental or ongoing actions (such as habitat creation and maintenance) remain uncommon. We devised an approach that combines metapopulation viability and cost-effectiveness analyses to select among alternative conservation actions while accounting for uncertainty. In our study, cost-effectiveness is the ratio between the benefit of an action and its economic cost, where benefit is the change in metapopulation viability. We applied the approach to the case of the endangered growling grass frog (Litoria raniformis), which is threatened by urban development. We extended a Bayesian model to predict metapopulation viability under 9 urbanization and management scenarios and incorporated the full probability distribution of possible outcomes for each scenario into the cost-effectiveness analysis. This allowed us to discern between cost-effective alternatives that were robust to uncertainty and those with a relatively high risk of failure. We found a relatively high risk of extinction following urbanization if the only action was reservation of core habitat; habitat creation actions performed better than enhancement actions; and the cost-effectiveness ranking changed depending on the consideration of uncertainty. Our results suggest that creation and maintenance of wetlands dedicated to L. raniformis is the only cost-effective action likely to result in a sufficiently low risk of extinction. To our knowledge, this is the first study to use Bayesian metapopulation viability analysis to explicitly incorporate parametric and demographic uncertainty into a cost-effective evaluation of conservation actions. The approach offers guidance to decision makers aiming to achieve cost
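
    The paper's combination of a cost-effectiveness ratio with explicit outcome uncertainty can be caricatured in a few lines. The actions, costs, and benefit distributions below are invented stand-ins, not the study's estimates; the point is that ranking by expected benefit/cost alone can hide a high risk of falling short.

```python
import random

random.seed(7)

# Each action: (cost, mean benefit, sd of benefit), where "benefit" is an
# uncertain change in metapopulation viability (all values hypothetical).
actions = {
    "reserve core habitat": (1.0, 0.10, 0.08),
    "create wetlands":      (3.0, 0.45, 0.10),
    "enhance wetlands":     (2.0, 0.20, 0.12),
}

N = 10_000
MIN_BENEFIT = 0.15  # assumed minimum acceptable viability gain
results = {}
for name, (cost, mu, sd) in actions.items():
    draws = [random.gauss(mu, sd) for _ in range(N)]
    ce = (sum(draws) / N) / cost                      # expected benefit / cost
    p_fail = sum(d < MIN_BENEFIT for d in draws) / N  # risk of falling short
    results[name] = (ce, p_fail)
    print(f"{name:22s} CE = {ce:.3f}  "
          f"P(benefit < {MIN_BENEFIT}) = {p_fail:.2f}")
```

    Carrying the full distribution of outcomes through the analysis, rather than a point estimate, is what separates a robust cost-effective choice from one with a high risk of failure.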

  1. Chemical kinetic model uncertainty minimization through laminar flame speed measurements

    Park, Okjoo; Veloo, Peter S.; Sheen, David A.; Tao, Yujie; Egolfopoulos, Fokion N.; Wang, Hai

    2016-01-01

    Laminar flame speed measurements were carried out for mixtures of air with eight C3-4 hydrocarbons (propene, propane, 1,3-butadiene, 1-butene, 2-butene, iso-butene, n-butane, and iso-butane) at room temperature and ambient pressure. Along with C1-2 hydrocarbon data reported in a recent study, the entire dataset was used to demonstrate how laminar flame speed data can be utilized to explore and minimize the uncertainties in a reaction model for foundation fuels. The USC Mech II kinetic model was chosen as a case study. The method of uncertainty minimization using polynomial chaos expansions (MUM-PCE) (D.A. Sheen and H. Wang, Combust. Flame 2011, 158, 2358–2374) was employed to constrain the model uncertainty for laminar flame speed predictions. Results demonstrate that a reaction model constrained only by the laminar flame speed values of methane/air flames notably reduces the uncertainty in the predictions of the laminar flame speeds of C3 and C4 alkanes, because the key chemical pathways of all of these flames are similar to each other. The uncertainty in model predictions for flames of unsaturated C3-4 hydrocarbons remains significant without considering fuel-specific laminar flame speeds in the constraining target data set, because the secondary rate-controlling reaction steps are different from those in the saturated alkanes. It is shown that the constraints provided by the laminar flame speeds of the foundation fuels could notably reduce the uncertainties in the predictions of laminar flame speeds of C4 alcohol/air mixtures. Furthermore, it is demonstrated that an accurate prediction of the laminar flame speed of a particular C4 alcohol/air mixture is better achieved through measurements for key molecular intermediates formed during the pyrolysis and oxidation of the parent fuel. PMID:27890938
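
    The constraining step can be illustrated, under strong simplifications, as a Gaussian update of normalized rate parameters against measured targets: predictions are treated as linear response surfaces in the parameters (as MUM-PCE does locally), and the posterior covariance shows how flame-speed data shrink the prior parameter uncertainty. The sensitivities and noise level below are invented for illustration, not taken from the paper.

```python
import numpy as np

# Sketch of constraining kinetic-parameter uncertainty with flame-speed data.
# Prior: normalized parameters x ~ N(0, I), so x = +/-1 spans each rate
# constant's uncertainty bounds. Likelihood: y = J x + noise, with J a
# hypothetical response-surface sensitivity matrix.

rng = np.random.default_rng(0)

n_par, n_obs = 5, 20
J = rng.normal(size=(n_obs, n_par))       # sensitivities of targets to parameters
sigma = 0.1                               # measurement noise std dev (hypothetical)
x_true = rng.normal(size=n_par)           # "true" parameter shifts, for synthetic data
y = J @ x_true + rng.normal(scale=sigma, size=n_obs)

# Standard Gaussian posterior for a linear model with N(0, I) prior:
A = J.T @ J / sigma**2 + np.eye(n_par)    # posterior precision matrix
cov_post = np.linalg.inv(A)
x_post = cov_post @ (J.T @ y) / sigma**2

print("prior variances:    ", np.ones(n_par))
print("posterior variances:", np.round(np.diag(cov_post), 4))
```

    Every posterior variance is below its prior value of 1: the data have constrained all parameter directions to which the targets are sensitive, which is the mechanism behind the reduced flame-speed prediction uncertainty described above.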

  2. Uncertainty Relations and Possible Experience

    Gregg Jaeger

    2016-06-01

    Full Text Available The uncertainty principle can be understood as a condition of joint indeterminacy of classes of properties in quantum theory. The mathematical expressions most closely associated with this principle have been the uncertainty relations, various inequalities exemplified by the well-known expression regarding position and momentum introduced by Heisenberg. Here, recent work involving a new sort of “logical” indeterminacy principle and associated relations introduced by Pitowsky, expressible directly in terms of probabilities of outcomes of measurements of sharp quantum observables, is reviewed and its quantum nature is discussed. These novel relations are derivable from Boolean “conditions of possible experience” of the quantum realm and have been considered both as fundamentally logical and as fundamentally geometrical. This work focuses on the relationship of indeterminacy to the propositions regarding the values of discrete, sharp observables of quantum systems. Here, reasons for favoring each of these two positions are considered. Finally, with an eye toward future research related to indeterminacy relations, further novel approaches grounded in category theory and intended to capture and reconceptualize the complementarity characteristics of quantum propositions are discussed in relation to the former.

  3. Inverse Problems and Uncertainty Quantification

    Litvinenko, Alexander

    2014-01-06

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.
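
    The linear Bayesian update compared at the end of this abstract restricts the conditional-expectation minimizer to affine maps, which yields the familiar Kalman-gain form. A minimal sketch with a hypothetical two-dimensional state and observation operator (not the Lorenz 84 setup):

```python
import numpy as np

# Linear (affine) Bayesian update: x -> x + K (y - H x), with the gain K
# built from the prior covariance. This is the best affine approximation
# to the conditional expectation; all numbers here are illustrative.

n = 2                                   # state dimension
H = np.array([[1.0, 0.0]])              # observe only the first state component
C_prior = np.array([[2.0, 0.5],
                    [0.5, 1.0]])        # prior covariance (hypothetical)
R = np.array([[0.1]])                   # observation-noise covariance

K = C_prior @ H.T @ np.linalg.inv(H @ C_prior @ H.T + R)   # Kalman gain

x_prior = np.array([0.0, 0.0])
y_obs = np.array([1.0])
x_post = x_prior + K @ (y_obs - H @ x_prior)               # updated mean
C_post = (np.eye(n) - K @ H) @ C_prior                     # updated covariance

print("posterior mean:    ", x_post)
print("posterior cov diag:", np.diag(C_post))
```

    Note that even the unobserved second component is updated, through the prior cross-covariance; the quadratic update mentioned in the abstract adds second-order terms in the measurement to the same minimization.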

  4. Uncertainties in the proton lifetime

    Ellis, J.; Nanopoulos, D.V.; Rudaz, S.; Gaillard, M.K.

    1980-04-01

    We discuss the masses of the leptoquark bosons m(X) and the proton lifetime in Grand Unified Theories based principally on SU(5). It is emphasized that estimates of m(X) based on the QCD coupling and the fine structure constant are probably more reliable than those using the experimental value of sin^2(theta_W). Uncertainties in the QCD Λ parameter and the correct value of α are discussed. We estimate higher order effects on the evolution of coupling constants in a momentum space renormalization scheme. It is shown that increasing the number of generations of fermions beyond the minimal three increases m(X) by almost a factor of 2 per generation. Additional uncertainties exist for each generation of technifermions that may exist. We discuss and discount the possibility that proton decay could be 'Cabibbo-rotated' away, and a speculation that Lorentz invariance may be violated in proton decay at a detectable level. We estimate that, in the absence of any substantial new physics beyond that in the minimal SU(5) model, the proton lifetime is 8 x 10^(30±2) years

  5. Inverse Problems and Uncertainty Quantification

    Litvinenko, Alexander; Matthies, Hermann G.

    2014-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ) - the propagation of uncertainty through a computational (forward) model - are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  6. Inverse problems and uncertainty quantification

    Litvinenko, Alexander

    2013-12-18

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. This is especially the case as, together with a functional or spectral approach for the forward UQ, there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and quadratic Bayesian update on the small but taxing example of the chaotic Lorenz 84 model, where we experiment with the influence of different observation or measurement operators on the update.

  7. Needs of the CSAU uncertainty method

    Prosek, A.; Mavko, B.

    2000-01-01

    The use of best estimate codes for safety analysis requires quantification of the uncertainties. These uncertainties are inherently linked to the chosen safety analysis methodology. Worldwide, various methods were proposed for this quantification. The purpose of this paper was to identify the needs of the Code Scaling, Applicability, and Uncertainty (CSAU) methodology and then to address those needs. The specific procedural steps were combined from other methods for uncertainty evaluation, and new tools and procedures were proposed. The uncertainty analysis approach and tools were then utilized for a confirmatory study. The uncertainty was quantified for the RELAP5/MOD3.2 thermal-hydraulic computer code. The results of the adapted CSAU approach to the small-break loss-of-coolant accident (SB LOCA) show that the adapted CSAU can be used for any thermal-hydraulic safety analysis with uncertainty evaluation. However, it was indicated that there are still some limitations in the CSAU approach that need to be resolved. (author)

  8. Decision-Making under Criteria Uncertainty

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty and the types and conditions of uncertainty were examined. The decision-making problem under uncertainty was formalized. A modification of the mathematical decision support method under uncertainty via ontologies was proposed. A distinctive feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies in the area of multilayer board design. This method is oriented toward improving the technical and economic characteristics of the examined domain.

  9. Uncertainty in geological and hydrogeological data

    B. Nilsson

    2007-09-01

    Full Text Available Uncertainty in conceptual model structure and in environmental data is of essential interest when dealing with uncertainty in water resources management. To make quantification of uncertainty possible, it is necessary to identify and characterise the uncertainty in geological and hydrogeological data. This paper discusses a range of available techniques to describe the uncertainty related to geological model structure and scale of support. Literature examples on uncertainty in hydrogeological variables such as saturated hydraulic conductivity, specific yield, specific storage, effective porosity and dispersivity are given. Field data usually have a spatial and temporal scale of support that is different from the one on which numerical models for water resources management operate. Uncertainty in hydrogeological data variables is characterised and assessed within the methodological framework of the HarmoniRiB classification.

  10. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
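
    As a toy version of the observational-uncertainty propagation the authors illustrate, one can perturb a single temperature/salinity profile within assumed measurement uncertainties and propagate to steric height through a linearized equation of state. All profile values, layer structure, and uncertainties below are hypothetical, not the article's data:

```python
import random

# Monte Carlo propagation of per-level T/S measurement uncertainty to steric
# height, using the linearized relation h = sum_z (alpha*dT - beta*dS) * dz.
# Coefficients, anomalies, and sigmas are illustrative only.

random.seed(0)

alpha, beta = 2.0e-4, 7.6e-4        # thermal expansion (1/K), haline contraction (kg/g)
dz = 10.0                           # layer thickness (m)
dT = [0.5, 0.4, 0.3, 0.2, 0.1]      # temperature anomalies per layer (K)
dS = [0.02, 0.01, 0.0, -0.01, 0.0]  # salinity anomalies per layer (g/kg)
sigma_T, sigma_S = 0.01, 0.005      # per-level measurement std devs

def steric(dT, dS):
    return sum((alpha * t - beta * s) * dz for t, s in zip(dT, dS))

N = 20_000
samples = []
for _ in range(N):
    Tp = [t + random.gauss(0, sigma_T) for t in dT]   # perturbed temperature profile
    Sp = [s + random.gauss(0, sigma_S) for s in dS]   # perturbed salinity profile
    samples.append(steric(Tp, Sp))

mean = sum(samples) / N
std = (sum((x - mean) ** 2 for x in samples) / (N - 1)) ** 0.5
print(f"steric height = {mean * 1000:.3f} +/- {std * 1000:.3f} mm")
```

    Because the levels are treated as independent here, the spread adds in quadrature across layers; accounting for error covariances between levels and between profiles is exactly the harder step the article's conclusions call for.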

  11. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling based uncertainty analysis was carried out to quantify uncertainty in predictions of the best estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as a part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both the steady state and transient level by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed for the accumulator injection flow during the reflood phase. Importance analysis was also carried out, and standardized rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure
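
    The record's workflow (Latin hypercube sampling of inputs, 5th-95th percentile bands on an output, and standardized rank regression coefficients for importance ranking) can be sketched on a stand-in model; nothing below reproduces the RELAP5 analysis itself, and the "model" and its coefficients are invented:

```python
import numpy as np

# LHS + percentile band + standardized rank regression coefficients (SRRCs).

rng = np.random.default_rng(7)

def lhs(n, d):
    """Latin hypercube sample on [0, 1]^d: one point per stratum per dimension."""
    u = np.empty((n, d))
    for j in range(d):
        u[:, j] = (rng.permutation(n) + rng.random(n)) / n
    return u

def model(x):
    # hypothetical code response, dominated by input 0 (e.g. a discharge coefficient)
    return 5.0 * x[:, 0] + 1.0 * x[:, 1] + 0.2 * x[:, 2]

X = lhs(200, 3)
y = model(X)
lo, hi = np.percentile(y, [5, 95])
print(f"5th-95th percentile band: [{lo:.2f}, {hi:.2f}]")

def std_ranks(a):
    # rank-transform, then standardize to zero mean and unit variance
    r = np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    return (r - r.mean(axis=0)) / r.std(axis=0)

# SRRC: ordinary least squares of standardized output ranks on input ranks.
srrc, *_ = np.linalg.lstsq(std_ranks(X), std_ranks(y), rcond=None)
print("SRRCs:", np.round(srrc, 2))
```

    The rank transform makes the coefficients robust to monotone nonlinearity, which is why SRRCs are a common importance measure for thermal-hydraulic code outputs.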

  12. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P., E-mail: max.das.ferreira@gmail.com, E-mail: everton@ird.gov.br, E-mail: claudia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even when uncertainties are not stated for the dose measurements informed in the monthly dose report to the monitored radiation facilities, they need to be known. This study aims to analyze the influence of the different sources of uncertainty associated with the photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review was conducted of documents that deal with operational aspects of each technique and the uncertainties associated with each of them. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions were expressed in a qualitative way. The processes of calibration and traceability are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are due to: energy and angular dependence; linearity of response; and variations in film processing. For TL and OSL, the key to good performance is, respectively, the reproducibility of the thermal and optical cycles. For all three techniques, all procedures of the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contribution of the sources of uncertainty. (author)

  13. Long-term health benefits of appetite suppressants remain unproven

    Francisco José Roma Paumgartten

    2011-12-01

    Full Text Available Because of the increasing prevalence of obesity, prevention and treatment of overweight have become a major public health concern. In addition to diet and exercise, drugs are needed for patients who fail to lose weight with behavioral treatment. The current article aims to summarize recent concerns about the safety and efficacy of appetite suppressants. Several appetite suppressants have been banned for safety reasons. In 2010, sibutramine was withdrawn from the market because a long-term study showed it increased the risk of cardiovascular events. So far no study with a sufficiently large sample size has demonstrated that appetite suppressants can reduce the morbidity and mortality associated with overweight. The withdrawal of sibutramine highlights that guidelines for the evaluation of weight control drugs must be more stringent, and studies on their long-term health benefits are needed prior to their marketing.

  14. Future Remains: Industrial Heritage at the Hanford Plutonium Works

    Freer, Brian

    This dissertation argues that U.S. environmental and historic preservation regulations, industrial heritage projects, history, and art only provide partial frameworks for successfully transmitting an informed story into the long range future about nuclear technology and its related environmental legacy. This argument is important because plutonium from nuclear weapons production is toxic to humans in very small amounts, threatens environmental health, has a half-life of 24,110 years and because the industrial heritage project at Hanford is the first time an entire U.S. Department of Energy weapons production site has been designated a U.S. Historic District. This research is situated within anthropological interest in industrial heritage studies, environmental anthropology, applied visual anthropology, as well as wider discourses on nuclear studies. However, none of these disciplines is really designed or intended to be a completely satisfactory frame of reference for addressing this perplexing challenge of documenting and conveying an informed story about nuclear technology and its related environmental legacy into the long range future. Others have thought about this question and have made important contributions toward a potential solution. Examples here include: future generations movements concerning intergenerational equity as evidenced in scholarship, law, and amongst Native American groups; Nez Perce and Confederated Tribes of the Umatilla Indian Reservation responses to the Hanford End State Vision and Hanford's Canyon Disposition Initiative; as well as the findings of organizational scholars on the advantages realized by organizations that have a long term future perspective. While these ideas inform the main line inquiry of this dissertation, the principal approach put forth by the researcher of how to convey an informed story about nuclear technology and waste into the long range future is implementation of the proposed Future Remains clause, as

  15. Spot market activity remains weak as prices continue to fall

    Anon.

    1996-01-01

    A summary of financial data for the uranium spot market in November 1996 is provided. Price ranges for the restricted and unrestricted markets, conversion, and separative work are listed, and total market volume and new contracts are noted. Transactions made are briefly described. Deals made and pending in the spot concentrates, medium and long-term, conversion, and markets are listed for U.S. and non-U.S. buyers. Spot market activity increased in November with just over 1.0 million lbs of U3O8 equivalent being transacted compared to October's total of 530,000 lbs of U3O8 equivalent. The restricted uranium spot market price range slipped from $15.50-$15.70/lb U3O8 last month to $14.85/lb - $15.25/lb U3O8 this month. The unrestricted uranium spot market price range also slipped to $14.85/lb - $15.00/lb this month from $15.00/lb - $15.45/lb in October. Spot prices for conversion and separative work units remained at their October levels

  16. Briquettes of plant remains from the greenhouses of Almeria (Spain)

    Callejon-Ferre, A. J.; Lopez-Martinez, J. A.

    2009-07-01

    Since ancient times, plant biomass has been used as a primary fuel, and today, with the impending depletion of fossil fuels, these vegetal sources constitute a cleaner alternative and furthermore have a multitude of uses. The aim of the present study is to design a method of recycling and reuse of plant wastes from intensive agriculture under plastic, by manufacturing briquettes in an environmentally friendly manner. In Almeria (SE Spain), agriculture generates 769,500 t year^-1 of plant remains from greenhouse-grown horticultural crops, a resource currently used for composting and for producing electricity. With the machinery and procedures of the present study, another potential use has been developed by detoxifying and eliminating the plastic wastes of the original biomass for the fabrication of briquettes for fireplaces. The results were slightly inferior to commercial briquettes from other non-horticultural plant materials (no forestry material), by 2512 kJ kg^-1 in the least favourable case. In contrast, the heating value with respect to the two charcoals was significantly lower, with a difference of 12,142 kJ kg^-1. In conclusion, a procedure, applicable in ecological cultivation without agrochemicals or plastic cords, has been developed and tested to reuse and transform plant materials from intensive cultivation into a stable non-toxic product similar to composite logs, applicable in commercial settings or in residential fireplaces. (Author) 48 refs.

  17. Are the alleged remains of Johann Sebastian Bach authentic?

    Zegers, Richard H C; Maas, Mario; Koopman, A Ton G; Maat, George J R

    2009-02-16

    A skeleton alleged to be that of Johann Sebastian Bach (1685-1750) was exhumed from a graveyard in Leipzig, Germany, in 1894, but its authenticity is not established. In 1895, anatomist Wilhelm His concluded from his examination of the skeleton and reconstruction of the face that it most likely belonged to Bach. In 1949, surgeon Wolfgang Rosenthal noticed exostoses on the skeleton and on x-rays of 11 living organists and proposed a condition, Organistenkrankheit, which he interpreted as evidence that the skeleton was Bach's. However, our critical assessment of the remains analysis raises doubts: the localisation of the grave was dubious, and the methods used by His to reconstruct the face are controversial. Also, our study of the pelvic x-rays of 12 living professional organists failed to find evidence for the existence of Organistenkrankheit. We believe it is unlikely that the skeleton is that of Bach; techniques such as DNA analysis might help resolve the question but, to date, church authorities have not approved their use on the skeleton.

  18. Factors influencing home care nurse intention to remain employed.

    Tourangeau, Ann; Patterson, Erin; Rowe, Alissa; Saari, Margaret; Thomson, Heather; MacDonald, Geraldine; Cranley, Lisa; Squires, Mae

    2014-11-01

    To identify factors affecting Canadian home care nurse intention to remain employed (ITR). In developed nations, healthcare continues to shift into community settings. Although considerable research exists on examining nurse ITR in hospitals, similar research related to nurses employed in home care is limited. In the face of a global nursing shortage, it is important to understand the factors influencing nurse ITR across healthcare sectors. A qualitative exploratory descriptive design was used. Focus groups were conducted with home care nurses. Data were analysed using qualitative content analysis. Six categories of influencing factors were identified by home care nurses as affecting ITR: job characteristics; work structures; relationships/communication; work environment; nurse responses to work; and employment conditions. Findings suggest the following factors influence home care nurse ITR: having autonomy; flexible scheduling; reasonable and varied workloads; supportive work relationships; and receiving adequate pay and benefits. Home care nurses did not identify job satisfaction as a single concept influencing ITR. Home care nursing management should support nurse autonomy, allow flexible scheduling, promote reasonable workloads and create opportunities for team building that strengthen supportive relationships among home care nurses and other health team members. © 2013 John Wiley & Sons Ltd.

  19. Carnivoran remains from the Malapa hominin site, South Africa.

    Brian F Kuhn

    Full Text Available Recent discoveries at the new hominin-bearing deposits of Malapa, South Africa, have yielded a rich faunal assemblage associated with the newly described hominin taxon Australopithecus sediba. Dating of this deposit using U-Pb and palaeomagnetic methods has provided an age of 1.977 Ma, being one of the most accurately dated, time constrained deposits in the Plio-Pleistocene of southern Africa. To date, 81 carnivoran specimens have been identified at this site including members of the families Canidae, Viverridae, Herpestidae, Hyaenidae and Felidae. Of note is the presence of the extinct taxon Dinofelis cf. D. barlowi that may represent the last appearance date for this species. Extant large carnivores are represented by specimens of leopard (Panthera pardus and brown hyaena (Parahyaena brunnea. Smaller carnivores are also represented, and include the genera Atilax and Genetta, as well as Vulpes cf. V. chama. Malapa may also represent the first appearance date for Felis nigripes (Black-footed cat. The geochronological age of Malapa and the associated hominin taxa and carnivoran remains provide a window of research into mammalian evolution during a relatively unknown period in South Africa and elsewhere. In particular, the fauna represented at Malapa has the potential to elucidate aspects of the evolution of Dinofelis and may help resolve competing hypotheses about faunal exchange between East and Southern Africa during the late Pliocene or early Pleistocene.

  20. DNA Profiling Success Rates from Degraded Skeletal Remains in Guatemala.

    Johnston, Emma; Stephenson, Mishel

    2016-07-01

    No data are available regarding the success of DNA short tandem repeat (STR) profiling from degraded skeletal remains in Guatemala. Therefore, DNA profiling success rates relating to 2595 skeletons from eleven cases at the Forensic Anthropology Foundation of Guatemala (FAFG) are presented. The typical postmortem interval was 30 years. DNA was extracted from bone powder and amplified using Identifiler and MiniFiler. DNA profiling success rates differed between cases, ranging from 7.0% to 50.8%; the overall success rate for samples was 36.3%. The best DNA profiling success rates were obtained from femur (36.2%) and tooth (33.7%) samples. DNA profiles were significantly better from lower body bones than upper body bones (p < 0.0001). Bone samples from males gave significantly better profiles than samples from females (p < 0.0001). These results are believed to be related to bone density. The findings are important for designing forensic DNA sampling strategies in future victim recovery investigations. © 2016 American Academy of Forensic Sciences.
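
    The significance statements quoted above are comparisons of proportions; a two-proportion z-test of the kind typically used for such comparisons can be sketched as follows. The counts are hypothetical, since the paper reports only rates and p-values:

```python
import math

# Two-proportion z-test with a pooled standard error; two-sided p-value
# from the normal distribution via erfc.

def two_prop_z(success1, n1, success2, n2):
    p1, p2 = success1 / n1, success2 / n2
    p = (success1 + success2) / (n1 + n2)                 # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))       # pooled standard error
    z = (p1 - p2) / se
    pval = math.erfc(abs(z) / math.sqrt(2))               # two-sided p-value
    return z, pval

# hypothetical counts: lower-body vs upper-body bone samples
z, p = two_prop_z(520, 1200, 380, 1200)
print(f"z = {z:.2f}, p = {p:.2g}")
```

    With success-rate differences of this size and sample counts in the hundreds, p-values below 0.0001, as reported above, are expected.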

  1. Using contractors to decommission while remaining as licensee

    Rankine, A.

    1997-01-01

    Over the last few years the role of the United Kingdom Atomic Energy Authority (UKAEA) has changed from one involved in research and development in the field of nuclear power and associated technology to one of managing the liabilities left over from its previous mission. This period has also seen two significant portions of the organization move to the private sector, with the sale of the Facilities Services Division to PROCORD and the privatization of AEA Technology. The new UKAEA is therefore a focused liabilities management organization, making the best use of expertise in the private sector in carrying out its mission, but retaining adequate internal resource and expertise to fulfil its role and responsibilities as the licensee. UKAEA continues to be committed to giving the highest priority to meeting the high standards of safety and environmental protection required of the holder of the Nuclear Site Licence under the Nuclear Installations Act. This paper describes the safety management system within the UKAEA which ensures that UKAEA remains the proper and effective licensee and gives some examples of how this has worked in practice. (author)

  2. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
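
    The ANOVA-based segregation of uncertainty described in this record can be sketched for a one-observation-per-cell GCM x scenario design: the projection anomaly in each cell is split into main effects and an interaction term, and each component's share of the total sum of squares is reported. The numbers below are synthetic, not the Upper Ganga Basin results:

```python
import numpy as np

# Two-way ANOVA variance decomposition of projections on a GCM x scenario grid.

rng = np.random.default_rng(3)

n_gcm, n_scen = 4, 3
# synthetic streamflow-change projections (%) for each GCM/scenario combination
y = (rng.normal(0, 8, size=(n_gcm, 1))          # GCM main effect
     + rng.normal(0, 3, size=(1, n_scen))       # emission-scenario main effect
     + rng.normal(0, 1, size=(n_gcm, n_scen)))  # interaction / internal variability

grand = y.mean()
gcm_eff = y.mean(axis=1, keepdims=True) - grand
scen_eff = y.mean(axis=0, keepdims=True) - grand
inter = y - grand - gcm_eff - scen_eff          # what the main effects cannot explain

ss_total = ((y - grand) ** 2).sum()
shares = {
    "GCM":         (gcm_eff ** 2).sum() * n_scen / ss_total,
    "scenario":    (scen_eff ** 2).sum() * n_gcm / ss_total,
    "interaction": (inter ** 2).sum() / ss_total,
}
for name, share in shares.items():
    print(f"{name:12s} {100 * share:5.1f}% of total variance")
```

    Because the three components are orthogonal, the shares sum exactly to one; extra factors such as land use scenarios or the stationarity assumption enter the same decomposition as additional dimensions of the grid.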

  3. Incorporating model parameter uncertainty into inverse treatment planning

    Lian Jun; Xing Lei

    2004-01-01

    Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of the tissue-specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based framework developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has potential for us to maximally utilize the available radiobiology knowledge for better IMRT treatment

  4. Can agent-based models effectively reduce fisheries management implementation uncertainty?

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.
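A minimal sketch of such a vessel-level model, assuming a simple cost-versus-revenue decision rule and a fleet-wide catch restriction (all numbers hypothetical; the actual model uses far richer incentives and quota markets):

```python
import random

class Vessel:
    def __init__(self, cost):
        self.cost = cost      # daily operating cost, vessel-specific
        self.catch = 0.0

def simulate_day(fleet, price, cpue, quota_left):
    """Each vessel fishes only if expected revenue exceeds its own cost,
    and only while fleet-wide quota remains (a simple catch restriction)."""
    for v in fleet:
        if quota_left <= 0:
            break
        if price * cpue > v.cost:            # incentive-driven daily decision
            landed = min(cpue, quota_left)
            v.catch += landed
            quota_left -= landed
    return quota_left

random.seed(1)
fleet = [Vessel(cost=random.uniform(50, 150)) for _ in range(20)]
quota = 100.0                                 # total allowable catch (t)
for day in range(30):
    quota = simulate_day(fleet, price=10.0, cpue=12.0, quota_left=quota)
```

Heterogeneous costs mean high-cost vessels sit out, so the macro outcome (who fishes, when quota binds) emerges from individual incentives rather than an equilibrium assumption.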

  5. Diagnostic uncertainty and recall bias in chronic low back pain.

    Serbic, Danijela; Pincus, Tamar

    2014-08-01

    Patients' beliefs about the origin of their pain and their cognitive processing of pain-related information have both been shown to be associated with poorer prognosis in low back pain (LBP), but the relationship between specific beliefs and specific cognitive processes is not known. The aim of this study was to examine the relationship between diagnostic uncertainty and recall bias in 2 groups of chronic LBP patients, those who were certain about their diagnosis and those who believed that their pain was due to an undiagnosed problem. Patients (N=68) endorsed and subsequently recalled pain, illness, depression, and neutral stimuli. They also provided measures of pain, diagnostic status, mood, and disability. Both groups exhibited a recall bias for pain stimuli, but only the group with diagnostic uncertainty also displayed a recall bias for illness-related stimuli. This bias remained after controlling for depression and disability. Sensitivity analyses using grouping by diagnosis/explanation received supported these findings. Higher levels of depression and disability were found in the group with diagnostic uncertainty, but levels of pain intensity did not differ between the groups. Although the methodology does not provide information on causality, the results provide evidence for a relationship between diagnostic uncertainty and recall bias for negative health-related stimuli in chronic LBP patients. Copyright © 2014 International Association for the Study of Pain. Published by Elsevier B.V. All rights reserved.

  6. Sensitivity of direct global warming potentials to key uncertainties

    Wuebbles, D.J.; Patten, K.O.; Grant, K.E.; Jain, A.K.

    1992-07-01

    A series of sensitivity studies examines the effect of several uncertainties in Global Warming Potentials (GWPs). For example, the original evaluation of GWPs for the Intergovernmental Panel on Climate Change (IPCC, 1990) did not attempt to account for the possible sinks of carbon dioxide (CO2) that could balance the carbon cycle and produce atmospheric concentrations of CO2 that match observations. In this study, a balanced carbon cycle model is applied in calculation of the radiative forcing from CO2. Use of the balanced model produces up to 20 percent enhancement of the GWPs for most trace gases compared with the IPCC (1990) values for time horizons up to 100 years, but a decreasing enhancement with longer time horizons. Uncertainty limits of the fertilization feedback parameter contribute a 10 percent range in GWP values. Another systematic uncertainty in GWPs is the assumption of an equilibrium atmosphere (one in which the concentration of trace gases remains constant) versus a disequilibrium atmosphere. The latter gives GWPs that are 15 to 30 percent greater than the former, depending upon the carbon dioxide emission scenario chosen. Seven scenarios are employed: constant emission past 1990 and the six IPCC (1992) emission scenarios. For the analysis of uncertainties in atmospheric lifetime (τ), the GWP changes in direct proportion to τ for short-lived gases, but to a lesser extent for gases with τ greater than the time horizon for the GWP calculation.
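The τ-sensitivity noted at the end follows from the standard single-exponential absolute GWP expression, AGWP(H) = a·τ·(1 − e^(−H/τ)): for τ much smaller than the horizon H the AGWP is proportional to τ, while for τ comparable to or larger than H it is much less sensitive. A sketch with illustrative (non-IPCC) radiative efficiencies:

```python
import math

def agwp(a, tau, horizon):
    """Absolute GWP of a gas with radiative efficiency `a` and atmospheric
    lifetime `tau` (years) over `horizon` (years), assuming the gas decays
    as a single exponential after a pulse emission."""
    return a * tau * (1.0 - math.exp(-horizon / tau))

# Short-lived gas: doubling tau roughly doubles the 100-yr AGWP
short1 = agwp(1.0, 5.0, 100.0)
short2 = agwp(1.0, 10.0, 100.0)
# Long-lived gas: the same doubling changes the AGWP far less
long1 = agwp(1.0, 500.0, 100.0)
long2 = agwp(1.0, 1000.0, 100.0)
```

The GWP itself is this quantity divided by the corresponding AGWP of CO2, which is why carbon-cycle assumptions (the balanced model above) shift all GWPs together.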

  7. Axial power monitoring uncertainty in the Savannah River Reactors

    Losey, D.C.; Revolinski, S.M.

    1990-01-01

    The results of this analysis quantified the uncertainty associated with monitoring the Axial Power Shape (APS) in the Savannah River Reactors. Thermocouples at each assembly flow exit map the radial power distribution and are the primary means of monitoring power in these reactors. The remaining uncertainty in power monitoring is associated with the relative axial power distribution. The APS is monitored by seven sensors that respond to power on each of nine vertical Axial Power Monitor (APM) rods. Computation of the APS uncertainty, for the reactor power limits analysis, started with a large database of APM rod measurements spanning several years of reactor operation. A computer algorithm was used to randomly select a sample of APSs, which were input to a code. This code modeled the thermal-hydraulic performance of a single fuel assembly during a design-basis Loss-of-Coolant Accident. The assembly power limit at Onset of Significant Voiding was computed for each APS. The output was a distribution of expected assembly power limits that was adjusted to account for the biases caused by instrumentation error and by measuring 7 points rather than a continuous APS. Statistical analysis of the final assembly power limit distribution showed that reducing reactor power by approximately 3% was sufficient to account for APS variation. This data confirmed expectations that the assembly exit thermocouples provide all information needed for monitoring core power. The computational analysis results also quantified the contribution to power limits of the various uncertainties such as instrumentation error.
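The sampling workflow can be sketched as follows, with a toy stand-in for the thermal-hydraulic code (the real analysis computed the limit at Onset of Significant Voiding for each measured shape; shapes and the peaking-based limit here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(42)

def power_limit(shape):
    """Hypothetical stand-in for the thermal-hydraulic code: the assembly
    limit is driven by the peak of the normalized axial power shape."""
    return 100.0 / shape.max()

# Database of measured 7-point axial power shapes (rows), normalized to mean 1
shapes = rng.uniform(0.8, 1.3, size=(500, 7))
shapes /= shapes.mean(axis=1, keepdims=True)

limits = np.array([power_limit(s) for s in shapes])
# Conservative operating limit, e.g. the 5th percentile of the distribution
conservative = np.percentile(limits, 5)
```

Comparing the conservative percentile against the nominal limit is what yields a power-reduction margin of the kind quoted above (approximately 3% in the study).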

  8. Roughness coefficient and its uncertainty in gravel-bed river

    Ji-Sung Kim

    2010-06-01

    Manning's roughness coefficient was estimated for a gravel-bed river reach using field measurements of water level and discharge, and the applicability of various methods used for estimation of the roughness coefficient was evaluated. Results show that the roughness coefficient tends to decrease with increasing discharge and water depth, and over a certain range it appears to remain constant. Comparison of roughness coefficients calculated by field measurement data with those estimated by other methods shows that, although the field-measured values provide approximate roughness coefficients for relatively large discharge, there seems to be rather high uncertainty due to the difference in resultant values. For this reason, uncertainty related to the roughness coefficient was analyzed in terms of change in computed variables. On average, a 20% increase of the roughness coefficient causes a 7% increase in the water depth and an 8% decrease in velocity, but there may be about a 15% increase in the water depth and an equivalent decrease in velocity for certain cross-sections in the study reach. Finally, the validity of estimated roughness coefficient based on field measurements was examined. A 10% error in discharge measurement may lead to more than 10% uncertainty in roughness coefficient estimation, but corresponding uncertainty in computed water depth and velocity is reduced to approximately 5%. Conversely, the necessity for roughness coefficient estimation by field measurement is confirmed.
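The estimation and sensitivity logic can be sketched from Manning's equation, Q = (1/n)·A·R^(2/3)·S^(1/2). Below, n is back-calculated from (hypothetical) field measurements, and the depth response to a 20% roughness increase is shown for the simplified case of a wide rectangular channel, where h ∝ n^(3/5); the study's cross-section computations are more general:

```python
def manning_n(Q, A, R, S):
    """Back-calculate Manning's n from measured discharge Q (m3/s),
    flow area A (m2), hydraulic radius R (m) and friction slope S (-)."""
    return (A * R ** (2.0 / 3.0) * S ** 0.5) / Q

def depth_wide_channel(q, n, S):
    """Normal depth of a wide rectangular channel with unit discharge q,
    from q = (1/n) * h^(5/3) * S^(1/2)."""
    return (q * n / S ** 0.5) ** (3.0 / 5.0)

n = manning_n(Q=150.0, A=60.0, R=2.0, S=0.002)     # hypothetical gravel-bed reach
h1 = depth_wide_channel(q=2.5, n=n, S=0.002)
h2 = depth_wide_channel(q=2.5, n=1.2 * n, S=0.002) # 20 % larger roughness
```

In this simplified geometry the depth increase is a fixed 1.2^0.6 ≈ 12%; real cross-sections give the smaller average (7%) and larger local (15%) responses reported above.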

  9. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
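A minimal sketch of reliability-type weighting, assuming weights inversely proportional to each model's present-day bias against observations (the published REA criteria also include model-convergence terms; all numbers hypothetical):

```python
import numpy as np

def rea_weights(model_present, obs, eps=1e-6):
    """Reliability-type weights: models closer to the observed present-day
    value get more weight (here simply inverse absolute bias)."""
    bias = np.abs(model_present - obs)
    w = 1.0 / (bias + eps)
    return w / w.sum()

model_npp_now = np.array([50.0, 55.0, 70.0])    # present-day NPP, Pg C/yr (hypothetical)
model_npp_2100 = np.array([60.0, 66.0, 95.0])   # end-of-century projections
obs_npp = 54.0                                  # observational estimate

w = rea_weights(model_npp_now, obs_npp)
rea_mean = np.dot(w, model_npp_2100)            # skill-weighted projection
plain_mean = model_npp_2100.mean()              # unweighted ensemble mean
```

Down-weighting the model with the largest present-day bias both shifts the central estimate and narrows the effective spread, which is the mechanism behind the 45-68% uncertainty reduction reported above.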

  10. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
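For independent contributions such as purity, mass measurement, and solvent addition, the combined relative standard uncertainty follows the usual root-sum-of-squares rule; a sketch with hypothetical component values:

```python
import math

def combined_rel_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    standard uncertainties (GUM-style, correlations neglected)."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical relative standard uncertainties for a 1.0 mg/mL standard:
u_purity = 0.0020   # neat-material purity (incl. water/solvent/inorganic content)
u_mass = 0.0005     # balance / weighing technique
u_volume = 0.0010   # solvent addition (solution density, temperature)

u_c = combined_rel_uncertainty([u_purity, u_mass, u_volume])
U = 2.0 * u_c       # expanded uncertainty, coverage factor k = 2
```

Comparing which of these components a vendor actually folded into its certificate is exactly the exercise the paper recommends.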

  11. Surgical treatment of dislocated acromioclavicular syndesmolysis remains controversial

    Slaviša Mihaljevič

    2007-12-01

    result was 6.7. 39 % of the injured persons were on sick leave for more than 4 months. In the injured persons with accident insurance we noticed a longer sick leave (odds ratio 1.25). The injury did not affect the injured persons' employment. The majority of the employed (92.4 %) carried out the same work after treatment as before the injury. Conclusion: By operative treatment of the AC joint dislocation a good or excellent result is achieved in the majority of the injured persons (83 % in total).

  12. Addressing uncertainties in the ERICA Integrated Approach

    Oughton, D.H.; Agueero, A.; Avila, R.; Brown, J.E.; Copplestone, D.; Gilek, M.

    2008-01-01

    Like any complex environmental problem, ecological risk assessment of the impacts of ionising radiation is confounded by uncertainty. At all stages, from problem formulation through to risk characterisation, the assessment is dependent on models, scenarios, assumptions and extrapolations. These include technical uncertainties related to the data used, conceptual uncertainties associated with models and scenarios, as well as social uncertainties such as economic impacts, the interpretation of legislation, and the acceptability of the assessment results to stakeholders. The ERICA Integrated Approach has been developed to allow an assessment of the risks of ionising radiation, and includes a number of methods that are intended to make the uncertainties and assumptions inherent in the assessment more transparent to users and stakeholders. Throughout its development, ERICA has recommended that assessors deal openly with the deeper dimensions of uncertainty and acknowledge that uncertainty is intrinsic to complex systems. Since the tool is based on a tiered approach, the approaches to dealing with uncertainty vary between the tiers, ranging from a simple but highly conservative screening to a full probabilistic risk assessment including sensitivity analysis. This paper gives an overview of the types of uncertainty that are manifest in ecological risk assessment and of the ERICA Integrated Approach to dealing with some of these uncertainties.

  13. Reusable launch vehicle model uncertainties impact analysis

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and the flight environment is highly complicated and intensely changeable. Its model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainties during modeling are then analyzed and summarized. After that, the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as the RLV).
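The additive uncertainty bound can be sketched directly: for G_real = G_nominal + Δ, the maximum singular value of Δ (equivalently, its spectral norm) bounds how far the real plant can be from the nominal one (hypothetical 2×2 matrices, not the RLV model):

```python
import numpy as np

# Additive uncertainty model: G_real = G_nominal + Delta.
G_nom = np.array([[1.0, 0.2],
                  [0.0, 1.5]])
Delta = np.array([[0.05, 0.01],
                  [0.02, 0.08]])

# Maximum singular value of Delta = size of the worst-case perturbation
sigma_max = np.linalg.svd(Delta, compute_uv=False)[0]
norm_bound = np.linalg.norm(Delta, 2)   # spectral norm: the same quantity
```

A robust-stability test (e.g. small-gain) would then compare this bound against the nominal loop's gain to decide whether the controller tolerates the uncertainty.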

  14. Image restoration, uncertainty, and information.

    Yu, F T

    1969-01-01

    Some of the physical interpretations of image restoration are discussed. From the theory of information, the unrealizability of an inverse filter can be explained by degradation of information, which is due to distortion of the recorded image. Image restoration is a time-and-space problem, which can be recognized from the theory of relativity (the problem of image restoration is related to Heisenberg's uncertainty principle in quantum mechanics). A detailed discussion of the relationship between information and energy is given. Two general results may be stated: (1) restoration of the image from the distorted signal is possible only if it satisfies the detectability condition; however, the restored image can at best only approach the maximum allowable time criterion. (2) Restoration of an image by superimposing the distorted signal (due to smearing) is a physically unrealizable method; however, this restoration procedure may be achieved by the expenditure of an infinite amount of energy.
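The unrealizability of the inverse filter can be demonstrated numerically: wherever the smearing transfer function H approaches zero, the filter 1/H demands unbounded gain (the "infinite energy" above), whereas a regularized, Wiener-like filter restores most of the signal with finite energy. A sketch with a synthetic 1-D signal (the regularization constant is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 256
x = np.sin(2 * np.pi * 2 * np.arange(n) / n)     # original signal
H = np.fft.rfft(np.ones(32) / 32.0, n=n)         # smearing transfer function
y = np.fft.irfft(np.fft.rfft(x) * H, n=n)        # distorted (smeared) record
y += 1e-3 * rng.standard_normal(n)               # recording noise

# Naive inverse filter 1/H is unrealizable where |H| -> 0; a regularized
# (Wiener-like) filter trades perfect restoration for bounded gain:
eps = 1e-2
W = np.conj(H) / (np.abs(H) ** 2 + eps)
x_hat = np.fft.irfft(np.fft.rfft(y) * W, n=n)

err_restored = np.mean((x_hat - x) ** 2)
err_distorted = np.mean((y - x) ** 2)
```

The regularized filter's gain is capped at 1/(2√eps), so the restoration never demands the infinite energy that the exact inverse would at the zeros of H.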

  15. Modelling of Transport Projects Uncertainties

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating … to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.
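Reference Class Forecasting, which RSF builds on, can be sketched as choosing a cost uplift from the empirical distribution of overruns in comparable past projects (hypothetical overrun data and budget; the acceptable-chance level is an assumed policy choice):

```python
import numpy as np

def rcf_uplift(historical_overruns, acceptable_chance_of_overrun=0.2):
    """Reference Class Forecasting: pick the uplift so that only the
    accepted fraction of comparable past projects exceeded it."""
    return np.percentile(historical_overruns,
                         100 * (1 - acceptable_chance_of_overrun))

# Hypothetical cost-overrun ratios of comparable past projects
overruns = np.array([0.05, 0.10, 0.20, 0.45, 0.30, 0.15, 0.60, 0.25])
uplift = rcf_uplift(overruns)
budget = 1_000.0 * (1 + uplift)   # de-biased cost estimate (hypothetical units)
```

RSF then replaces this single uplifted point estimate with a grid of scenario-based outcomes.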

  16. Mineralized remains of morphotypes of filamentous cyanobacteria in carbonaceous meteorites

    Hoover, Richard B.

    2005-09-01

    rocks, living, cryopreserved and fossilized extremophiles and cyanobacteria. These studies have resulted in the detection of mineralized remains of morphotypes of filamentous cyanobacteria, mats and consortia in many carbonaceous meteorites. These well-preserved and embedded microfossils are consistent with the size, morphology and ultra-microstructure of filamentous trichomic prokaryotes and degraded remains of microfibrils of cyanobacterial sheaths. EDAX elemental studies reveal that the forms in the meteorites often have highly carbonized sheaths in close association with permineralized filaments, trichomes, and microbial cells. The extensive protocols and methodologies that have been developed to protect the samples from contamination and to distinguish recent bio-contaminants from indigenous microfossils are described. Ratios of critical bioelements (C:O, C:N, C:P, and C:S) reveal dramatic differences between microfossils in Earth rocks and meteorites and in the cells, filaments, trichomes, and hormogonia of recently living cyanobacteria. The results of comparative optical, ESEM and FESEM studies and EDAX elemental analyses of recent cyanobacteria (e.g. Calothrix, Oscillatoria, and Lyngbya) of similar size, morphology and microstructure to microfossils found embedded in the Murchison CM2 and the Orgueil CI1 carbonaceous meteorites are presented.

  17. Remaining lifetime modeling using State-of-Health estimation

    Beganovic, Nejra; Söffker, Dirk

    2017-08-01

    Technical systems and their components undergo gradual degradation over time. This continuous degradation is reflected in decreased system reliability and unavoidably leads to system failure. Continuous evaluation of State-of-Health (SoH) is therefore essential to ensure at least the predefined lifetime specified by the manufacturer, or, even better, to extend it. A precondition for lifetime extension, however, is accurate estimation of SoH as well as estimation and prediction of Remaining Useful Lifetime (RUL). For this purpose, lifetime models describing the relation between system/component degradation and consumed lifetime have to be established. In this contribution, modeling and the selection of suitable lifetime models from a database based on current SoH conditions are discussed. The main contribution of this paper is the development of new modeling strategies capable of describing complex relations between measurable system variables, related system degradation, and RUL. Two approaches with their accompanying advantages and disadvantages are introduced and compared. Both approaches are capable of modeling stochastic aging processes of a system by simultaneous adaptation of RUL models to the current SoH. The first approach requires a priori knowledge about aging processes in the system and accurate estimation of SoH. The estimation of SoH is here conditioned by tracking the actual damage accumulated in the system, so that particular model parameters are defined according to a priori known assumptions about the system's aging. Prediction accuracy in this case is highly dependent on accurate estimation of SoH but includes a high number of degrees of freedom. The second approach does not require a priori knowledge about the system's aging, as particular model parameters are defined by a multi-objective optimization procedure. The prediction accuracy of this model does not strongly depend on the estimated SoH. This model
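A minimal sketch of the SoH-to-RUL idea, assuming a linear degradation trend and a fixed failure threshold (the paper's strategies adapt far richer models to the estimated SoH; numbers hypothetical):

```python
import numpy as np

def estimate_rul(t, soh, failure_threshold=0.2):
    """Fit a linear degradation trend to State-of-Health observations and
    extrapolate to the failure threshold to estimate Remaining Useful Lifetime."""
    slope, intercept = np.polyfit(t, soh, 1)
    if slope >= 0:
        return np.inf                           # no degradation trend observed
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - t[-1], 0.0)

# Hypothetical SoH history: starts at 1.0 and degrades ~0.01 per cycle
t = np.arange(0, 50)
soh = 1.0 - 0.01 * t
rul = estimate_rul(t, soh)   # threshold 0.2 is reached at t = 80
```

Re-fitting (or re-selecting) the model as new SoH observations arrive is what adapts the RUL prediction to the system's actual aging.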

  18. Clarifying some remaining questions in the anomaly puzzle

    Huang, Xing; Parker, Leonard

    2011-01-01

    We discuss several points that may help to clarify some questions that remain about the anomaly puzzle in supersymmetric theories. In particular, we consider a general N=1 supersymmetric Yang-Mills theory. The anomaly puzzle concerns the question of whether there is a consistent way in the quantized theory to put the R-current and the stress tensor in a single supermultiplet called the supercurrent, even though in the classical theory they are in the same supermultiplet. It was proposed that the classically conserved supercurrent bifurcates into two supercurrents having different anomalies in the quantum regime. The most interesting result we obtain is an explicit expression for the lowest component of one of the two supercurrents in 4-dimensional spacetime, namely the supercurrent that has the energy-momentum tensor as one of its components. This expression for the lowest component is an energy-dependent linear combination of two chiral currents, which itself does not correspond to a classically conserved chiral current. The lowest component of the other supercurrent, namely, the R-current, satisfies the Adler-Bardeen theorem. The lowest component of the first supercurrent has an anomaly, which we show is consistent with the anomaly of the trace of the energy-momentum tensor. Therefore, we conclude that there is no consistent way to construct a single supercurrent multiplet that contains the R-current and the stress tensor in the straightforward way originally proposed. We also discuss and try to clarify some technical points in the derivations of the two supercurrents in the literature. These latter points concern the significance of infrared contributions to the NSVZ β-function and the role of the equations of motion in deriving the two supercurrents. (orig.)

  19. Will southern California remain a premium market for natural gas?

    John, F.E.

    1991-01-01

    Average yearly demand for natural gas in southern California totalled just over 3 billion ft³/d in 1991 and is projected to increase to just over 3.2 billion ft³/d in 2000 and 3.4 billion ft³/d in 2010. In the core residential market, demand is being driven by population growth and offset by conservation measures. In the core commercial and industrial market, demand is driven by employment growth and offset by conservation. In the noncore market, natural gas use is expected to fall from 262 million ft³/d in 1991 to 223 million ft³/d in 2010. Demand for natural gas for cogeneration is expected to either remain stagnant or decrease. The largest potential for market growth in southern California is for utility electric generation. Demand in this sector is expected to increase from 468 million ft³/d in 1991 to 1 billion ft³/d in 2010. Air quality concerns furnish a market opportunity for natural gas vehicles, and a substantial increase in natural gas demand might be obtained from even a modest market share of the region's 10 million vehicles. Existing pipeline capacity is sufficient to supply current average-year requirements, and the need for new capacity hinges on the issues of satisfying high-year demand, meeting market growth, and accessing more desirable supply regions. Planned capacity additions of 2,150 million ft³/d, if completed, will bring substantial excess capacity to southern California in the late 1990s. The competitive advantages of various producing regions will then be greatly influenced by the rate designs used on the pipelines connecting them to the market.

  20. Neutron activation analysis of the prehistoric and ancient bone remains

    Vasidov, A.; Osinskaya, N.S.; Khatamov, Sh.; Rakhmanova, T.; Akhmadshaev, A.Sh.

    2006-01-01

    This work presents results of instrumental neutron activation analysis (INAA) of prehistoric bone remains of dinosaurs and of ancient bones of a bear and an archanthrope found on the territory of Uzbekistan. A dinosaur bone from Mongolia, a standard human bone, and soils taken from the surface and from the femoral joint of a dinosaur were also subjected to INAA. The INAA method determined the contents of about 30 elements in bones and soils, in the range 0.043-3600 mg/kg. Among the elements found mainly in bones are Ca (46 %), Sc, Cr, Fe (up to 2.2 g/kg), Ni, Zn, Sr (up to 3.6 g/kg), Sb, Ba and some others. The contents of some elements in dinosaur bones reach very high values, 280-3200 mg/kg; these are mainly the lanthanides La, Ce, Nd, Sm, Eu, Tb, Yb and Lu. In our opinion, the lanthanides and some other elements in the bones, such as As, Br, and Mo, were formed as a result of fission of uranium and transuranium elements, because the uranium content in dinosaur bones is very high, up to 180 mg/kg, and that of thorium is 20 mg/kg. However, U and Th in soils are only 4.8 mg/kg and 3.7 mg/kg, respectively. The uranium content in the bones of the archanthrope is 1.53 mg/kg, while that in the standard human bone is less than 0.016 mg/kg. (author)

  1. The broad spectrum revisited: evidence from plant remains.

    Weiss, Ehud; Wetterstrom, Wilma; Nadel, Dani; Bar-Yosef, Ofer

    2004-06-29

    The beginning of agriculture is one of the most important developments in human history, with enormous consequences that paved the way for settled life and complex society. Much of the research on the origins of agriculture over the last 40 years has been guided by Flannery's [Flannery, K. V. (1969) in The Domestication and Exploitation of Plants and Animals, eds. Ucko, P. J. & Dimbleby, G. W. (Duckworth, London), pp. 73-100] "broad spectrum revolution" (BSR) hypothesis, which posits that the transition to farming in southwest Asia entailed a period during which foragers broadened their resource base to encompass a wide array of foods that were previously ignored in an attempt to overcome food shortages. Although these resources undoubtedly included plants, nearly all BSR hypothesis-inspired research has focused on animals because of a dearth of Upper Paleolithic archaeobotanical assemblages. Now, however, a collection of >90,000 plant remains, recently recovered from the Stone Age site Ohalo II (23,000 B.P.), Israel, offers insights into the plant foods of the late Upper Paleolithic. The staple foods of this assemblage were wild grasses, pushing back the dietary shift to grains some 10,000 years earlier than previously recognized. Besides the cereals (wild wheat and barley), small-grained grasses made up a large component of the assemblage, indicating that the BSR in the Levant was even broader than originally conceived, encompassing what would have been low-ranked plant foods. Over the next 15,000 years small-grained grasses were gradually replaced by the cereals and ultimately disappeared from the Levantine diet.

  2. Attitudes, beliefs, uncertainty and risk

    Greenhalgh, Geoffrey [Down Park Place, Crawley Down (United Kingdom)]

    2001-07-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. 
When, or if, a controversy is finally resolved, the judgement arrived at will be justified by the belief that the consequences of the course chosen will be more favourable

  3. Attitudes, beliefs, uncertainty and risk

    Greenhalgh, Geoffrey [Down Park Place, Crawley Down (United Kingdom)

    2001-07-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course

  4. Attitudes, beliefs, uncertainty and risk

    Greenhalgh, Geoffrey

    2001-01-01

    There is now unmistakable evidence of a widening split within the Western industrial nations arising from conflicting views of society; for and against change. The argument is over the benefits of 'progress' and growth. On one side are those who seek more jobs, more production and consumption, higher standards of living, an ever-increasing GNP with an increasing globalisation of production and welcome the advances of science and technology confident that any temporary problems that arise can be solved by further technological development - possible energy shortages as a growing population increases energy usage can be met by nuclear power development; food shortages by the increased yields of GM crops. In opposition are those who put the quality of life before GNP, advocate a more frugal life-style, reducing needs and energy consumption, and, pointing to the harm caused by increasing pollution, press for cleaner air and water standards. They seek to reduce the pressure of an ever-increasing population and above all to preserve the natural environment. This view is associated with a growing uncertainty as the established order is challenged with the rise in status of 'alternative' science and medicine. This paper argues that these conflicting views reflect instinctive attitudes. These in turn draw support from beliefs selected from those which uncertainty offers. Where there is scope for argument over the truth or validity of a 'fact', the choice of which of the disputed views to believe will be determined by a value judgement. This applies to all controversial social and political issues. Nuclear waste disposal and biotechnology are but two particular examples in the technological field; joining the EMU is a current political controversy where value judgements based on attitudes determine beliefs. When, or if, a controversy is finally resolved the judgement arrived at will be justified by the belief that the consequences of the course chosen will be more favourable

  5. Thermal effects of condensing water have remained local

    Ilus, E.

    1997-01-01

    General eutrophication of the Gulf of Finland has played a major role in the biological changes that have taken place in the sea area off the Loviisa nuclear power plant. The quantities of plant nutrients in the water are now 1.5 to 2 times greater than 20 years ago. Changes attributable to the thermal effects of the power plant's cooling waters have been relatively small, and they have been restricted to the immediate surroundings of the discharge area. The most distinct environmental effects have been discovered in the temperatures of sea water, in ice conditions and in water currents within the discharge area of cooling water. The most visible biological change that has a direct link to the thermal load resulting from the power plant is the more abundant aquatic flora near the discharge point of cooling water on the southwestern shores of the Haestholmsfjaerden. Similar growth of aquatic flora has also been discovered near the discharge outlet of the Olkiluoto plant, although the nutrient contents of water there are only half of the values measured in the Loviisa area. Regular radiation monitoring of the areas surrounding the nuclear power plants began before the start-up of the plants. The contents of radioactive substances discovered have been small and in agreement with the release data given by the power companies. (orig.)

  6. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

    An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two 'off-line' formalisms, with and without 'foresight' characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The 'on-line' formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale, highly nonlinear, time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high-pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response-surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems.
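The Bayesian step this record describes, combining prior knowledge with additional experimental information to reduce parameter uncertainty, can be illustrated with a minimal conjugate Gaussian update. The numbers below are invented for illustration; the paper's actual formalism is a time-dependent constrained minimization, not this scalar case.

```python
# Conjugate Gaussian update: a prior estimate of a system parameter is
# combined with one measurement; all values are illustrative only.
prior_mean, prior_var = 1.0, 0.25     # prior knowledge
meas, meas_var = 1.2, 0.09            # additional experimental information

# Precision-weighted combination (standard conjugate-normal result)
post_var = 1.0 / (1.0 / prior_var + 1.0 / meas_var)
post_mean = post_var * (prior_mean / prior_var + meas / meas_var)

print(f"posterior mean={post_mean:.3f}, variance={post_var:.3f}")
```

The posterior variance is always smaller than either input variance, which is the sense in which incorporating measurements "reduces system uncertainties".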

  7. Uncertainty analysis in estimating Japanese ingestion of global fallout Cs-137 using health risk evaluation model

    Shimada, Yoko; Morisawa, Shinsuke

    1998-01-01

    Most of model estimation of the environmental contamination includes some uncertainty associated with the parameter uncertainty in the model. In this study, the uncertainty was analyzed in a model for evaluating the ingestion of radionuclide caused by the long-term global low-level radioactive contamination by using various uncertainty analysis methods: the percentile estimate, the robustness analysis and the fuzzy estimate. The model is mainly composed of five sub-models, which include their own uncertainty; we also analyzed the uncertainty. The major findings obtained in this study include that the possibility of the discrepancy between predicted value by the model simulation and the observed data is less than 10%; the uncertainty of the predicted value is higher before 1950 and after 1980; the uncertainty of the predicted value can be reduced by decreasing the uncertainty of some environmental parameters in the model; the reliability of the model can definitively depend on the following environmental factors: direct foliar absorption coefficient, transfer factor of radionuclide from stratosphere down to troposphere, residual rate by food processing and cooking, transfer factor of radionuclide in ocean and sedimentation in ocean. (author)
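The percentile-estimate technique mentioned in this record can be sketched with a toy Monte Carlo model. The two-parameter "ingestion" model and its distributions below are hypothetical, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two hypothetical model parameters with assumed distributions
deposition = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # illustrative
transfer = rng.normal(loc=1.0, scale=0.1, size=n)         # illustrative

# Toy ingestion model: output is the product of the two parameters
intake = deposition * transfer

# Percentile estimate of predictive uncertainty: central 90% band
p5, p50, p95 = np.percentile(intake, [5, 50, 95])
print(f"median={p50:.2f}, 90% band=[{p5:.2f}, {p95:.2f}]")
```

Narrowing the input distributions (e.g. reducing `sigma`) narrows the output band, which is the mechanism behind the paper's observation that reducing the uncertainty of some environmental parameters reduces the uncertainty of the prediction.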

  8. The use of fish remains in sediments for the reconstruction of paleoproductivity

    Drago, T; Santos, A M P; Pinheiro, J [Instituto Nacional de Recursos Biológicos (INRB), L-IPIMAR, Av. 5 de Outubro s/n 8700-305 Olhão (Portugal); Ferreira-Bartrina, V [Centro de Investigación Científica y de Educación Superior de Ensenada - CICESE, Km. 107 Carretera Tijuana, C.P. 22860, Ensenada, B.C. (Mexico)], E-mail: tdrago@ipimar.pt

    2009-01-01

    The majority of the works concerning fish productivity are based on fish landing records. However, in order to understand the causes of variability in fish productivity (natural and/or anthropogenic) it is essential to have information from periods when human impacts (e.g., fisheries) are considered unimportant. This can be achieved through the use of fish remains, i.e. scales, vertebrae and otoliths, from sediment records. The data obtained can be used to develop time series of fish stocks, revealing the history of fish population dynamics over the last centuries or millennia. The majority of these works are located in Eastern Boundary Current Systems (e.g., Benguela, Peru-Humboldt, California), because these are associated with coastal upwelling and high productivity, which in some cases leads to low bottom oxygen levels and hence to scale preservation. A search for fish remains in the Portuguese margin sediments is in progress in the context of the ongoing research project POPEI (High-resolution oceanic paleoproductivity and environmental changes; correlation with fish populations), which intends to fill the gap in studies of this type for the Canary Current System. In this paper we review some general ideas on the use of fish remains, related studies, methodologies and data processing, as well as presenting the first results of POPEI.

  9. Educating Amid Uncertainty: The Organizational Supports Teachers Need to Serve Students in High-Poverty, Urban Schools

    Kraft, Matthew A.; Papay, John P.; Johnson, Susan Moore; Charner-Laird, Megin; Ng, Monica; Reinhorn, Stefanie

    2015-01-01

    Purpose: We examine how uncertainty, both about students and the context in which they are taught, remains a persistent condition of teachers' work in high-poverty, urban schools. We describe six schools' organizational responses to these uncertainties, analyze how these responses reflect open- versus closed-system approaches, and examine how this…

  10. Prospects after Major Trauma

    Holtslag, H.R.

    2007-01-01

    Introduction. After patients survive major trauma, their prospects, in terms of the consequences for functioning, are uncertain, which may impact severely on patient, family and society. The studies in this thesis describe the long-term outcomes of severely injured patients after major trauma. In

  11. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    Jacquin, A. P.

    2012-04-01

    This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first-order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed.
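The first-order and total effects of Sobol's variance decomposition can be estimated with the standard pick-and-freeze Monte Carlo scheme. The sketch below substitutes a toy additive model for the snowmelt runoff model; the Saltelli (2010) first-order and Jansen total-effect estimators are standard, but the model, inputs and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def model(x):
    # Toy additive stand-in for the runoff model: the first input dominates
    return 4.0 * x[:, 0] + x[:, 1]

n, d = 100_000, 2
A = rng.uniform(size=(n, d))      # two independent sample matrices
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

s1, st = [], []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]           # matrix A with column i taken from B
    yABi = model(ABi)
    s1.append(np.mean(yB * (yABi - yA)) / var_y)         # Saltelli first-order
    st.append(0.5 * np.mean((yA - yABi) ** 2) / var_y)   # Jansen total effect
    print(f"x{i+1}: first-order={s1[-1]:.2f}, total={st[-1]:.2f}")
```

For an additive model the first-order and total effects coincide; a gap between them would indicate interactions between the precipitation factors.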

  12. A review of the uncertainties in the assessment of radiological consequences of spent nuclear fuel disposal

    Wiborgh, M.; Elert, M.; Hoeglund, L.O.; Jones, C.; Grundfelt, B.; Skagius, K.; Bengtsson, A.

    1992-06-01

    Radioactive waste disposal systems for spent nuclear fuel are designed to isolate the radioactive waste from the human environment for long periods of time. The isolation is provided by a combination of engineered and natural barriers. Safety assessments are performed to describe and quantify the performance of the individual barriers and the disposal system over long time periods. These assessments will always be associated with uncertainties. Uncertainties can originate from the variability of natural systems and will also be introduced in the predictive modelling performed to quantitatively evaluate the behaviour of the disposal system, as a consequence of the incomplete knowledge about the governing processes. Uncertainties in safety assessments can partly be reduced by additional measurements and research. The aim of this study has been to identify uncertainties in assessments of radiological consequences from the disposal of spent nuclear fuel based on the Swedish KBS-3 concept. The identified uncertainties have been classified with respect to their origin, i.e. into conceptual, modelling and data uncertainties. The possibilities of reducing the uncertainties are also commented upon. In assessments it is important to decrease uncertainties which are of major importance for the performance of the disposal system. These could to some extent be identified by uncertainty analysis. However, conceptual uncertainties and some types of model uncertainties are difficult to evaluate. To be able to decrease uncertainties in conceptual models, it is essential that the processes describing and influencing the radionuclide transport in the engineered and natural barriers are sufficiently understood. In this study a qualitative approach has been used. The importance of different barriers and processes is indicated by their influence on the release of some representative radionuclides. (122 refs.) (au)

  13. The Right to Remain Silent in Criminal Trial

    Gianina Anemona Radu

    2013-05-01

    Full Text Available A person's right not to incriminate oneself or to remain silent and not contribute to their own incrimination is a basic requirement of due process, although the right not to testify against oneself is not expressly guaranteed. This legal right is intended to protect the accused/defendant against the authorities' abusive coercion. The scope of the right not to incriminate oneself is related to criminal matters under the Convention, and is thus applicable to criminal proceedings concerning all types of crimes as a guarantee of a fair trial. The European Court of Justice ruled that despite the fact that art. 6 paragraph 2 of the Convention does not expressly mention the right not to incriminate oneself and the right not to contribute to their own incrimination (nemo tenetur se ipsum accusare), these are generally recognized international rules that are consistent with the notion of "fair trial" stipulated in art. 6. By virtue of the right to silence, the person charged with a crime is free to answer the questions or not, as he/she believes it is in his/her interest. Therefore, the right to silence involves not only the right not to testify against oneself, but also the right of the accused/defendant not to incriminate oneself. Thus, the accused/defendant cannot be compelled to assist in the production of evidence and cannot be sanctioned for failing to provide certain documents or other evidence. An obligation to testify against one's will, under the constraint of a fine or any other form of coercion, constitutes an interference with the negative aspect of the right to freedom of expression which must be necessary in a democratic society. It is essential to clarify certain issues as far as this right is concerned. First of all, the statutory provision in question is specific to adversarial systems, which are found mainly in Anglo-Saxon countries and are totally different from that underlying the current Romanian Criminal

  14. AIDS, individual behaviour and the unexplained remaining variation.

    Katz, Alison

    2002-01-01

    From the start of the AIDS pandemic, individual behaviour has been put forward, implicitly or explicitly, as the main explanatory concept for understanding the epidemiology of HIV infection and in particular for the rapid spread and high prevalence in sub-Saharan Africa. This has had enormous implications for the international response to AIDS and has heavily influenced public health policy and strategy and the design of prevention and care interventions at national, community and individual level. It is argued that individual behaviour alone cannot possibly account for the enormous variation in HIV prevalence between population groups, countries and regions and that the unexplained remaining variation has been neglected by the international AIDS community. Biological vulnerability to HIV due to seriously deficient immune systems has been ignored as a determinant of the high levels of infection in certain populations. This is in sharp contrast to well-proven public health approaches to other infectious diseases. In particular, it is argued that poor nutrition and co-infection with the myriad of other diseases of poverty including tuberculosis, malaria, leishmaniasis and parasitic infections, have been neglected as root causes of susceptibility, infectiousness and high rates of transmission of HIV at the level of populations. Vulnerability in terms of non-biological factors such as labour migration, prostitution, exchange of sex for survival, population movements due to war and violence, has received some attention but the solutions proposed to these problems are also inappropriately focused on individual behaviour and suffer from the same neglect of economic and political root causes. As the foundation for the international community's response to the AIDS pandemic, explanations of HIV/AIDS epidemiology in terms of individual behaviour are not only grossly inadequate, they are highly stigmatising and may, in some cases, be racist. They have diverted attention from

  15. Phenomenon of Uncertainty as a Subjective Experience

    Lifintseva A.A.

    2018-04-01

    Full Text Available The phenomenon of uncertainty in illness among patients is discussed and analyzed in this article. Uncertainty in illness is a condition that accompanies the patient from the moment the first somatic symptoms of the disease appear, and it can be strengthened or weakened by many psychosocial factors. The level of uncertainty is related to the level of stress, emotional disadaptation, affective states, coping strategies, mechanisms of psychological defense, etc. Uncertainty can perform destructive functions, acting as a trigger for stressful conditions and launching negative emotional experiences. Among its positive functions, uncertainty can allow a positive reinterpretation of the patient's disease. In addition, the state of uncertainty allows the patient to activate resources for coping with the disease, among which the leading role belongs to social support.

  16. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.

  17. Uncertainty quantification theory, implementation, and applications

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...
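One of the book's core topics, propagation of uncertainties, can be illustrated by first-order ("delta-method") propagation through a simple product model. The means and standard deviations below are invented for the sketch.

```python
import numpy as np

# First-order propagation for y = f(x1, x2) = x1 * x2 with independent inputs:
# var(y) ≈ sum_i (df/dxi)^2 * var(xi), evaluated at the input means.
mu = np.array([2.0, 3.0])        # illustrative input means
sigma = np.array([0.1, 0.2])     # illustrative input standard deviations

f = lambda x: x[0] * x[1]
grad = np.array([mu[1], mu[0]])  # analytic partials of x1*x2 at the mean

var_y = np.sum((grad * sigma) ** 2)  # linearized output variance
print(f"y ≈ {f(mu):.2f} ± {np.sqrt(var_y):.3f}")
```

For strongly nonlinear models this linearization degrades, which is one motivation for the sampling-based and surrogate-model methods the book also covers.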

  18. Report on the uncertainty methods study

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time
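The sampling-based methods compared in the study (e.g. the GRS method) rest on Wilks' tolerance-limit formula, which fixes how many randomized code runs are needed before the largest observed output bounds a given quantile with a given confidence; a minimal sketch:

```python
# Wilks' one-sided tolerance-limit rule: after n random code runs, the largest
# observed output bounds the p-quantile of the output distribution with
# confidence c whenever 1 - p**n >= c.
def wilks_n(p: float = 0.95, c: float = 0.95) -> int:
    n = 1
    while 1.0 - p ** n < c:
        n += 1
    return n

print(wilks_n())  # 59 -- the classic run count for a 95%/95% statement
```

The appeal of this rule is that the required number of runs is independent of the number of uncertain input parameters, which is what makes the input-uncertainty methods tractable for expensive thermal-hydraulic codes.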

  19. Communicating the Uncertainty in Greenhouse Gas Emissions from Agriculture

    Milne, Alice; Glendining, Margaret; Perryman, Sarah; Whitmore, Andy

    2014-05-01

    inventory. Box plots were favoured by a majority of our participants, but this result was driven by those with a better understanding of maths. We concluded that the methods chosen to communicate uncertainty in greenhouse gas emissions should be influenced by the professional and mathematical background of the end-user. We propose that box plots annotated with summary statistics such as the mean, median, and 2.5th and 97.5th percentiles provide a sound method for communicating uncertainty to research scientists, as these individuals tend to be familiar with such methods. End-users from other groups may not be so familiar with these methods, so a combination of intuitive methods, such as calibrated phrases and shaded arrays, with numerate methods would be better suited. Ideally these individuals should be presented with the intuitive qualitative methods, with the option to consider a more quantitative description, perhaps presented in an appendix.

  20. Uncertainties and demonstration of compliance with numerical risk standards

    Preyssl, C.; Cullingford, M.C.

    1987-01-01

    When dealing with the numerical results of a probabilistic risk analysis performed for a complex system, such as a nuclear power plant, one major objective may be to determine compliance or non-compliance with a predefined risk standard. The uncertainties in the risk results, associated with the consequences and their probabilities of occurrence, may be considered by representing the risk as a risk band. Studying the area and distance between the upper and lower bounds of the risk band provides consistent information on the uncertainties in terms of risk, not only by means of scalars but also by real functions. Criteria can be defined for determining compliance with a numerical risk standard, and the 'weighting functional' method, representing a possible tool for testing compliance of risk results, is introduced. By shifting the upper confidence bound due to redefinition, part of the risk band may exceed the standard without changing the underlying results. Using the concept described, it is possible to determine the amount of risk, i.e. uncertainty, exceeding the standard. The mathematical treatment of uncertainties therefore allows probabilistic risk assessment results to be compared. A realistic example illustrates the method. (author)
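The idea of measuring the amount of a risk band that exceeds a standard can be sketched numerically. The bounding curves and the standard below are arbitrary exponentials chosen for illustration, not taken from the paper.

```python
import numpy as np

# Toy complementary-cumulative risk curves: frequency of exceeding consequence x
x = np.linspace(0.0, 10.0, 1001)       # consequence magnitude (arbitrary units)
dx = x[1] - x[0]
lower = np.exp(-x)                     # lower confidence bound on frequency
upper = 3.0 * np.exp(-0.7 * x)         # upper confidence bound on frequency
standard = 2.0 * np.exp(-0.8 * x)      # hypothetical numerical risk standard

# Total area of the risk band, and the part of it lying above the standard
band_area = np.sum(upper - lower) * dx
excess = np.sum(np.clip(upper - np.maximum(lower, standard), 0.0, None)) * dx

print(f"fraction of risk band above the standard: {excess / band_area:.2f}")
```

Here the band straddles the standard, so compliance cannot be declared from the point estimate alone; quantifying the excess area is one way to express how much of the uncertainty lies on the non-compliant side.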