WorldWideScience

Sample records for uncertainty simulating catastrophe

  1. Optimal natural resources management under uncertainty with catastrophic risk

    Energy Technology Data Exchange (ETDEWEB)

Motoh, Tsujimura [Graduate School of Economics, Kyoto University, Yoshida-honmachi, Sakyo-ku, Kyoto 606-8501 (Japan)]

    2004-05-01

We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we derive the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource.
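
    To make the setup concrete, the following minimal sketch (not the authors' model; all symbols and parameter values are hypothetical) simulates a resource stock following a geometric Brownian SDE, harvested at a constant proportional rate q, and stopped at the first arrival of a Poisson catastrophe:

```python
import numpy as np

def simulate_stock(x0=1.0, mu=0.05, q=0.03, sigma=0.2, lam=0.1,
                   dt=1e-2, t_max=20.0, rng=None):
    """Euler-Maruyama path of dX = (mu - q) X dt + sigma X dW, where q is a
    constant (hypothetical) extraction rate; the path is stopped at the
    first arrival of a Poisson(lam) catastrophe."""
    if rng is None:
        rng = np.random.default_rng()
    tau = min(rng.exponential(1.0 / lam), t_max)   # first catastrophe time
    n = int(tau / dt)
    x = x0
    for dw in rng.normal(0.0, np.sqrt(dt), size=n):
        x += (mu - q) * x * dt + sigma * x * dw
    return x, tau

rng = np.random.default_rng(42)
runs = [simulate_stock(rng=rng) for _ in range(200)]
print("mean stock at catastrophe: %.3f" % np.mean([x for x, _ in runs]))
print("mean catastrophe time:     %.2f" % np.mean([t for _, t in runs]))
```

    Averaging over many such paths recovers, for example, the expected time to catastrophe (1/lam in the unstopped case), which is the kind of quantity the comparative statics in the paper vary against the extraction rate.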

  2. Optimal natural resources management under uncertainty with catastrophic risk

    International Nuclear Information System (INIS)

    Motoh, Tsujimura

    2004-01-01

We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we derive the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource.

  3. The shape of uncertainty: underwriting decisions in the face of catastrophic risk

    International Nuclear Information System (INIS)

    Keykhah, M.

    1998-01-01

This paper will explore how insurance and re-insurance underwriters price catastrophe risk from natural perils. It will first describe the theoretical nature of pricing risk, and outline studies of underwriting that propose analyzing decision making from a more behavioral than rational choice perspective. The paper then argues that in order to provide the appropriate context for probability (which is the focus of the studies on decision making under uncertainty), it may be helpful to look at the nature of choice within a market and organizational context. Moreover, the nature of probability itself is explored with a view to constructing a broader analysis. Finally, it will be argued that the causal framework of the underwriter, in addition to inductive reasoning, provides a shape to uncertainty. (author)

  4. Penetration of n-hexadecane and water into wood under conditions simulating catastrophic floods

    Science.gov (United States)

Ganna Baglayeva; Wayne S. Seames; Charles R. Frihart; Jane O'Dell; Evguenii I. Kozliak

    2017-01-01

    To simulate fuel oil spills occurring during catastrophic floods, short-term absorption of two chemicals, n-hexadecane (representative of semivolatile organic compounds in fuel oil) and water, into southern yellow pine was gravimetrically monitored as a function of time at ambient conditions. Different scenarios were run on the basis of (1) the...

  5. Uncertainty in Simulating Wheat Yields Under Climate Change

    Science.gov (United States)

Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thorburn, P. J.; Rotter, R. P.; Cammarano, D.; et al.

    2013-01-01

Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
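
    The headline attribution — more uncertainty from crop models than from downscaled climate models — can be illustrated with a simple variance decomposition over a synthetic impact matrix. This is a sketch only; the ensemble sizes and noise scales below are invented, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_crop, n_gcm = 27, 16
# synthetic yield-impact matrix (% change), crop models x GCMs
impact = (rng.normal(0, 8, size=(n_crop, 1))         # crop-model signal
          + rng.normal(0, 3, size=(1, n_gcm))        # GCM signal
          + rng.normal(0, 2, size=(n_crop, n_gcm)))  # interaction/noise

grand = impact.mean()
var_crop = np.mean((impact.mean(axis=1) - grand) ** 2)  # across crop models
var_gcm = np.mean((impact.mean(axis=0) - grand) ** 2)   # across GCMs
print(f"crop-model share of main-effect variance: "
      f"{var_crop / (var_crop + var_gcm):.0%}")
```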

  6. The impact of possible climate catastrophes on global warming policy

    International Nuclear Information System (INIS)

    Baranzini, Andrea; Chesney, Marc; Morisset, Jacques

    2003-01-01

Recent studies on global warming have introduced the inherent uncertainties associated with the costs and benefits of climate policies and have often shown that abatement policies are likely to be less aggressive or postponed in comparison to those resulting from traditional cost-benefit analyses (CBA). Yet, those studies have failed to include the possibility of sudden climate catastrophes. The aim of this paper is to account simultaneously for possible continuous and discrete damages resulting from global warming, and to analyse their implications for the optimal path of abatement policies. Our approach is related to the new literature on investment under uncertainty and relies on recent developments in real options theory, in which we incorporate negative jumps (climate catastrophes) into the stochastic process corresponding to the net benefits associated with the abatement policies. The impacts of continuous and discrete climatic risks can therefore be considered separately. Our numerical applications lead to two main conclusions: (i) gradual, continuous uncertainty in the global warming process is likely to delay the adoption of abatement policies, as found in previous studies, with respect to the standard CBA; however, (ii) the possibility of climate catastrophes accelerates the implementation of these policies as their net discounted benefits increase significantly.

  7. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

…of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles.

  8. The impact of climate change on catastrophe risk models : implications for catastrophe risk markets in developing countries

    OpenAIRE

    Seo, John; Mahul, Olivier

    2009-01-01

    Catastrophe risk models allow insurers, reinsurers and governments to assess the risk of loss from catastrophic events, such as hurricanes. These models rely on computer technology and the latest earth and meteorological science information to generate thousands if not millions of simulated events. Recently observed hurricane activity, particularly in the 2004 and 2005 hurricane seasons, i...

  9. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    Science.gov (United States)

    He, F.

    2015-12-01

This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, illustrated here with the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated TC characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable in degree to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
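
    A Latin-Hypercube design of the kind described is straightforward to generate with scipy; the eight parameter ranges below are placeholders, not the CAM values used in the study:

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical (low, high) ranges for 8 physics parameters; in the study
# the ZM deep-convection entrainment rate would be one such column.
lower = np.array([0.5e-3, 0.1, 1.0, 0.05, 0.5, 10.0, 0.2, 1.0e-6])
upper = np.array([2.0e-3, 0.9, 5.0, 0.50, 2.0, 50.0, 0.8, 1.0e-5])

sampler = qmc.LatinHypercube(d=8, seed=42)
unit = sampler.random(n=64)                # 64 members in [0, 1)^8
members = qmc.scale(unit, lower, upper)    # one row per ensemble member

for i, p in enumerate(members[:3]):
    print(f"member {i:02d}:", np.round(p, 6))  # each row drives one AGCM run
```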

  10. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two-degree-of-freedom nonlinear spring-mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • The developed approach is applicable to simulation-based modeling problems.
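
    As a sketch of the Bayesian-model-averaging ingredient (the Dempster–Shafer part is omitted), the snippet below weights competing model predictions by BIC-approximated posterior probabilities; the likelihoods, parameter counts and predictions are invented for illustration:

```python
import numpy as np

def bma_weights(log_likelihoods, n_params, n_data):
    """Approximate posterior model weights from BIC, assuming equal priors."""
    ll = np.asarray(log_likelihoods, float)
    k = np.asarray(n_params, float)
    bic = -2.0 * ll + k * np.log(n_data)
    w = np.exp(-0.5 * (bic - bic.min()))   # shift for numerical stability
    return w / w.sum()

# three hypothetical model variants fit to the same 50 observations
w = bma_weights([-120.3, -118.9, -125.0], n_params=[3, 5, 4], n_data=50)
preds = np.array([12.1, 12.8, 11.5])       # each model's point prediction
print("weights:", w.round(3), " BMA prediction:", round(float(w @ preds), 3))
```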

  11. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in the single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results.

  12. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

ALVIN, KENNETH F.; OBERKAMPF, WILLIAM L.; RUTHERFORD, BRIAN M.; DIEGERT, KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  13. Uncertainty in Simulating Wheat Yields Under Climate Change

    Energy Technology Data Exchange (ETDEWEB)

Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A.J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, J.E.; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.

  14. Insights into Participants' Behaviours in Educational Games, Simulations and Workshops: A Catastrophe Theory Application to Motivation.

    Science.gov (United States)

    Cryer, Patricia

    1988-01-01

    Develops models for participants' behaviors in games, simulations, and workshops based on Catastrophe Theory and Herzberg's two-factor theory of motivation. Examples are given of how these models can be used, both for describing and understanding the behaviors of individuals, and for eliciting insights into why participants behave as they do. (11…

  15. Effect of monthly areal rainfall uncertainty on streamflow simulation

    Science.gov (United States)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall; rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation and the respective average ranges using stochastic
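
    The quality scores quoted above are coefficients of efficiency (Nash-Sutcliffe) and percent bias; a minimal implementation, with made-up monthly flows, looks like this:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Coefficient of efficiency used to score monthly flow simulations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    """Percent bias of total simulated volume relative to observed volume."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

obs = [53.0, 41.2, 18.7, 9.9, 35.4, 80.1]   # hypothetical monthly flows
sim = [48.5, 44.0, 20.1, 12.3, 30.8, 75.9]
print(f"CE = {nash_sutcliffe(obs, sim):.2f}, bias = {pbias(obs, sim):.1f}%")
```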

  16. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

The principles of Modeling and Simulation (M and S) are interpreted through a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, which aim to build a framework for quantifying the uncertainties of M and S. (authors)

  17. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  18. Fukushinobyl, the impossible catastrophe

    International Nuclear Information System (INIS)

    Boceno, Laurent

    2012-01-01

With the emergence of a variety of health and environmental crises and catastrophes (Seveso, Bhopal, Chernobyl, AIDS, contaminated blood, mad cow disease, influenza), the author proposes some thoughts on the idea that we are no longer in the era of industrial societies but in that of risk societies. He focuses in particular on Chernobyl and Fukushima to analyse how a social framework is built up to integrate forms of institutionalisation of multifaceted vulnerability, these institutional logics becoming latent social pathologies. In this respect, he discusses more particularly the catastrophic dimension of nuclear power. He shows how what can be considered a risk is socialised and concealed as a matter of priority, and then addresses the management of the consequences of Chernobyl and how it is used to address the present situation in Japan. He notably outlines a kind of collusion between the WHO and the IAEA on nuclear issues. In this respect, he recalls a statement by the WHO saying that, from a mental health point of view, the most satisfying solution for the future of peaceful uses of nuclear energy would be the emergence of a new generation who have learned to cope with ignorance and uncertainty

  19. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    Science.gov (United States)

    Marsden, Alison

    2009-11-01

Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
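
    A non-intrusive stochastic collocation step of the kind referred to can be sketched for a single Gaussian input using Gauss-Hermite quadrature; the `model` below is a toy stand-in for an expensive flow solver, not the authors' code:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss

def collocate(model, mean, std, n_nodes=9):
    """Non-intrusive stochastic collocation for one Gaussian input:
    evaluate the model at quadrature nodes and recover output moments."""
    x, w = hermegauss(n_nodes)            # probabilists' Hermite rule
    w = w / np.sqrt(2.0 * np.pi)          # normalize to N(0,1) weights
    y = np.array([model(mean + std * xi) for xi in x])
    m1 = np.sum(w * y)
    m2 = np.sum(w * y ** 2)
    return m1, np.sqrt(m2 - m1 ** 2)

# toy stand-in: an output that depends nonlinearly on an uncertain input
mu, sd = collocate(lambda r: 1.0 / (1.0 + r) ** 2, mean=0.5, std=0.1)
print(f"output mean = {mu:.4f}, std = {sd:.4f}")
```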

  20. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    Energy Technology Data Exchange (ETDEWEB)

    Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2017-04-15

Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  1. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on this modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.

  2. The use of sequential indicator simulation to characterize geostatistical uncertainty

    International Nuclear Information System (INIS)

    Hansen, K.M.

    1992-10-01

Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors such as the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.

  3. Systematic uncertainties on Monte Carlo simulation of lead based ADS

    International Nuclear Information System (INIS)

    Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.

    1999-01-01

Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainty, both in the nuclear data and in the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options both for the data and for the methodology, and evaluating the relevance of the different uncertainties. (author)

  4. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study comparing repeated experimental data with simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study.

  5. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

Of the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross-section uncertainties. Two methods for propagating cross-section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.

  6. Simulation of the catastrophic floods caused by extreme rainfall events - Uh River basin case study

    OpenAIRE

    Pekárová, Pavla; Halmová, Dana; Mitková, Veronika

    2005-01-01

The extreme rainfall events in Central and Eastern Europe in August 2002 raise the question of how other basins would respond to such rainfall situations. Such theorisation helps us to arrange in advance the activities necessary in the basin to reduce the consequences of the assumed disaster. The aim of the study is to examine the response of the Uh River basin (Slovakia, Ukraine) to the simulated catastrophic rainfall events of August 2002. Two precipitation scenarios, sc1 and sc2, were created. Th...

  7. Expected utility and catastrophic risk in a stochastic economy-climate model

    Energy Technology Data Exchange (ETDEWEB)

    Ikefuji, M. [Institute of Social and Economic Research, Osaka University, Osaka (Japan); Laeven, R.J.A.; Magnus, J.R. [Department of Econometrics and Operations Research, Tilburg University, Tilburg (Netherlands); Muris, C. [CentER, Tilburg University, Tilburg (Netherlands)

    2010-11-15

    In the context of extreme climate change, we ask how to conduct expected utility analysis in the presence of catastrophic risks. Economists typically model decision making under risk and uncertainty by expected utility with constant relative risk aversion (power utility); statisticians typically model economic catastrophes by probability distributions with heavy tails. Unfortunately, the expected utility framework is fragile with respect to heavy-tailed distributional assumptions. We specify a stochastic economy-climate model with power utility and explicitly demonstrate this fragility. We derive necessary and sufficient compatibility conditions on the utility function to avoid fragility and solve our stochastic economy-climate model for two examples of such compatible utility functions. We further develop and implement a procedure to learn the input parameters of our model and show that the model thus specified produces quite robust optimal policies. The numerical results indicate that higher levels of uncertainty (heavier tails) lead to less abatement and consumption, and to more investment, but this effect is not unlimited.
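
    The fragility the authors demonstrate analytically is easy to observe numerically: with power utility, the sample mean of utility converges for light-tailed consumption but drifts without bound when log-consumption is heavy-tailed (Student-t). A hedged sketch, with invented parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
gamma = 3.0                                   # constant relative risk aversion
u = lambda c: c ** (1.0 - gamma) / (1.0 - gamma)   # power (CRRA) utility

for n in [10**3, 10**5, 10**7]:
    c_light = np.exp(rng.normal(0.0, 0.5, n))        # lognormal consumption
    c_heavy = np.exp(0.5 * rng.standard_t(df=2, size=n))  # heavy-tailed
    print(f"n={n:>9}: lognormal E[u]={u(c_light).mean():9.3f}  "
          f"t-tailed E[u]={u(c_heavy).mean():14.1f}")
# the second column keeps growing in magnitude: the expectation is -infinity
```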

  8. Wave Energy Converter Annual Energy Production Uncertainty Using Simulations

    Directory of Open Access Journals (Sweden)

    Clayton E. Hiles

    2016-09-01

Critical to evaluating the economic viability of a wave energy project is: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with said estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantification of wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of 2 WECs of different operating principle at 4 different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2% to 20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
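
    A minimal version of the MAEP calculation and its Monte Carlo uncertainty can be sketched as follows; the power matrix, sea-state record and bin counts are invented, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(7)
# hypothetical WEC power matrix (kW): rows are Hs bins, columns are Te bins
power = np.array([[ 20.0,  40.0,  35.0],
                  [ 60.0, 120.0, 100.0],
                  [110.0, 220.0, 180.0]])

# 20 synthetic years of hourly sea states, as (Hs-bin, Te-bin) indices
hs = rng.integers(0, 3, size=(20, 8760))
te = rng.integers(0, 3, size=(20, 8760))
annual_mwh = power[hs, te].sum(axis=1) / 1000.0   # MWh per simulated year

# bootstrap the 20 available years to get the spread of the MAEP estimate
maep = np.array([rng.choice(annual_mwh, size=20, replace=True).mean()
                 for _ in range(10000)])
print(f"MAEP = {annual_mwh.mean():.0f} MWh, "
      f"95% interval = [{np.percentile(maep, 2.5):.0f}, "
      f"{np.percentile(maep, 97.5):.0f}] MWh")
```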

  9. Catastrophes control problem

    International Nuclear Information System (INIS)

    Velichenko, V.V.

    1994-01-01

The problem of catastrophe control is discussed. Catastrophe control aims to withdraw critical engineering structures from the path of a catastrophe. The mathematical framework of catastrophe control systems is constructed. It determines the principles for filling the systems with concrete physical content and, simultaneously, permits modern control methods to be employed for the synthesis of optimal withdrawal strategies for protected objects

  10. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  11. Quantification of Uncertainty in Thermal Building Simulation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Haghighat, F.; Frier, Christian

    In order to quantify uncertainty in thermal building simulation stochastic modelling is applied on a building model. An application of stochastic differential equations is presented in Part 1 comprising a general heat balance for an arbitrary number of loads and zones in a building to determine...

  12. Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations

    Science.gov (United States)

Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide

    2017-01-01

Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations via parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the ratio of signal to noise (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for global and varied magnitude for different basins) than those in the NOSOC, which are particularly significant in most areas of Asia and northern areas to the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in each basin. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts raise an urgent need for GHM development with regard to a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs for not only

  13. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is greatly appreciated.

  14. Assessing the Uncertainty of Tropical Cyclone Simulations in NCAR's Community Atmosphere Model

    Directory of Open Access Journals (Sweden)

    Kevin A Reed

    2011-08-01

The paper explores the impact of initial-data, parameter and structural model uncertainty on the simulation of a tropical cyclone-like vortex in the National Center for Atmospheric Research's (NCAR) Community Atmosphere Model (CAM). An analytic technique is used to initialize the model with an idealized weak vortex that develops into a tropical cyclone over ten simulation days. A total of 78 ensemble simulations are performed at horizontal grid spacings of 1.0°, 0.5° and 0.25° using two recently released versions of the model, CAM 4 and CAM 5. The ensemble members represent simulations with random small-amplitude perturbations of the initial conditions, small shifts in the longitudinal position of the initial vortex and runs with slightly altered model parameters. The main distinction between CAM 4 and CAM 5 lies within the physical parameterization suite, and the simulations with both CAM versions at the varying resolutions assess the structural model uncertainty. At all resolutions storms are produced with many tropical cyclone-like characteristics. The CAM 5 simulations exhibit more intense storms than CAM 4 by day 10 at the 0.5° and 0.25° grid spacings, while the CAM 4 storm at 1.0° is stronger. There are also distinct differences in the shapes and vertical profiles of the storms in the two variants of CAM. The ensemble members show no distinction between the initial-data and parameter uncertainty simulations. At day 10 they produce ensemble root-mean-square deviations from an unperturbed control simulation on the order of 1–5 m s-1 for the maximum low-level wind speed and 2–10 hPa for the minimum surface pressure. However, there are large differences between the two CAM versions at identical horizontal resolutions. This suggests that the structural uncertainty is more dominant than the initial-data and parameter uncertainties in this study. The uncertainty among the ensemble members is assessed and quantified.

  15. Laboratory tests of catastrophic disruption of rotating bodies

    Science.gov (United States)

    Morris, A. J. W.; Burchell, M. J.

    2017-11-01

    The results of catastrophic disruption experiments on static and rotating targets are reported. The experiments used cement spheres of diameter 10 cm as the targets. Impacts were by mm sized stainless steel spheres at speeds of between 1 and 7.75 km s-1. Energy densities (Q) in the targets ranged from 7 to 2613 J kg-1. The experiments covered both the cratering and catastrophic disruption regimes. For static, i.e. non-rotating targets the critical energy density for disruption (Q*, the value of Q when the largest surviving target fragment has a mass equal to one half of the pre-impact target mass) was Q* = 1447 ± 90 J kg-1. For rotating targets (median rotation frequency of 3.44 Hz) we found Q* = 987 ± 349 J kg-1, a reduction of 32% in the mean value. This lower value of Q* for rotating targets was also accompanied by a larger scatter on the data, hence the greater uncertainty. We suggest that in some cases the rotating targets behaved as static targets, i.e. broke up with the same catastrophic disruption threshold, but in other cases the rotation helped the break up causing a lower catastrophic disruption threshold, hence both the lower value of Q* and the larger scatter on the data. The fragment mass distributions after impact were similar in both the static and rotating target experiments with similar slopes.
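
    For orientation, the energy density Q = (projectile kinetic energy)/(target mass) for a mid-range shot in this setup can be reproduced with assumed material densities (illustrative values, not figures quoted by the authors):

```python
import numpy as np

rho_steel, rho_cement = 7800.0, 2000.0   # kg m^-3 (assumed densities)
r_proj, r_target = 1.5e-3, 0.05          # m: mm-scale sphere, 10 cm target
v = 5000.0                               # impact speed, m s^-1

m_proj = rho_steel * 4/3 * np.pi * r_proj ** 3       # ~1.1e-4 kg
m_target = rho_cement * 4/3 * np.pi * r_target ** 3  # ~1.05 kg
Q = 0.5 * m_proj * v ** 2 / m_target                 # J kg^-1
print(f"Q = {Q:.0f} J/kg")  # ~1.3e3 J/kg, near the static Q* of ~1447 J/kg
```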

  16. Modeling, numerical simulation, and nonlinear dynamic behavior analysis of PV microgrid-connected inverter with capacitance catastrophe

    Science.gov (United States)

    Li, Sichen; Liao, Zhixian; Luo, Xiaoshu; Wei, Duqu; Jiang, Pinqun; Jiang, Qinghong

    2018-02-01

The value of the output capacitance (C) should be carefully considered when designing a photovoltaic (PV) inverter, since it can distort the working state of the circuit and cause the circuit to produce nonlinear dynamic behavior. From Kirchhoff’s laws and the characteristics of an ideal operational amplifier, a strict piecewise-linear state equation is obtained, and a circuit simulation model is constructed to study, as functions of the system parameters (time, C), the current through the inductor of inductance L and the voltage across the capacitor of capacitance C. The simulation model solves the state equations using Runge-Kutta methods. This study focuses on predicting faults in the circuit from two aspects: harmonic distortion and simulation results. Moreover, the presented model is also used to investigate the working state of the system in the case of a load-capacitance catastrophe. The nonlinear dynamic behaviors in the inverter are simulated and verified.
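
    The state-equation-plus-Runge-Kutta approach described can be sketched for a plain LC output filter driven by a square-wave bridge voltage; the component values and the fourth-order scheme below are illustrative assumptions, not the paper's circuit:

```python
import numpy as np

def rk4(f, x, t, dt):
    """One classical fourth-order Runge-Kutta step for x' = f(t, x)."""
    k1 = f(t, x)
    k2 = f(t + dt / 2, x + dt / 2 * k1)
    k3 = f(t + dt / 2, x + dt / 2 * k2)
    k4 = f(t + dt, x + dt * k3)
    return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

L, C, R = 2e-3, 10e-6, 5.0   # filter inductance, capacitance, load (assumed)
vin = lambda t: 311.0 * np.sign(np.sin(2 * np.pi * 50 * t))  # bridge output

def f(t, x):
    iL, vC = x               # state: inductor current, capacitor voltage
    return np.array([(vin(t) - vC) / L,
                     (iL - vC / R) / C])

x, dt = np.array([0.0, 0.0]), 1e-7
for k in range(200000):      # integrate over 20 ms, one mains period
    x = rk4(f, x, k * dt, dt)
print("iL = %.3f A, vC = %.2f V" % (x[0], x[1]))
```

    Sweeping C in such a model and inspecting the resulting waveforms (e.g. their harmonic distortion) is the kind of parameter study the abstract describes.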

  17. "But it might be a heart attack" : intolerance of uncertainty and panic disorder symptoms

    NARCIS (Netherlands)

Carleton, R Nicholas; Duranceau, Sophie; Freeston, Mark H; Boelen, Paul A; McCabe, Randi E; Antony, Martin M

    Panic disorder models describe interactions between feared anxiety-related physical sensations (i.e., anxiety sensitivity; AS) and catastrophic interpretations therein. Intolerance of uncertainty (IU) has been implicated as necessary for catastrophic interpretations in community samples. The current

  18. On sociological catastrophe analysis

    International Nuclear Information System (INIS)

    Clausen, L.

    1974-01-01

The present paper deals with the standard terms of sociological catastrophe theory to date, collective behaviour during catastrophes, and consequences for empirical catastrophe sociology. (RW)

  19. Measurement, simulation and uncertainty assessment of implant heating during MRI

    International Nuclear Information System (INIS)

    Neufeld, E; Kuehn, S; Kuster, N; Szekely, G

    2009-01-01

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.
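
    An uncertainty budget of the kind mentioned is typically combined in quadrature (root sum of squares) with a coverage factor; the component names and magnitudes below are hypothetical, not the paper's budget:

```python
import numpy as np

# hypothetical relative standard uncertainties (%) of the evaluation chain
budget = {
    "probe calibration":    5.0,
    "probe positioning":    6.0,
    "phantom conductivity": 4.0,
    "lead placement":       9.0,
    "field homogeneity":    3.0,
}
combined = np.sqrt(sum(u ** 2 for u in budget.values()))
expanded = 2.0 * combined        # coverage factor k = 2 (~95 % confidence)
print(f"combined: {combined:.1f} %, expanded (k=2): {expanded:.1f} %")
```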

  20. Measurement, simulation and uncertainty assessment of implant heating during MRI

    Energy Technology Data Exchange (ETDEWEB)

Neufeld, E; Kuehn, S; Kuster, N [Foundation for Research on Information Technologies in Society (IT'IS), Zeughausstr. 43, 8004 Zurich (Switzerland); Szekely, G [Computer Vision Laboratory, Swiss Federal Institute of Technology (ETHZ), Sternwartstr 7, ETH Zentrum, 8092 Zurich (Switzerland)], E-mail: neufeld@itis.ethz.ch

    2009-07-07

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  1. Uncertainty of input data for room acoustic simulations

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho; Marbjerg, Gerd; Brunskog, Jonas

    2016-01-01

Although many room acoustic simulation models have been well established, simulation results will never be accurate with inaccurate and uncertain input data. This study addresses inappropriateness and uncertainty of input data for room acoustic simulations. Firstly, the random incidence absorption… and scattering coefficients are insufficient when simulating highly non-diffuse rooms. More detailed information, such as the phase and angle dependence, can greatly improve the simulation results of pressure-based geometrical and wave-based models at frequencies well below the Schroeder frequency. Phase… summarizes potential advanced absorption measurement techniques that can improve the quality of input data for room acoustic simulations. Lastly, plenty of uncertain input data are copied from unreliable sources. Software developers and users should be careful when spreading such uncertain input data. More…

  2. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    NARCIS (Netherlands)

    Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.

    2008-01-01

    By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was

  3. Catastrophic Failure and Critical Scaling Laws of Fiber Bundle Material

    Directory of Open Access Journals (Sweden)

    Shengwang Hao

    2017-05-01

This paper presents a spring-fiber bundle model used to describe the failure process induced by energy release in heterogeneous materials. The conditions that induce catastrophic failure are determined by geometric conditions and energy equilibrium. It is revealed that the relative rates of deformation of, and damage to, the fiber bundle with respect to the boundary controlling displacement ε0 exhibit universal power-law behavior near the catastrophic point, with a critical exponent of −1/2. The proportion of the rate of response with respect to acceleration exhibits a linear relationship with increasing displacement in the vicinity of the catastrophic point. This allows catastrophic failure to be predicted immediately prior to failure by extrapolating the trajectory of this relationship as it asymptotes to zero. Monte Carlo simulations are performed and these two critical scaling laws are confirmed.
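
    An equal-load-sharing fiber bundle under controlled displacement, the classic setting for such scaling laws, can be Monte Carlo simulated in a few lines; the Weibull strength distribution and sizes are illustrative choices, not the paper's spring-fiber model:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200000
thresholds = np.sort(rng.weibull(2.0, N))   # random fiber strengths

eps = np.linspace(0.0, 2.5, 500)            # controlled displacement
surviving = N - np.searchsorted(thresholds, eps)  # fibers intact at each eps
force = eps * surviving / N                 # equal-load-sharing bundle force

peak = np.argmax(force)
print(f"peak load at eps = {eps[peak]:.3f}; "
      f"{100 * (1 - surviving[peak] / N):.1f}% of fibers broken at peak")
# under displacement control the bundle weakens gradually past the peak;
# under load control the same peak marks catastrophic failure
```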

  4. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    International Nuclear Information System (INIS)

    Raedt, H De; Michielsen, K

    2014-01-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)

  5. Non-catastrophic and catastrophic fractures in racing Thoroughbreds at the Hong Kong Jockey Club.

    Science.gov (United States)

    Sun, T C; Riggs, C M; Cogger, N; Wright, J; Al-Alawneh, J I

    2018-04-19

Reports of fractures in racehorses have predominantly focused on catastrophic injuries, and there are limited data identifying the location and incidence of fractures that did not result in a fatal outcome. To describe the nature and the incidence of non-catastrophic and catastrophic fractures in Thoroughbreds racing at the Hong Kong Jockey Club (HKJC) over seven racing seasons. Retrospective cohort study. Data on fractures sustained by horses while racing and on race characteristics were extracted from the HKJC Veterinary Management Information System (VMIS) and Racing Information System (RIS), respectively. The fracture event was determined from the first clinical entry for each specific injury. The incidence rates of non-catastrophic and catastrophic fractures were calculated per 1000 racing starts for racetrack, age, racing season, sex and trainer. 179 first fracture events occurred in 64,807 racing starts. The incidence rate of non-catastrophic fractures was 2.2 per 1000 racing starts and that of catastrophic fractures was 0.6 per 1000 racing starts. Fractures of the proximal sesamoid bones represented 55% of all catastrophic fractures, while the most common non-catastrophic fractures involved the carpus and the first phalanx. Significant associations were detected between the incidence of non-catastrophic fractures and sex, trainer and racing season. The first fracture event was used to calculate the incidence rate in this study, which may have resulted in underestimation of the true incidence rate of fractures in this population. However, given the low number of recorded fracture events compared to the size of the study population, this underestimation is likely to be small. There were 3.6 times as many non-catastrophic fractures as catastrophic fractures in Thoroughbreds racing in Hong Kong between 2004 and 2011. Non-catastrophic fractures interfere with race training schedules and may predispose to catastrophic fracture. Future analytical studies on non-catastrophic
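
    The reported rates are internally consistent, as a quick check shows: 179 first-fracture events over 64,807 starts give roughly 2.8 per 1000 starts, i.e. the sum of the non-catastrophic (2.2) and catastrophic (0.6) rates, and the ratio 2.2/0.6 gives the quoted factor of about 3.6:

```python
# consistency check on the reported rates (first-fracture events only)
events, starts = 179, 64807
print(f"{1000 * events / starts:.1f} per 1000 starts")  # ~2.8 = 2.2 + 0.6
print(f"non-catastrophic : catastrophic = {2.2 / 0.6:.1f} : 1")
```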

  6. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
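
    HBV-Ensemble itself is a MATLAB GUI. As a language-neutral illustration of the ensemble idea only, the sketch below runs a deliberately simple one-bucket model (not HBV) under an ensemble of perturbed parameters; the forcing, parameter names and ranges are all invented.

```python
# Toy ensemble streamflow sketch: a single linear bucket, not HBV;
# forcing, parameter names and ranges are invented.
import numpy as np

rng = np.random.default_rng(42)
days = 365
precip = rng.gamma(shape=0.4, scale=8.0, size=days)            # [mm/d]
pet = 2.0 + 1.5 * np.sin(2 * np.pi * np.arange(days) / 365)    # potential ET

def simulate(fc, k):
    """fc: soil storage capacity [mm]; k: recession constant [1/d]."""
    s, q = 0.0, np.empty(days)
    for t in range(days):
        s += precip[t]
        s -= min(s, pet[t] * min(1.0, s / fc))   # actual ET, storage-limited
        q[t] = k * max(0.0, s - fc)              # runoff from excess storage
        s -= q[t]
    return q

# Ensemble: sample the two parameters from uniform prior ranges.
ens = np.array([simulate(fc=rng.uniform(50, 300), k=rng.uniform(0.05, 0.5))
                for _ in range(200)])
lo, hi = np.percentile(ens, [5, 95], axis=0)
print(f"mean width of the 5-95% runoff band: {np.mean(hi - lo):.2f} mm/d")
```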

  7. Seizing Catastrophes

    DEFF Research Database (Denmark)

    Kublitz, Anja

    2013-01-01

    to a distant past but takes place in the present. They use the term Nakba not only to refer to the catastrophe of 1948 but also to designate current catastrophes, such as the Danish Muhammad cartoons affair in 2005 and the Israeli invasion of Gaza in 2008. Through an analysis of the 60th commemoration...

  8. Uncertainty-based simulation-optimization using Gaussian process emulation: Application to coastal groundwater management

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ketabchi, Hamed

    2017-12-01

    Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
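
    As an illustration of the emulation step only, the sketch below fits a plain GP regressor with an RBF kernel and fixed hyperparameters to a cheap stand-in function; in the study this role is played by the expensive seawater-intrusion simulator, and hyperparameters would normally be optimized rather than fixed.

```python
# Minimal GP-regression emulator (RBF kernel, fixed hyperparameters);
# simulator() is a cheap stand-in, not a seawater-intrusion model.
import numpy as np

def simulator(x):                      # placeholder for an expensive model run
    return np.sin(3 * x) + 0.5 * x**2

def rbf(a, b, ell=0.3, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(0, 2, 12)              # a few "expensive" training runs
y = simulator(X)

Xs = np.linspace(0, 2, 200)            # cheap emulator predictions
K = rbf(X, X) + 1e-8 * np.eye(len(X))  # jitter for numerical stability
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)                  # posterior mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))

print(f"max emulator error: {np.max(np.abs(mean - simulator(Xs))):.4f}")
print(f"max posterior sd  : {sd.max():.4f}")
```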

  9. Climate catastrophes

    Science.gov (United States)

    Budyko, Mikhail

    1999-05-01

    Climate catastrophes, which many times occurred in the geological past, caused the extinction of large or small populations of animals and plants. Changes in the terrestrial and marine biota caused by the catastrophic climate changes undoubtedly resulted in considerable fluctuations in global carbon cycle and atmospheric gas composition. Primarily, carbon dioxide and other greenhouse gas contents were affected. The study of these catastrophes allows a conclusion that climate system is very sensitive to relatively small changes in climate-forcing factors (transparency of the atmosphere, changes in large glaciations, etc.). It is important to take this conclusion into account while estimating the possible consequences of now occurring anthropogenic warming caused by the increase in greenhouse gas concentration in the atmosphere.

  10. Error correction in multi-fidelity molecular dynamics simulations using functional uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu

    2017-04-01

    We use functional (Fréchet) derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions, as opposed to their parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard–Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
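
    In symbols, the first-order prediction described here can be written as follows (our notation, not taken from the paper):

```latex
% First-order functional (Frechet) correction for an observable A when the
% interatomic potential changes from V_0 to V_0 + \delta V (notation assumed):
A[V_0 + \delta V] \;\approx\; A[V_0]
  + \int \frac{\delta A}{\delta V(r)}\bigg|_{V_0}\, \delta V(r)\,\mathrm{d}r,
\qquad \delta V(r) = V_{\mathrm{new}}(r) - V_0(r).
```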

  11. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.

  12. Direct catastrophic injury in sports.

    Science.gov (United States)

    Boden, Barry P

    2005-11-01

    Catastrophic sports injuries are rare but tragic events. Direct (traumatic) catastrophic injury results from participating in the skills of a sport, such as a collision in football. Football is associated with the greatest number of direct catastrophic injuries for all major team sports in the United States. Pole vaulting, gymnastics, ice hockey, and football have the highest incidence of direct catastrophic injuries for sports in which males participate. In most sports, the rate of catastrophic injury is higher at the collegiate than at the high school level. Cheerleading is associated with the highest number of direct catastrophic injuries for all sports in which females participate. Indirect (nontraumatic) injury is caused by systemic failure as a result of exertion while participating in a sport. Cardiovascular conditions, heat illness, exertional hyponatremia, and dehydration can cause indirect catastrophic injury. Understanding the common mechanisms of injury and prevention strategies for direct catastrophic injuries is critical in caring for athletes.

  13. An application of Mean Escape Time and metapopulation on forestry catastrophe insurance

    Science.gov (United States)

    Li, Jiangcheng; Zhang, Chunmin; Liu, Jifa; Li, Zhen; Yang, Xuan

    2018-04-01

    A forestry catastrophe insurance model for losses due to forest pest infestations and disease epidemics is developed by combining metapopulation dynamics with the statistical properties of the Mean Escape Time (MET). The probability of an outbreak of forestry catastrophe loss and the catastrophe loss payment time are investigated using the MET. Forestry loss data from China are used for model simulation. The results indicate that: (1) the model and its analytical results fit the data well; (2) an outbreak of forestry catastrophe loss, or a catastrophe loss payment, can occur under conditions of large patch area and patch structure, a high system factor, a low extinction rate, strong multiplicative noise, and additive noise with cross-correlation strength in a high range; (3) an optimal catastrophe loss payment time (MET) can be identified by choosing a proper value of the multiplicative noise while keeping the additive noise in a low range of values and the cross-correlation strength in a high range.

  14. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data is most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  15. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
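
    For readers unfamiliar with the technique, a minimal non-intrusive PCE looks roughly like the following: one standardized Gaussian input, a probabilists' Hermite basis fitted by least squares, and moments read off the coefficients. The model function here is an invented stand-in, not an SDTRIM run.

```python
# Non-intrusive polynomial chaos sketch: one standard-normal input,
# probabilists' Hermite basis, least-squares fit; model() is a stand-in,
# not an SDTRIM run.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model(xi):                       # placeholder response, e.g. a sputter yield
    return np.exp(0.3 * xi) + 0.1 * xi**2

rng = np.random.default_rng(1)
xi = rng.standard_normal(400)        # standardized uncertain input
y = model(xi)

deg = 6
Phi = hermevander(xi, deg)           # He_0..He_deg evaluated at the samples
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Orthogonality E[He_n He_m] = n! delta_nm gives the moments directly.
fact = np.array([math.factorial(n) for n in range(deg + 1)])
print(f"PCE mean {coef[0]:.4f}, variance {np.sum(coef[1:]**2 * fact[1:]):.4f}")
print(f"MC  mean {y.mean():.4f}, variance {y.var(ddof=1):.4f}")
```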

  16. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
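
    Numerically, the sampled-population approach described above reduces to regressing measurements against gold-standard values; a toy version, with every distribution invented for illustration:

```python
# Sketch of the simulated-population bias/precision assessment
# (all distributions invented for illustration).
import numpy as np

rng = np.random.default_rng(7)
true = rng.lognormal(mean=0.0, sigma=0.8, size=500)   # "gold standard" masses

# Stand-in measurement model: a multiplicative matrix-effect bias
# plus lognormal noise.
measured = 1.15 * true * rng.lognormal(0.0, 0.10, size=500)

# Bias model: fit log(measured) = a + b * log(true).
b, a = np.polyfit(np.log(true), np.log(measured), 1)
resid = np.log(measured) - (a + b * np.log(true))
print(f"multiplicative bias: {np.exp(a):.3f} (simulated truth: 1.15)")
print(f"slope: {b:.3f} (simulated truth: 1.0)")
print(f"relative precision (sd of log residuals): {resid.std(ddof=2):.3f}")
```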

  17. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    Science.gov (United States)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    This article presents first ideas for extending the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (the Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common approach of variance propagation and quality assessment of adjustment parameters via their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
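
    A toy version of the second (Monte-Carlo) step might look as follows: a levelling network with two unknown heights, GUM-style standard uncertainties on the raw observations, and a point cloud of adjusted coordinates. All numbers are invented.

```python
# Monte-Carlo adjustment sketch: toy levelling network, two unknown
# heights H1 and H2, benchmark fixed at 0 (all values invented).
import numpy as np

A = np.array([[ 1.0,  0.0],    # obs 1: H1 - 0
              [-1.0,  1.0],    # obs 2: H2 - H1
              [ 0.0, -1.0]])   # obs 3: 0 - H2
l_obs = np.array([1.203, 0.498, -1.699])   # observed height differences [m]
sigma = np.array([0.002, 0.002, 0.003])    # GUM-style standard uncertainties [m]

rng = np.random.default_rng(3)
W = np.diag(1.0 / sigma**2)
N = A.T @ W @ A
cloud = np.empty((5000, 2))
for i in range(cloud.shape[0]):
    l = l_obs + rng.normal(0.0, sigma)     # one realization of the raw data
    cloud[i] = np.linalg.solve(N, A.T @ W @ l)   # weighted LS adjustment

print("MC means:", cloud.mean(axis=0))
print("MC stds :", cloud.std(axis=0, ddof=1))
print("height correlation:", np.corrcoef(cloud.T)[0, 1].round(3))
```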

  18. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  19. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    Science.gov (United States)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.; hide

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency with which water is used to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, and accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperature and with high temperatures in combination with elevated atmospheric [CO2]. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  20. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM, however, does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants', each of which has an empirically derived numerical value. Such empirically derived ‘constants' must also have associated uncertainties which propagate through the functional relationship.
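
    The same recipe ports directly from a spreadsheet to any environment with a random number generator. For instance, for a hypothetical relationship y = k·x1/x2 with an uncertain empirical constant k (all numbers invented):

```python
# Monte Carlo propagation through y = k * x1 / x2, where the empirically
# derived "constant" k carries its own uncertainty (numbers invented).
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
x1 = rng.normal(50.0, 2.0, n)    # input quantity 1 (sd from IQC, say)
x2 = rng.normal(4.0, 0.15, n)    # input quantity 2
k = rng.normal(1.23, 0.04, n)    # empirical constant with its uncertainty

y = k * x1 / x2                  # propagate sample by sample

q = np.percentile(y, [2.5, 50.0, 97.5])
print(f"y = {y.mean():.2f}, u(y) = {y.std(ddof=1):.2f}")
print(f"95% coverage interval: [{q[0]:.2f}, {q[2]:.2f}], median {q[1]:.2f}")
```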

  2. Uncertainties in the simulation of groundwater recharge at different scales

    Directory of Open Access Journals (Sweden)

    H. Bogena

    2005-01-01

    Full Text Available Digital spatial data always imply some kind of uncertainty. The source of this uncertainty can be found in their compilation as well as in the conceptual design, which causes a more or less exact abstraction of the real world, depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties are accumulated. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainties on the calculated groundwater recharge. Distributed input errors are determined for the parameters slope and aspect using a Monte Carlo approach. Land cover classification uncertainties are analysed using the conditional probabilities of a remote sensing classification procedure. The uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.

  3. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    International Nuclear Information System (INIS)

    Shaukata, Nadeem; Shim, Hyung Jin

    2015-01-01

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the initial neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code. In this mode, a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has been properly used in the TART code for dynamic criticality calculations. For super-critical systems, the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population it started with at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems, the neutron population grows exponentially in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth at the end of each time boundary, a conventional time cut-off population control strategy is included in TDMC, and a scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems. This uncertainty is caused by the uncertainty resulting from the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of
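
    The combing-plus-scale-factor bookkeeping can be illustrated with a toy branching process. This is our own sketch, not SERPENT or TART code, and the numbers are invented:

```python
# Toy supercritical branching process with uniform combing at each time
# boundary and a running scale factor (our sketch, not SERPENT/TART code).
import numpy as np

rng = np.random.default_rng(5)
n_target, k_eff, steps = 10_000, 1.2, 20
n, scale, density = n_target, 1.0, []

for _ in range(steps):
    n = rng.poisson(k_eff, size=n).sum()   # each neutron -> Poisson(k_eff) progeny
    scale *= n / n_target                  # combed-off fraction goes into the tally
    n = n_target                           # comb the population back to target size
    density.append(scale * n_target)       # scaled density tally at the boundary

growth = np.exp(np.diff(np.log(density)).mean())
print(f"estimated growth per interval: {growth:.3f} (expected {k_eff})")
```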

  5. Assessment of effectiveness of geologic isolation systems. Geologic factors in the isolation of nuclear waste: evaluation of long-term geomorphic processes and catastrophic events

    International Nuclear Information System (INIS)

    Mara, S.J.

    1980-03-01

    SRI International has projected the rate, duration, and magnitude of geomorphic processes and events in the Southwest and Gulf Coast over the next million years. This information will be used by the Department of Energy's Pacific Northwest Laboratory (PNL) as input to a computer model, which will be used to simulate possible release scenarios and the consequences of the release of nuclear waste from geologic containment. The estimates in this report, although based on best scientific judgment, are subject to considerable uncertainty. An evaluation of the Quaternary history of the two study areas revealed that each had undergone geomorphic change in the last one million years. Catastrophic events were evaluated in order to determine their significance to the simulation model. Given available data, catastrophic floods are not expected to occur in the two study areas. Catastrophic landslides may occur in the Southwest, but because the duration of the event is brief and the amount of material moved is small in comparison to regional denudation, such events need not be included in the simulation model. Ashfalls, however, could result in removal of vegetation from the landscape, thereby causing significant increases in erosion rates. Because the estimates developed during this study may not be applicable to specific sites, general equations were presented as a first step in refining the analysis. These equations identify the general relationships among the important variables and suggest those areas of concern for which further data are required. If the current model indicates that geomorphic processes (taken together with other geologic changes) may ultimately affect the geologic containment of nuclear waste, further research may be necessary to refine this analysis for application to specific sites.

  6. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes was implemented; these are the codes used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method based on propagating the uncertainty of the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and regarded as highly important in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, using order statistics and the Wilks formula, it was determined that the minimum number of code runs required to obtain uncertainty bands that cover 95% of the population at a confidence level of 95% is 93; it is important to mention that with this method the number of runs does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty analysis of transients with the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated a closure transient of the main steam isolation valves with the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, a total loss of power (station blackout, SBO) accident was simulated with the MELCOR code, obtaining the uncertainty band for the
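
    The 93-run figure follows from the first-order, two-sided Wilks formula; as a quick check, the minimum sample size for the 95%/95% criterion can be computed directly (our own snippet, standard formula):

```python
# Two-sided, first-order Wilks formula: smallest N such that
# 1 - beta**N - N*(1 - beta)*beta**(N - 1) >= gamma, with beta = gamma = 0.95.
beta = gamma = 0.95
N = 1
while 1 - beta**N - N * (1 - beta) * beta**(N - 1) < gamma:
    N += 1
print(N)   # 93
```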

  7. The Ongoing Catastrophe

    DEFF Research Database (Denmark)

    Kublitz, Anja

    2015-01-01

    as camps. Based on fieldwork among Palestinians in the Danish camps, this article explores why my interlocutors describe their current lives as a catastrophe. Al-Nakba literally means the catastrophe and, in Palestinian national discourse, it is used to designate the event of 1948, when the Palestinians...

  8. Epistemic and aleatory uncertainties in integrated deterministic and probabilistic safety assessment: Tradeoff between accuracy and accident simulations

    International Nuclear Information System (INIS)

    Karanki, D.R.; Rahman, S.; Dang, V.N.; Zerkak, O.

    2017-01-01

    The coupling of plant simulation models and stochastic models representing failure events in Dynamic Event Trees (DET) is a framework used to model the dynamic interactions among physical processes, equipment failures, and operator responses. The integration of physical and stochastic models may additionally enhance the treatment of uncertainties. Probabilistic Safety Assessments as currently implemented propagate the (epistemic) uncertainties in failure probabilities, rates, and frequencies, while the uncertainties in the physical model (parameters) are not propagated. The coupling of deterministic (physical) and probabilistic models in integrated simulations such as DET allows both types of uncertainties to be considered. However, integrated accident simulations with epistemic uncertainties will challenge even today's high performance computing infrastructure, especially for simulations of inherently complex nuclear or chemical plants. Conversely, intentionally limiting computations for practical reasons would compromise the accuracy of results. This work investigates how to trade off accuracy and computations to quantify risk in light of both uncertainties and accident dynamics. A simple depleting tank problem that can be solved analytically is considered to examine the adequacy of a discrete DET approach. The results show that optimal allocation of computational resources between epistemic and aleatory calculations by means of convergence studies ensures accuracy within a limited budget. - Highlights: • Accident simulations considering uncertainties require intensive computations. • Tradeoff between accuracy and accident simulations is a challenge. • Optimal allocation between epistemic & aleatory computations ensures the tradeoff. • Online convergence gives an early indication of computational requirements. • Uncertainty propagation in DDET is examined on a tank problem solved analytically.
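
    The epistemic/aleatory split studied here can be sketched as a two-loop Monte Carlo on a depleting tank. All numbers below are invented and the dynamics are reduced to an analytic exponential, so this is an illustration of the nesting only, not the paper's benchmark:

```python
# Two-loop sketch for a depleting tank: the outer loop samples an epistemic
# parameter (drain coefficient), the inner loop an aleatory failure time
# (all numbers invented; tank dynamics reduced to an exponential).
import numpy as np

rng = np.random.default_rng(13)
h0, h_limit, t_end = 10.0, 2.0, 50.0

n_epi, n_ale = 200, 2000
risk = np.empty(n_epi)
for i in range(n_epi):
    k = rng.lognormal(np.log(0.05), 0.3)          # epistemic: drain coefficient
    t_fail = rng.exponential(30.0, size=n_ale)    # aleatory: makeup-valve failure
    exposure = np.maximum(t_end - t_fail, 0.0)    # depletion time after failure
    risk[i] = np.mean(h0 * np.exp(-k * exposure) < h_limit)

print(f"median risk {np.median(risk):.3f}, 90% epistemic band "
      f"[{np.percentile(risk, 5):.3f}, {np.percentile(risk, 95):.3f}]")
```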

  9. Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban; DeHart, Mark; Goluoglu, Sedat

    2017-03-01

    Advanced modeling techniques and current computational capacity make full-core TREAT simulations possible; the goal of such simulations is to understand the pre-test core and minimize the number of required calibrations. However, in order to simulate TREAT with a high degree of precision, the reactor materials and geometry must also be modeled with a high degree of precision. This paper examines how uncertainty in the reported values of boron and graphite affects simulations of TREAT.

  10. Strategic reasoning and bargaining in catastrophic climate change games

    Science.gov (United States)

    Verendel, Vilhelm; Johansson, Daniel J. A.; Lindgren, Kristian

    2016-03-01

    Two decades of international negotiations show that agreeing on emission levels for climate change mitigation is a hard challenge. However, if early warning signals were to show an upcoming tipping point with catastrophic damage, theory and experiments suggest this could simplify collective action to reduce greenhouse gas emissions. At the actual threshold, no country would have a free-ride incentive to increase emissions over the tipping point, but it remains for countries to negotiate their emission levels to reach these agreements. We model agents bargaining for emission levels using strategic reasoning to predict emission bids by others and ask how this affects the possibility of reaching agreements that avoid catastrophic damage. It is known that policy elites often use a higher degree of strategic reasoning, and in our model this increases the risk for climate catastrophe. Moreover, some forms of higher strategic reasoning make agreements to reduce greenhouse gases unstable. We use empirically informed levels of strategic reasoning when simulating the model.

  11. Pricing Zero-Coupon Catastrophe Bonds Using EVT with Doubly Stochastic Poisson Arrivals

    Directory of Open Access Journals (Sweden)

    Zonggang Ma

    2017-01-01

    Full Text Available The frequency and severity of abnormal climate change display an irregular upward cycle as global warming intensifies. This paper therefore employs a doubly stochastic Poisson process with Black-Derman-Toy (BDT) intensity to describe the catastrophic characteristics. Using the Property Claim Services (PCS) loss index data from 2001 to 2010 provided by the US Insurance Services Office (ISO), the empirical results reveal that the BDT arrival rate process is superior to the nonhomogeneous Poisson and lognormal intensity processes due to its smaller RMSE, MAE, MRPE, and U and larger E and d. Secondly, to depict the extreme features of catastrophic risks, this paper adopts the Peak Over Threshold (POT) method from extreme value theory (EVT) to characterize the tail of the catastrophic loss distribution. The loss distribution is then analyzed and assessed using a quantile-quantile (QQ) plot to visually check whether the PCS index observations meet the generalized Pareto distribution (GPD) assumption. Furthermore, this paper derives a pricing formula for zero-coupon catastrophe bonds in a stochastic interest rate environment with aggregate losses generated by a compound doubly stochastic Poisson process under the forward measure. Finally, simulation results verify the pricing model predictions and show how catastrophic risks and interest rate risk affect the prices of zero-coupon catastrophe bonds.
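
    A stripped-down Monte Carlo version of such a pricing exercise is sketched below. Relative to the paper, homogeneous Poisson arrivals and flat discounting stand in for the BDT intensity and stochastic interest rate, and the principal is fully written off at the trigger; every number is invented.

```python
# Zero-coupon CAT bond pricing sketch. Simplifications relative to the
# paper: homogeneous Poisson arrivals and flat discounting replace the
# BDT intensity and stochastic rates; principal is fully written off
# once aggregate GPD losses exceed the trigger (numbers invented).
import numpy as np

rng = np.random.default_rng(17)
T, r, face = 1.0, 0.03, 100.0
lam = 3.0                    # mean catastrophe arrivals per year
xi, sig = 0.4, 10.0          # GPD shape and scale for excess losses
trigger = 60.0               # aggregate-loss trigger

def gpd(size):               # inverse-CDF sampling of the generalized Pareto
    u = rng.random(size)
    return sig / xi * ((1.0 - u) ** (-xi) - 1.0)

n = 100_000
payoff = np.empty(n)
for i in range(n):
    k = rng.poisson(lam * T)                 # number of events in [0, T]
    agg = gpd(k).sum() if k else 0.0         # aggregate loss
    payoff[i] = face if agg <= trigger else 0.0

print(f"price: {np.exp(-r * T) * payoff.mean():.2f}, "
      f"P(trigger) = {1.0 - payoff.mean() / face:.4f}")
```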

  12. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with

  13. Simulating space-time uncertainty in continental-scale gridded precipitation fields for agrometeorological modelling

    NARCIS (Netherlands)

    Wit, de A.J.W.; Bruin, de S.

    2006-01-01

    Previous analyses of the effects of uncertainty in precipitation fields on the output of EU Crop Growth Monitoring System (CGMS) demonstrated that the influence on simulated crop yield was limited at national scale, but considerable at local and regional scales. We aim to propagate uncertainty due

  14. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large as or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4 km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, mean that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  15. Simulations of Collisional Disruption at the Catastrophic Impact Energy Threshold: Effect of the Target's Internal Structure and Diameter

    Science.gov (United States)

    Michel, P.; Benz, W.; Richardson, D. C.

    2005-08-01

    Recent simulations of asteroid break-ups, including both the fragmentation of the parent body and the gravitational interactions of the fragments, have successfully reproduced the main properties of asteroid families formed in different regimes of impact energy. Here, using the same kind of simulations, we concentrate on a single regime of impact energy, the so-called catastrophic threshold, usually designated Qcrit, which results in the escape of half of the target's mass. Considering a wide range of diameter values and two kinds of internal structure of the parent body, monolithic and pre-shattered, we analyse their potential influence on the value of Qcrit and on the collisional outcome, limited here to the fragment size and ejection speed distributions, which are the main outcome properties used by collisional models to study the evolution of the different populations of small bodies. For all the considered diameters and both internal structures of the parent body, we confirm that the process of gravitational reaccumulation is at the origin of the largest remnant's mass. We then find that, for a given diameter of the parent body, the impact energy corresponding to the catastrophic disruption threshold is highly dependent on the internal structure of the parent body. In particular, a pre-shattered parent body containing only damaged zones but no macroscopic voids is easier to disrupt than a monolithic one. Other kinds of internal properties that can also characterize small bodies in real populations will be investigated in future work.

  16. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1)

    DEFF Research Database (Denmark)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions.

  17. Advanced Approach to Consider Aleatory and Epistemic Uncertainties for Integral Accident Simulations

    International Nuclear Information System (INIS)

    Peschke, Joerg; Kloos, Martina

    2013-01-01

    The use of best-estimate codes together with realistic input data generally requires that all potentially important epistemic uncertainties which may affect the code prediction are considered, in order to get an adequate quantification of the epistemic uncertainty of the prediction as an expression of the existing imprecise knowledge. To facilitate the required epistemic uncertainty analyses, methods and corresponding software tools are available, such as the GRS tool SUSA (Software for Uncertainty and Sensitivity Analysis). However, for risk-informed decision-making, restriction to epistemic uncertainties alone is not enough. Transients and accident scenarios are also affected by aleatory uncertainties, which are due to the unpredictable nature of phenomena. It is essential that aleatory uncertainties are taken into account as well, not in a simplified and supposedly conservative way but as realistically as possible. The additional consideration of aleatory uncertainties, for instance in the behavior of the technical system, the performance of plant operators, or the behavior of the physical process, provides a quantification of probabilistically significant accident sequences. Only if a safety analysis is able to account for both epistemic and aleatory uncertainties in a realistic manner can it provide a well-founded, risk-informed answer for decision-making. At GRS, an advanced probabilistic dynamics method was developed to address this problem and to provide a more realistic modeling and assessment of transients and accident scenarios. This method allows for an integral simulation of complex dynamic processes, particularly taking into account interactions between the plant dynamics as simulated by a best-estimate code, the dynamics of operator actions, and the influence of epistemic and aleatory uncertainties. In this paper, the GRS method MCDET (Monte Carlo Dynamic Event Tree) for probabilistic dynamics analysis is explained.

  18. Parameter uncertainty in simulations of extreme precipitation and attribution studies.

    Science.gov (United States)

    Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.

    2017-12-01

    The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. The climate models used, however, such as the Community Atmosphere Model (CAM), employ approximate physics that gives rise to "parameter uncertainty": uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations for convective processes are well known to be influential in the simulation of precipitation extremes. Towards examining the impact of this source of uncertainty on attribution studies, we investigate the importance of components of parameterisations relating to deep and shallow convection and to cloud and aerosol microphysics in CAM, through their associated tuning parameters. We hypothesise that as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions; for example, the relative influence of deep convection should diminish as resolution approaches that at which convection can be resolved numerically (about 10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. In order to mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution, both in the context of attribution studies and in progression towards models that fully resolve convection.

  19. Structural Uncertainty in Antarctic sea ice simulations

    Science.gov (United States)

    Schneider, D. P.

    2016-12-01

    The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out of sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed one. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) have a strong dependence on the mean state of and changes in the Antarctic sea ice cover. This problem is not unique to

  20. Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks

    Science.gov (United States)

    Clune, Jeff

    2017-01-01

    A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While the prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, thereby leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing, neuromodulatory chemicals within an ANN that can modulate (i.e. up or down regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging, but important problem of catastrophic forgetting. PMID:29145413

  2. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... be compensated by model parameters, e.g. when hydraulic heads are considered. However, geological structure is the primary source of uncertainty with respect to simulations of groundwater age and capture zone. Operational MPS-based software has been on stage for just around ten years; yet, issues regarding...... geological structures of these three sites provided appropriate conditions for testing the methods. Our study documented that MPS is an efficient approach for simulating geological heterogeneity, especially for non-stationary systems. The high resolution of geophysical data such as SkyTEM is valuable both......

  3. Investigation of hydrometeor classification uncertainties through the POLARRIS polarimetric radar simulator

    Science.gov (United States)

    Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.

    2017-12-01

    POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud resolving model (CRM) output and subject model data and observations to the same retrievals, analysis and visualization. This framework not only enables validation of bulk microphysical model-simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature, there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as calculate a confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications. With

  4. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space.

    Science.gov (United States)

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria

    2012-01-01

    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation (i.e. brain conductivity distributions) and stimulation (e.g. coil positioning and orientations) setup have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed to be "ground truth". This framework can be leveraged to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance to all applications of TMS, both investigative and therapeutic.

  5. Monte Carlo Simulation of Influence of Input Parameters Uncertainty on Output Data

    International Nuclear Information System (INIS)

    Sobek, Lukas

    2010-01-01

    Input parameters of a complex system in probabilistic simulation are treated by means of probability density functions (PDF). The results of the simulation therefore also have a probabilistic character. Monte Carlo simulation is widely used to obtain predictions concerning the probability of risk. Here, the Monte Carlo method was used to calculate histograms of the PDF for the release rate, given the uncertainty in the distribution coefficients of the radionuclides 135Cs and 235U.
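
    This record's workflow can be reproduced in miniature: draw input samples from the parameter PDF, push them through the model, and histogram the outputs. Below is a minimal sketch assuming a hypothetical one-parameter release model and a lognormal PDF for the distribution coefficient Kd; neither the model form nor the numbers come from the record.

```python
import numpy as np
import matplotlib.pyplot as plt

# Propagate uncertainty in a distribution coefficient Kd through a
# (hypothetical) release-rate model by Monte Carlo sampling.
rng = np.random.default_rng(seed=1)

def release_rate(kd, inventory=1.0, flow=0.05):
    # Illustrative retardation-style model, not the code used in the record:
    # stronger sorption (larger Kd) means slower release.
    return inventory * flow / (1.0 + kd)

kd_samples = rng.lognormal(mean=np.log(10.0), sigma=0.8, size=100_000)  # input PDF
rates = release_rate(kd_samples)                                        # output samples

print(f"mean = {rates.mean():.3e}, 95% interval = "
      f"[{np.percentile(rates, 2.5):.3e}, {np.percentile(rates, 97.5):.3e}]")
plt.hist(rates, bins=100, density=True)   # histogram approximates the output PDF
plt.xlabel("release rate"); plt.ylabel("density")
plt.show()
```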

  6. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    Science.gov (United States)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structural properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variance of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance, and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially, respectively, with respect to actuator delay. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases as actuator delay increases.
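
    For readers unfamiliar with the method, here is a minimal sketch of non-intrusive PCE for a single standard-normal input; the quadratic response function is a stand-in, not the RTHS model from the record. Coefficients are fit by least-squares regression on a probabilists' Hermite basis, and the mean and variance then follow directly from orthogonality.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander  # probabilists' Hermite He_k
from math import factorial

rng = np.random.default_rng(0)

def model(xi):
    # hypothetical response: "maximum displacement" as a smooth function of
    # one standardized uncertain structural property
    return 1.0 + 0.3 * xi + 0.05 * xi**2

degree, n_samples = 4, 200
xi = rng.standard_normal(n_samples)                # sample points in the germ
y = model(xi)
Psi = hermevander(xi, degree)                      # basis matrix He_0..He_4
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)   # regression for PCE coefficients

mean = coeffs[0]                                   # E[y] = c_0
var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, degree + 1))
print(f"PCE mean = {mean:.4f}, PCE variance = {var:.4f}")

xi_mc = rng.standard_normal(100_000)               # Monte Carlo cross-check
print(f"MC  mean = {model(xi_mc).mean():.4f}, MC  variance = {model(xi_mc).var():.4f}")
```

    With several inputs, first-order Sobol indices come from the same bookkeeping: sum the squared, factorial-weighted coefficients of the basis terms involving only one input, and divide by the total variance.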

  7. Coping with ecological catastrophe: crossing major thresholds

    Directory of Open Access Journals (Sweden)

    John Cairns, Jr.

    2004-08-01

    Full Text Available The combination of human population growth and resource depletion makes catastrophes highly probable. No long-term solutions to the problems of humankind will be discovered unless sustainable use of the planet is achieved. The essential first step toward this goal is avoiding or coping with global catastrophes that result from crossing major ecological thresholds. Decreasing the number of global catastrophes will reduce the risks associated with destabilizing ecological systems, which could, in turn, destabilize societal systems. Many catastrophes will be local, regional, or national, but even these upheavals will have global consequences. Catastrophes will be the result of unsustainable practices and the misuse of technology. However, avoiding ecological catastrophes will depend on the development of eco-ethics, which is subject to progressive maturation, comments, and criticism. Some illustrative catastrophes have been selected to display some preliminary issues of eco-ethics.

  8. Quantum catastrophe of slow light

    OpenAIRE

    Leonhardt, Ulf

    2001-01-01

    Catastrophes are at the heart of many fascinating optical phenomena. The rainbow, for example, is a ray catastrophe where light rays become infinitely intense. The wave nature of light resolves the infinities of ray catastrophes while drawing delicate interference patterns such as the supernumerary arcs of the rainbow. Black holes cause wave singularities. Waves oscillate with infinitely small wave lengths at the event horizon where time stands still. The quantum nature of light avoids this h...

  9. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  10. Efficient uncertainty quantification in fully-integrated surface and subsurface hydrologic simulations

    Science.gov (United States)

    Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.

    2018-01-01

    Although high performance computers and advanced numerical methods have made the application of fully-integrated surface and subsurface flow and transport models such as HydroGeoSphere common place, run times for large complex basin models can still be on the order of days to weeks, thus, limiting the usefulness of traditional workhorse algorithms for uncertainty quantification (UQ) such as Latin Hypercube simulation (LHS) or Monte Carlo simulation (MCS), which generally require thousands of simulations to achieve an acceptable level of accuracy. In this paper we investigate non-intrusive polynomial chaos for uncertainty quantification, which in contrast to random sampling methods (e.g., LHS and MCS), represents a model response of interest as a weighted sum of polynomials over the random inputs. Once a chaos expansion has been constructed, approximating the mean, covariance, probability density function, cumulative distribution function, and other common statistics as well as local and global sensitivity measures is straightforward and computationally inexpensive, thus making PCE an attractive UQ method for hydrologic models with long run times. Our polynomial chaos implementation was validated through comparison with analytical solutions as well as solutions obtained via LHS for simple numerical problems. It was then used to quantify parametric uncertainty in a series of numerical problems with increasing complexity, including a two-dimensional fully-saturated, steady flow and transient transport problem with six uncertain parameters and one quantity of interest; a one-dimensional variably-saturated column test involving transient flow and transport, four uncertain parameters, and two quantities of interest at 101 spatial locations and five different times each (1010 total); and a three-dimensional fully-integrated surface and subsurface flow and transport problem for a small test catchment involving seven uncertain parameters and three quantities of interest at
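
    The contrast between the two sampling strategies named above fits in a few lines. The sketch below uses scipy's quasi-Monte Carlo module to draw a Latin Hypercube design next to a plain Monte Carlo design; the parameter count and bounds are invented for illustration.

```python
import numpy as np
from scipy.stats import qmc

d, n = 6, 64                                   # six uncertain parameters, 64 runs
rng = np.random.default_rng(42)

mc_unit = rng.random((n, d))                   # plain Monte Carlo in [0, 1)^d
lhs_unit = qmc.LatinHypercube(d=d, seed=42).random(n)  # stratified design

# LHS guarantees every one of the n strata is hit exactly once per dimension.
print("MC  strata hit:", np.unique(np.floor(mc_unit[:, 0] * n)).size, "of", n)
print("LHS strata hit:", np.unique(np.floor(lhs_unit[:, 0] * n)).size, "of", n)

# Scale unit samples to (hypothetical) parameter bounds before running the model.
lower = np.array([1e-6, 0.05, 0.1, 0.5, 1.0, 0.01])
upper = np.array([1e-4, 0.45, 2.0, 5.0, 10.0, 0.2])
lhs_params = qmc.scale(lhs_unit, lower, upper)
print(lhs_params.shape)                        # (64, 6) runs ready for the simulator
```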

  11. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty, at the trade-off of increasing both their economic cost and its variability. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when a S_NO controller manipulating an external carbon source addition is implemented.
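
    A minimal sketch of the SRC method follows, with synthetic stand-ins for the BSM1 Monte Carlo runs; the linear test function, noise level, and parameter names are illustrative only. Inputs and output are z-scored, a linear model is fit, and the fitted coefficients are the SRCs.

```python
import numpy as np

rng = np.random.default_rng(7)
names = ["mu_A", "eta_g", "eta_h"]             # illustrative ASM1-style parameters

X = rng.lognormal(size=(2000, 3))              # Monte Carlo input samples
y = 2.0 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.1, 2000)
# (stand-in for EQI predictions from 2000 BSM1 runs)

Xs = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize inputs
ys = (y - y.mean()) / y.std()                  # standardize output
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)  # standardized regression coefficients

for name, b in sorted(zip(names, src), key=lambda t: -abs(t[1])):
    print(f"SRC({name}) = {b:+.3f}")           # |SRC| ranks parameter importance
print("sum of SRC^2 =", round(float(np.sum(src ** 2)), 3))  # ~1 if nearly linear
```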

  12. Evaluation and uncertainties of global climate models as simulated in East Asia and China

    International Nuclear Information System (INIS)

    Zhao, Z.C.

    1994-01-01

    The assessments and uncertainties of the general circulation models (GCMs) as simulated in East Asia and China (15–60° N, 70–140° E) have been investigated using seven GCMs. Four methods of assessment were chosen. The variables validated for the GCMs include the annual, seasonal and monthly mean temperatures and precipitation. The assessments indicated that: (1) the simulations of the seven GCMs for temperature are much better than those for precipitation; (2) the simulations in winter are much better than those in summer; (3) the simulations in eastern parts are much better than those in western parts for both temperature and precipitation; (4) the best GCM for simulated temperature is the GISS model, and the best GCM for simulated precipitation is the UKMO-H model. The seven-GCM means for both simulated temperature and precipitation provided good results. The range of uncertainties in East Asia and China due to human activities is presented. The differences between the GCMs for temperature and precipitation before the year 2050 are much smaller than those after the year 2050.

  13. Using sequential indicator simulation to assess the uncertainty of delineating heavy-metal contaminated soils

    International Nuclear Information System (INIS)

    Juang, Kai-Wei; Chen, Yue-Shin; Lee, Dar-Yuan

    2004-01-01

    Mapping the spatial distribution of soil pollutants is essential for delineating contaminated areas. Currently, geostatistical interpolation, kriging, is increasingly used to estimate pollutant concentrations in soils. The kriging-based approach, indicator kriging (IK), may be used to model the uncertainty of mapping. However, a smoothing effect is usually produced when using kriging in pollutant mapping. The detailed spatial patterns of pollutants could, therefore, be lost. The local uncertainty of mapping pollutants derived by the IK technique is referred to as the conditional cumulative distribution function (ccdf) for one specific location (i.e. single-location uncertainty). The local uncertainty information obtained by IK is not sufficient, as the uncertainty of mapping at several locations simultaneously (i.e. multi-location uncertainty or spatial uncertainty) is required to assess the reliability of the delineation of contaminated areas. The simulation approach, sequential indicator simulation (SIS), which has the ability to model not only single-location but also multi-location uncertainties, was used in this study to assess the uncertainty of the delineation of heavy metal contaminated soils. To illustrate this, a data set of Cu concentrations in soil from Taiwan was used. The results show that contour maps of Cu concentrations generated by the SIS realizations exhausted all the spatial patterns of Cu concentrations without the smoothing effect found when using the kriging method. Based on the SIS realizations, the local uncertainty of Cu concentrations at a specific location x' refers to the probability of the Cu concentration z(x') being higher than the defined threshold level of contamination (z_c). This can be written as Prob_SIS[z(x') > z_c], representing the probability of contamination. The probability map of Prob_SIS[z(x') > z_c] can then be used for delineating contaminated areas. In addition, the multi-location uncertainty of an area A
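
    Once a set of equiprobable SIS realizations is available, both kinds of uncertainty reduce to counting across the realization axis, as in the sketch below; the lognormal fields stand in for actual SIS output and the threshold is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
n_real, ny, nx = 200, 50, 50
z_c = 120.0                                    # assumed regulatory Cu threshold

# Stand-in for SIS output: n_real equiprobable concentration fields on a grid.
realizations = rng.lognormal(mean=np.log(80), sigma=0.5, size=(n_real, ny, nx))

# Single-location uncertainty: Prob_SIS[z(x') > z_c] at every grid node.
prob_map = (realizations > z_c).mean(axis=0)

# Multi-location (spatial) uncertainty for an area A: probability that all
# nodes inside A exceed z_c simultaneously, counted across realizations.
rows, cols = slice(10, 20), slice(10, 20)
prob_joint = np.all(realizations[:, rows, cols] > z_c, axis=(1, 2)).mean()
print(f"mean pointwise P(contam) = {prob_map.mean():.3f}, joint over A = {prob_joint:.3f}")
```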

  14. Stochastic Catastrophe Analysis of Strategic Alliances' Coopetition Including Simulations%战略联盟竞合行为的随机突变分析与仿真

    Institute of Scientific and Technical Information of China (English)

    徐岩; 胡斌

    2012-01-01

    The evolution of partners' strategies in multi-firm strategic alliances in an uncertain environment is considered from an evolutionary game theory perspective. A deterministic dynamical equation is developed; Gaussian white noise is then introduced to represent disturbances, yielding a stochastic dynamical equation. The catastrophe by which strategic alliances switch from cooperation to betrayal is analyzed by means of stochastic catastrophe theory, and the catastrophe set of the control variables is found, which serves to explain and forecast the catastrophe of strategic alliances, i.e. their unplanned dissolution or sudden cooperation failure. To validate the model, numerical simulations are given for different scenarios; they show that the collective behavior of the alliance undergoes a catastrophe near the catastrophe set.

  15. ARIANNE. Analytical uncertainties. Simulation of influential factors in the inventory of the final web cam

    International Nuclear Information System (INIS)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-01-01

    This work analyzes the analytical uncertainties of the process-simulation methodology used to obtain the final isotopic inventory of spent fuel; within the ARIANE experiment, it explores the burnup simulation part.

  16. Catastrophizing in Patients with Burning Mouth Syndrome

    Directory of Open Access Journals (Sweden)

    Ana ANDABAK ROGULJ

    2014-01-01

    Full Text Available Background: Burning mouth syndrome (BMS is an idiopathic painful condition which manifests with burning sensations in the oral cavity in patients with clinically normal oral mucosa and without any local and/or systemic causative factor. Catastrophizing is defined as an exaggerated negative orientation toward pain stimuli and pain experience. The aim of this study was to examine the association between catastrophizing and clinical parameters of BMS, and to examine the association between catastrophizing and the quality of life in patients with BMS. Materials and methods: Anonymous questionnaire consisting of 3 parts (demographic and clinical data with 100 mm visual analogue scale (VAS, Croatian version of the Oral Health Impact Profile (OHIP-14 scale and Croatian version of the Pain Catastrophizing scale (PC, was distributed to 30 patients diagnosed with BMS. Results: A higher level of catastrophizing was clinically significant in 30% of the patients. Total catastrophizing score and all three subcomponents of catastrophizing significantly correlated with the intensity of symptoms, but did not correlate with the duration of symptoms. Gender and previous treatment did not affect the catastrophizing. Conclusion: Obtaining the information about catastrophizing could help a clinician to identify patients with negative behavioural patterns. Additional psychological intervention in these individuals could reduce/eliminate negative cognitive factors and improve coping with chronic painful condition such as BMS.

  17. Application of Catastrophe Risk Modelling to Evacuation Public Policy

    Science.gov (United States)

    Woo, G.

    2009-04-01

    The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane were appearing to head for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast, and chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a

  18. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  19. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Energy Technology Data Exchange (ETDEWEB)

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions reduces dramatically the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern compared with the internal variability, except for the mean sea level pressure field, though its magnitude is larger all over the model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)

  20. Adaptation to and Recovery from Global Catastrophe

    Directory of Open Access Journals (Sweden)

    Seth D. Baum

    2013-03-01

    Full Text Available Global catastrophes, such as nuclear war, pandemics and ecological collapse threaten the sustainability of human civilization. To date, most work on global catastrophes has focused on preventing the catastrophes, neglecting what happens to any catastrophe survivors. To address this gap in the literature, this paper discusses adaptation to and recovery from global catastrophe. The paper begins by discussing the importance of global catastrophe adaptation and recovery, noting that successful adaptation/recovery could have value on even astronomical scales. The paper then discusses how the adaptation/recovery could proceed and makes connections to several lines of research. Research on resilience theory is considered in detail and used to develop a new method for analyzing the environmental and social stressors that global catastrophe survivors would face. This method can help identify options for increasing survivor resilience and promoting successful adaptation and recovery. A key point is that survivors may exist in small isolated communities disconnected from global trade and, thus, must be able to survive and rebuild on their own. Understanding the conditions facing isolated survivors can help promote successful adaptation and recovery. That said, the processes of global catastrophe adaptation and recovery are highly complex and uncertain; further research would be of great value.

  1. Zeeman catastrophe machines as a toolkit for teaching chaos

    International Nuclear Information System (INIS)

    Nagy, Péter; Tasnádi, Péter

    2014-01-01

    The investigation of chaotic motions and cooperative systems offers a magnificent opportunity to involve modern physics in the basic course of mechanics taught to engineering students. In this paper, it will be demonstrated that the Zeeman machine can be a versatile and motivating tool for students to acquire introductory knowledge about chaotic motion via interactive simulations. The Zeeman catastrophe machine is a typical example of a quasi-static system with hysteresis. It works in a relatively simple way and its properties can be understood very easily. Since the machine can be built easily and the simulation of its movement is also simple, the experimental investigation and the theoretical description can be connected intuitively. Although the Zeeman machine is known mainly for its quasi-static and catastrophic behaviour, its dynamic properties are also of interest with its typical chaotic features. By means of a periodically driven Zeeman machine, a wide range of chaotic properties of the simple systems can be demonstrated, such as bifurcation diagrams, chaotic attractors, transient chaos, Lyapunov exponents and so on. This paper is organically linked to our website (http://csodafizika.hu/zeeman) where the discussed simulation programs can be downloaded. In a second paper, the novel construction of a network of Zeeman machines will be presented to study the properties of cooperative systems. (paper)
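
    The quasi-static jumps described here are governed by the cusp catastrophe, whose normal form is the potential V(x) = x^4/4 + a x^2/2 + b x. The sketch below, independent of the simulation programs on the authors' website, finds equilibria as the real roots of V'(x) and flags the bistable region bounded by the fold set 4a^3 + 27b^2 = 0, which is where the machine's wheel snaps.

```python
import numpy as np

def equilibria(a, b):
    # real roots of V'(x) = x^3 + a*x + b = 0
    roots = np.roots([1.0, 0.0, a, b])
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

a = -1.0                                       # inside the cusp for small |b|
for b in np.linspace(-0.6, 0.6, 13):
    eq = equilibria(a, b)
    bistable = 4 * a**3 + 27 * b**2 < 0        # three real equilibria here
    print(f"b={b:+.2f}  equilibria={np.round(eq, 3)}  bistable={bistable}")
# Sweeping b across a fold line removes the occupied equilibrium, and the
# state jumps discontinuously: the hysteresis seen in the Zeeman machine.
```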

  2. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  3. Replication Catastrophe

    DEFF Research Database (Denmark)

    Toledo, Luis; Neelsen, Kai John; Lukas, Jiri

    2017-01-01

    Proliferating cells rely on the so-called DNA replication checkpoint to ensure orderly completion of genome duplication, and its malfunction may lead to catastrophic genome disruption, including unscheduled firing of replication origins, stalling and collapse of replication forks, massive DNA...... breakage, and, ultimately, cell death. Despite many years of intensive research into the molecular underpinnings of the eukaryotic replication checkpoint, the mechanisms underlying the dismal consequences of its failure remain enigmatic. A recent development offers a unifying model in which the replication...... checkpoint guards against global exhaustion of rate-limiting replication regulators. Here we discuss how such a mechanism can prevent catastrophic genome disruption and suggest how to harness this knowledge to advance therapeutic strategies to eliminate cancer cells that inherently proliferate under...

  4. GDP-to-GTP exchange on the microtubule end can contribute to the frequency of catastrophe.

    Science.gov (United States)

    Piedra, Felipe-Andrés; Kim, Tae; Garza, Emily S; Geyer, Elisabeth A; Burns, Alexander; Ye, Xuecheng; Rice, Luke M

    2016-11-07

    Microtubules are dynamic polymers of αβ-tubulin that have essential roles in chromosome segregation and organization of the cytoplasm. Catastrophe, the switch from growing to shrinking, occurs when a microtubule loses its stabilizing GTP cap. Recent evidence indicates that the nucleotide on the microtubule end controls how tightly an incoming subunit will be bound (trans-acting GTP), but most current models do not incorporate this information. We implemented trans-acting GTP into a computational model for microtubule dynamics. In simulations, growing microtubules often exposed terminal GDP-bound subunits without undergoing catastrophe. Transient GDP exposure on the growing plus end slowed elongation by reducing the number of favorable binding sites on the microtubule end. Slower elongation led to erosion of the GTP cap and an increase in the frequency of catastrophe. Allowing GDP-to-GTP exchange on terminal subunits in simulations mitigated these effects. Using mutant αβ-tubulin or modified GTP, we showed experimentally that a more readily exchangeable nucleotide led to less frequent catastrophe. Current models for microtubule dynamics do not account for GDP-to-GTP exchange on the growing microtubule end, so our findings provide a new way of thinking about the molecular events that initiate catastrophe. © 2016 Piedra et al. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  5. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    Science.gov (United States)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessing the impact of climate change on crop production while considering uncertainties is essential for properly identifying sustainable agricultural practices and supporting decision-making. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios, representing the climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model, representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with the uncertainties. The results of ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed increasing uncertainty in the yield simulation in the future periods.
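
    The variance decomposition mentioned here can be sketched as a two-way ANOVA-style split: the climate and parameter main effects are the variances of the row and column means of the ensemble matrix, and the remainder is interaction plus residual. All effect sizes below are invented.

```python
import numpy as np

rng = np.random.default_rng(11)
n_climate, n_param = 24, 100                   # 24 projections, 100 parameter sets

# Stand-in for simulated yield anomalies Y[i, j] (projection i, parameter set j).
climate_effect = rng.normal(0, 8, size=(n_climate, 1))   # larger spread (assumed)
param_effect = rng.normal(0, 3, size=(1, n_param))       # smaller spread (assumed)
Y = climate_effect + param_effect + rng.normal(0, 1, size=(n_climate, n_param))

total = Y.var()
v_climate = Y.mean(axis=1).var()               # variance of climate main effects
v_param = Y.mean(axis=0).var()                 # variance of parameter main effects
v_inter = total - v_climate - v_param          # interaction + residual

for label, v in [("climate", v_climate), ("parameters", v_param), ("interaction", v_inter)]:
    print(f"{label:12s} {100 * v / total:5.1f} % of ensemble variance")
```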

  6. Numerical Simulations Of Catastrophic Disruption Of Porous Bodies: Application To Dark-type Asteroids And Kuiper-belt Family Formation

    Science.gov (United States)

    Michel, Patrick; Jutzi, M.; Richardson, D. C.; Benz, W.

    2010-10-01

    Asteroids of dark (e.g. C, D) taxonomic classes as well as Kuiper Belt objects and comets are believed to have high porosity, not only in the form of large voids but also in the form of micro-pores. The presence of such microscale porosity introduces additional physics in the impact process. We have enhanced our 3D SPH hydrocode, used to simulate catastrophic breakups, with a model of porosity [1] and validated it at small scale by comparison with impact experiments on pumice targets [2]. Our model is now ready to be applied to a large range of problems. In particular, accounting for the gravitational phase of an impact, we can study the formation of dark-type asteroid families, such as Veritas, and Kuiper-Belt families, such as Haumea. Recently we characterized for the first time the catastrophic impact energy threshold, usually called Q*_D, as a function of the target's diameter, porosity, material strength and impact speed [3]. Regarding the mentioned families, our preliminary results show that accounting for porosity leads to different outcomes that may better represent their properties and constrain their definition. In particular, for Veritas, we find that its membership may need some revision [4]. The parameter space is still large, many interesting families need to be investigated and our model will be applied to a large range of cases. PM, MJ and DCR acknowledge financial support from the French Programme National de Planétologie, NASA PG&G "Small Bodies and Planetary Collisions" and NASA under Grant No. NNX08AM39G issued through the Office of Space Science, respectively. [1] Jutzi et al. 2008. Icarus 198, 242-255; [2] Jutzi et al. 2009. Icarus 201, 802-813; [3] Jutzi et al. 2010. Fragment properties at the catastrophic disruption threshold: The effect of the parent body's internal structure, Icarus 207, 54-65; [4] Michel et al. 2010. Icarus, submitted.

  7. Theory of a slow-light catastrophe

    International Nuclear Information System (INIS)

    Leonhardt, Ulf

    2002-01-01

    In diffraction catastrophes such as the rainbow, the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole, the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. This paper describes the theory behind a recent proposal [U. Leonhardt, Nature (London) 415, 406 (2002)] to generate a quantum catastrophe of slow light

  8. Theory of a slow-light catastrophe

    Science.gov (United States)

    Leonhardt, Ulf

    2002-04-01

    In diffraction catastrophes such as the rainbow, the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole, the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. This paper describes the theory behind a recent proposal [U. Leonhardt, Nature (London) 415, 406 (2002)] to generate a quantum catastrophe of slow light.

  9. Theory of a Slow-Light Catastrophe

    OpenAIRE

    Leonhardt, Ulf

    2001-01-01

    In diffraction catastrophes such as the rainbow the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. The paper describes the theory behind a recent proposal [U. Leonhardt, arXiv:physics/0111058, Nature (in press)] to generate a quantum catastrophe of slow light.

  10. Catastrophe theory with application in nuclear technology

    International Nuclear Information System (INIS)

    Valeca, Serban Constantin

    2002-01-01

    The monograph is structured on the following seven chapters: 1. Correlation of risk, catastrophe and chaos at the level of polyfunctional systems with nuclear injection; 1.1 Approaching the risk at the level of power systems; 1.2 Modelling the chaos-catastrophe-risk correlation in the structure of integrated classical and nuclear processes; 2. Catastrophe theory applied in ecosystems models and applications; 2.1 Posing the problems in catastrophe theory; 2.2 Application of catastrophe theory in the engineering of the power ecosystems with nuclear injection; 4. Decision of abatement of the catastrophic risk based on minimal costs; 4.1 The nuclear power systems sensitive to risk-catastrophe-chaos in the structure of minimal costs; 4.2 Evaluating the market structure on the basis of power minimal costs; 4.3 Decisions in power systems built on minimal costs; 5. Models of computing the minimal costs in classical and nuclear power systems; 5.1 Calculation methodologies of power minimal cost; 5.2 Calculation methods of minimal costs in nuclear power sector; 6. Expert and neuro expert systems for supervising the risk-catastrophe-chaos correlation; 6.1 The structure of expert systems; 6.2 Application of the neuro expert program; 7. Conclusions and operational proposals; 7.1 A synthesis of the problems presented in this work; 7.2 Highlighting the novel aspects applicable in the power systems with nuclear injection

  11. Uncertainty estimation and ensemble forecast with a chemistry-transport model - Application to air-quality modeling and simulation

    International Nuclear Information System (INIS)

    Mallet, Vivien

    2005-01-01

    The thesis deals with the evaluation of a chemistry-transport model, not primarily with classical comparisons to observations, but through the estimation of its a priori uncertainties due to input data, model formulation and numerical approximations. These three uncertainty sources are studied respectively on the basis of Monte Carlo simulations, multi-model simulations and numerical scheme inter-comparisons. A high uncertainty is found in output ozone concentrations. In order to overcome the limitations due to this uncertainty, one solution is ensemble forecasting. Through combinations of several models (up to forty-eight models) on the basis of past observations, the forecast can be significantly improved. This work also led to the development of the innovative modelling system Polyphemus. (author) [fr]
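
    The ensemble-combination step can be illustrated with a least-squares weighting scheme. This is only a generic sketch of learning model weights from past observations, not the specific sequential aggregation algorithms developed in the thesis, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)
n_models, n_past = 5, 300

truth_past = np.sin(np.linspace(0, 30, n_past)) * 40 + 60   # past ozone observations
forecasts_past = (truth_past[None, :]
                  + rng.normal(0, 8, (n_models, n_past))    # member noise
                  + rng.normal(0, 5, (n_models, 1)))        # per-member bias

# weights minimizing ||F^T w - obs||^2 over the training period
w, *_ = np.linalg.lstsq(forecasts_past.T, truth_past, rcond=None)
print("learned weights:", np.round(w, 3))

new_forecasts = 60 + rng.normal(0, 8, (n_models, 24))       # next day's members
combined = w @ new_forecasts                                # ensemble forecast
print("combined forecast mean:", round(float(combined.mean()), 2))
```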

  12. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Blaylock, Myra L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hewson, John C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kumar, Pritvi Raj [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ling, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ruiz, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stewart, Alessia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wagner, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost versus accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  13. Validation/Uncertainty Quantification for Large Eddy Simulations of the heat flux in the Tangentially Fired Oxy-Coal Alstom Boiler Simulation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.J.; Eddings, E.G.; Ring, T.; Thornock, J.; Draper, T.; Isaac, B.; Rezeai, D.; Toth, P.; Wu, Y.; Kelly, K.

    2014-08-01

    The objective of this task is to produce predictive capability with quantified uncertainty bounds for the heat flux in commercial-scale, tangentially fired, oxy-coal boilers. Validation data came from the Alstom Boiler Simulation Facility (BSF) for tangentially fired, oxy-coal operation. This task brings together experimental data collected under Alstom’s DOE project for measuring oxy-firing performance parameters in the BSF with this University of Utah project for large eddy simulation (LES) and validation/uncertainty quantification (V/UQ). The Utah work includes V/UQ with measurements in the single-burner facility where advanced strategies for O2 injection can be more easily controlled and data more easily obtained. Highlights of the work include:
    • Simulations of Alstom’s 15 megawatt (MW) BSF, exploring the uncertainty in thermal boundary conditions. A V/UQ analysis showed consistency between experimental results and simulation results, identifying uncertainty bounds on the quantities of interest for this system (Subtask 9.1).
    • A simulation study of the University of Utah’s oxy-fuel combustor (OFC) focused on heat flux (Subtask 9.2). A V/UQ analysis was used to show consistency between experimental and simulation results.
    • Measurement of heat flux and temperature with new optical diagnostic techniques and comparison with conventional measurements (Subtask 9.3). Various optical diagnostics systems were created to provide experimental data to the simulation team. The final configuration utilized a mid-wave infrared (MWIR) camera to measure heat flux and temperature, which was synchronized with a high-speed, visible camera to utilize two-color pyrometry to measure temperature and soot concentration.
    • Collection of heat flux and temperature measurements in the University of Utah’s OFC for use in subtasks 9.2 and 9.3 (Subtask 9.4). Several replicates were carried out to better assess the experimental error. Experiments were specifically designed for the

  14. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    Science.gov (United States)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-used market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.

  15. How are the catastrophical risks quantifiable

    International Nuclear Information System (INIS)

    Chakraborty, S.

    1985-01-01

    For the assessment and evaluation of industrial risks, the question must be asked: how are catastrophic risks quantifiable? Typical real catastrophic risks and risk assessments based on modelling assumptions have been placed against each other in order to put the risks into proper perspective. However, society is risk-averse when there is a catastrophic potential of severe accidents in a large-scale industrial facility, even though the probability of occurrence is extremely low. (orig.) [de]

  16. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Full Text Available Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE method was used. Simulation work on water and heat processes in frozen soil in northern China during the 2012/2013 winter was conducted. Ensemble simulations through the Monte Carlo sampling method were generated for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance index criteria with respect to simulated soil water and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm. Posterior distributions for parameters related to soil hydraulic, radiation processes, and heat transport indicated that uncertainties in both input and model structures could influence model performance in modeling water and heat processes in seasonally frozen soils. Seasonal courses in water and energy partitioning were obvious during the winter. Within the day-cycle, soil evaporation/condensation and energy distributions were well captured and clarified as an important phenomenon in the dynamics of the energy balance system. The combination of the CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing–thawing.
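
    A compact sketch of the GLUE loop follows; the soil-temperature stand-in model, the parameter ranges, and the Nash-Sutcliffe likelihood with a 0.7 acceptance threshold are all illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(theta, t):
    # hypothetical stand-in for a simulated soil-temperature series at one depth
    return theta[0] + theta[1] * np.sin(2 * np.pi * t / 365.0)

t = np.arange(365.0)
obs = simulate((2.0, 9.0), t) + rng.normal(0, 1.0, t.size)  # synthetic observations

n = 5000                                        # Monte Carlo parameter sampling
thetas = np.column_stack([rng.uniform(-5, 10, n), rng.uniform(0, 20, n)])
sims = np.array([simulate(th, t) for th in thetas])

# informal likelihood: Nash-Sutcliffe efficiency of each candidate run
nse = 1.0 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()
behavioral = nse > 0.7                          # GLUE acceptance threshold
w = nse[behavioral] / nse[behavioral].sum()     # likelihood weights

band = np.percentile(sims[behavioral], [5, 95], axis=0)   # simple predictive band
print(f"{behavioral.sum()} behavioral runs, "
      f"posterior-mean amplitude = {np.sum(w * thetas[behavioral, 1]):.2f}, "
      f"mean 5-95% band width = {(band[1] - band[0]).mean():.2f}")
```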

  17. Environmental catastrophes under time-inconsistent preference

    Energy Technology Data Exchange (ETDEWEB)

    Michielsen, T.

    2013-02-15

    I analyze optimal natural resource use in an intergenerational model with the risk of a catastrophe. Each generation maximizes a weighted sum of discounted utility (positive) and the probability that a catastrophe will occur at any point in the future (negative). The model generates time inconsistency as generations disagree on the relative weights on utility and catastrophe prevention. As a consequence, future generations emit too much from the current generation's perspective and a dynamic game ensues. I consider a sequence of models. When the environmental problem is related to a scarce exhaustible resource, early generations have an incentive to reduce emissions in Markov equilibrium in order to enhance the ecosystem's resilience to future emissions. When the pollutant is expected to become obsolete in the near future, early generations may however increase their emissions if this reduces future emissions. When polluting inputs are abundant and expected to remain essential, the catastrophe becomes a self-fulfilling prophecy and the degree of concern for catastrophe prevention has limited or even no effect on equilibrium behaviour.

  18. Multi-fidelity numerical simulations of shock/turbulent-boundary layer interaction with uncertainty quantification

    Science.gov (United States)

    Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John

    2013-11-01

    We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES, 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2 . 05 , Reθ = 6 , 500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.

  19. On the governance of global and catastrophic risks

    DEFF Research Database (Denmark)

    Faber, Michael Havbro

    2011-01-01

    The focus of the present paper regards the identification and treatment of critical issues in the process of societal decision making concerning management of global and catastrophic risks. Taking basis in recent works by the author, the paper in particular addresses: 1) Which are the most relevant...... hazards in a holistic global perspective and how may these be categorised in view of strategies for their treatment?; 2) How might robust societal decisions on risk management subject to large uncertainties be formally supported?; 3) How may available economic resources be prioritised for the purpose...... of sustainable and global life safety and health improvements? Finally, new results and perspectives are presented on the issue of allocation of resources for the purpose of improving global public health and a discussion on global risk governance concludes the paper....

  20. Catastrophe medicine; Medecine de catastrophe

    Energy Technology Data Exchange (ETDEWEB)

    Lebreton, A. [Service Technique de l'Energie Electrique et des Grands Barrages (STEEGB) (France)]

    1996-12-31

    The 'Catastrophe Medicine' congress, which took place in Amiens (France) from December 5 to 7, 1996, was devoted to the assessment and management of risks and hazards in natural and artificial systems. The methods of risk evaluation and prediction were discussed in the context of dam accidents, with analysis of experience feedback and lessons gained from the organisation of emergency plans. Three round-table conferences were devoted to the importance of psychological aspects during such major crises. (J.S.)

  1. Axial and focal-plane diffraction catastrophe integrals

    International Nuclear Information System (INIS)

    Berry, M V; Howls, C J

    2010-01-01

    Exact expressions in terms of Bessel functions are found for some of the diffraction catastrophe integrals that decorate caustics in optics and mechanics. These are the axial and focal-plane sections of the elliptic and hyperbolic umbilic diffraction catastrophes, and symmetric elliptic and hyperbolic unfoldings of the X_9 diffraction catastrophes. These representations reveal unexpected relations between the integrals.

  2. Catastrophic Disruption Threshold and Maximum Deflection from Kinetic Impact

    Science.gov (United States)

    Cheng, A. F.

    2017-12-01

    The use of a kinetic impactor to deflect an asteroid on a collision course with Earth was described in the NASA Near-Earth Object Survey and Deflection Analysis of Alternatives (2007) as the most mature approach for asteroid deflection and mitigation. The NASA DART mission will demonstrate asteroid deflection by kinetic impact at the Potentially Hazardous Asteroid 65803 Didymos in October 2022. The kinetic impactor approach is considered to be applicable with warning times of 10 years or more and with hazardous asteroid diameters of 400 m or less. In principle, a larger kinetic impactor bringing greater kinetic energy could cause a larger deflection, but input of excessive kinetic energy will cause catastrophic disruption of the target, leaving possibly large fragments still on collision course with Earth. Thus the catastrophic disruption threshold limits the maximum deflection from a kinetic impactor. An often-cited rule of thumb states that the maximum deflection is 0.1 times the escape velocity before the target will be disrupted. It turns out this rule of thumb does not work well. A comparison to numerical simulation results shows that a similar rule applies in the gravity limit, for large targets of more than 300 m, where the maximum deflection is roughly the escape velocity at momentum enhancement factor β=2. In the gravity limit, the rule of thumb corresponds to pure momentum coupling (μ=1/3), but simulations find a slightly different scaling μ=0.43. In the smaller target size range that kinetic impactors would apply to, the catastrophic disruption limit is strength-controlled. A DART-like impactor will not disrupt any target asteroid down to sizes significantly smaller than the ~50 m below which a hazardous object would not penetrate the atmosphere in any case, unless it is unusually strong.
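
    The scalings quoted here are easy to put to numbers. The sketch below evaluates the kinetic-impact deflection delta_v = beta * (m/M) * U for a DART-like impactor and compares it with the target's escape velocity; the impactor mass and speed and the asteroid bulk density are assumptions for illustration, not mission values.

```python
import numpy as np

G = 6.674e-11                                  # gravitational constant, SI
rho = 2000.0                                   # assumed bulk density, kg/m^3

def delta_v(beta, m_imp, U, D):
    M = rho * np.pi / 6 * D**3                 # spherical target of diameter D
    return beta * m_imp * U / M                # momentum-enhanced deflection

def v_escape(D):
    M = rho * np.pi / 6 * D**3
    return np.sqrt(2 * G * M / (D / 2))

for D in [100.0, 300.0, 1000.0]:               # target diameter, m
    dv = delta_v(beta=2.0, m_imp=500.0, U=6000.0, D=D)   # DART-like impactor
    print(f"D={D:6.0f} m  delta_v={dv * 1e3:8.3f} mm/s  "
          f"v_esc={v_escape(D) * 1e2:7.2f} cm/s  ratio={dv / v_escape(D):.3f}")
```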

  3. Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes

    Science.gov (United States)

    Xu, Yangyang; Ramanathan, Veerabhadran

    2017-09-01

    The historic Paris Agreement calls for limiting global temperature rise to "well below 2 °C." Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high-impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to achieve zero net emissions of CO2, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100, to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend.

  4. Coronal Flux Rope Catastrophe Associated With Internal Energy Release

    Science.gov (United States)

    Zhuang, Bin; Hu, Youqiu; Wang, Yuming; Zhang, Quanhao; Liu, Rui; Gou, Tingyu; Shen, Chenglong

    2018-04-01

    Previous studies of the flux rope catastrophe have focused predominantly on magnetic energy, since it is believed to be the main energy supplier for solar eruptions. However, the contribution of other types of energy during the catastrophe cannot be neglected. This paper studies the catastrophe of a coronal flux rope system in the solar wind background, with emphasis on the transformation of different types of energy during the catastrophe. The coronal flux rope is characterized by its axial and poloidal magnetic fluxes and total mass. It is shown that a catastrophe can be triggered not only by an increase but also by a decrease of the axial magnetic flux. Moreover, the internal energy of the rope is found to be released during the catastrophe so as to provide energy for the upward eruption of the flux rope. As far as the magnetic energy is concerned, it provides only part of the energy release, or even increases during the catastrophe, so the internal energy may act as the dominant or even the sole energy supplier during the catastrophe.

  5. Energy catastrophes and energy consumption

    International Nuclear Information System (INIS)

    Davis, G.

    1991-01-01

    The possibility of catastrophes in the production of energy makes it difficult to estimate the true social costs of energy production. As a result, there is a distinct possibility that the private marginal cost curve of energy producers lies to the left or right of the true cost curve. If so, social welfare will not be maximized, and underconsumption or overconsumption of fuels will exist. The occurrence of energy catastrophes and observance of the market reaction to these occurrences indicate that overconsumption of energy has been the case in the past. Postulated market reactions to further energy catastrophes lead to the presumption that energy consumption levels remain above those that are socially optimal.

  6. Comparison of the uncertainties calculated for the results of radiochemical determinations using the law of propagation of uncertainty and a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Berne, A.

    2001-01-01

    Quantitative determinations of many radioactive analytes in environmental samples are based on a process in which several independent measurements of different properties are taken. The final results that are calculated using the data have to be evaluated for accuracy and precision. The estimate of the standard deviation, s, also called the combined standard uncertainty (CSU) associated with the result of this combined measurement can be used to evaluate the precision of the result. The CSU can be calculated by applying the law of propagation of uncertainty, which is based on the Taylor series expansion of the equation used to calculate the analytical result. The estimate of s can also be obtained from a Monte Carlo simulation. The data used in this simulation includes the values resulting from the individual measurements, the estimate of the variance of each value, including the type of distribution, and the equation used to calculate the analytical result. A comparison is made between these two methods of estimating the uncertainty of the calculated result. (author)
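    To make the comparison concrete, the sketch below applies both approaches to a hypothetical counting-based result of the form A = N/(ε·t); all values and uncertainties are invented for illustration and are not taken from the paper.

```python
# A minimal sketch comparing the law of propagation of uncertainty with a
# Monte Carlo estimate, for a hypothetical radiochemical result A = N/(eps*t).
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical measured values and standard uncertainties (illustrative only).
N,   u_N   = 12000.0, 110.0   # net counts
eps, u_eps = 0.32,    0.01    # detection efficiency
t,   u_t   = 3600.0,  1.0     # counting time, s

A = N / (eps * t)

# 1) Law of propagation of uncertainty (first-order Taylor expansion);
#    for a pure product/quotient the relative variances simply add.
u_taylor = A * np.sqrt((u_N / N) ** 2 + (u_eps / eps) ** 2 + (u_t / t) ** 2)

# 2) Monte Carlo: sample each input from its assumed (normal) distribution,
#    recompute the result, and take the standard deviation of the sample.
n = 100_000
A_mc = (rng.normal(N, u_N, n)
        / (rng.normal(eps, u_eps, n) * rng.normal(t, u_t, n)))
u_mc = A_mc.std(ddof=1)

print(f"A = {A:.4f} Bq;  u(Taylor) = {u_taylor:.4f};  u(MC) = {u_mc:.4f}")
```

    For a nearly linear model such as this one the two estimates agree closely; the Monte Carlo route becomes preferable when the measurement equation is strongly nonlinear or the input distributions are skewed.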

  7. An Uncertainty Structure Matrix for Models and Simulations

    Science.gov (United States)

    Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.

    2008-01-01

    Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.

  8. Assessing Fatigue and Ultimate Load Uncertainty in Floating Offshore Wind Turbines Due to Varying Simulation Length

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, G.; Lackner, M.; Haid, L.; Matha, D.; Jonkman, J.; Robertson, A.

    2013-07-01

    With the push towards siting wind turbines farther offshore due to higher wind quality and reduced visual impact, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but a number of new research questions need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper discusses the results of a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and addresses uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty are made and compared to current simulation length requirements.
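    As a toy illustration of the statistics involved (not the study's aeroelastic setup), the sketch below shows how the scatter of an extreme-load estimate depends on record length, using synthetic Gaussian records in place of simulated turbine loads; the sample rate and seed counts are assumptions.

```python
# Toy illustration: scatter of an ultimate-load proxy (the record maximum)
# across independent "simulations" of different lengths.
import numpy as np

rng = np.random.default_rng(0)
fs = 10            # sample rate, Hz (assumed)
n_seeds = 200      # independent records per simulation length

for minutes in (10, 30, 60):
    n = minutes * 60 * fs
    # One synthetic record per seed; the ultimate-load proxy is its maximum.
    maxima = rng.standard_normal((n_seeds, n)).max(axis=1)
    print(f"{minutes:2d} min: mean max = {maxima.mean():.3f}, "
          f"std across seeds = {maxima.std(ddof=1):.3f}")
```

    Longer records shift the expected maximum upward and shrink its seed-to-seed scatter, which is the trade-off the paper quantifies for actual floating-platform load channels.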

  9. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    International Nuclear Information System (INIS)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia

    2013-01-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. To avoid errors in the net area calculation, the simulated PHD is treated using the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)
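    The efficiency calculation described here reduces to dividing each net peak area by the number of photons the source emitted at that energy. A minimal sketch, with invented peak areas, live time, and source data rather than the paper's values:

```python
# Full-energy-peak efficiency from net peak areas and a certified activity.
# All numbers below are illustrative placeholders, not the paper's data.
import numpy as np

live_time = 10_000.0          # s
activity  = 5_000.0           # Bq, certified source activity (assumed)

# (energy in keV, gamma emission probability, net peak area in counts)
peaks = [
    (59.5,   0.359,  41_200.0),   # Am-241 (illustrative values)
    (661.7,  0.851,  55_900.0),   # Cs-137
    (1332.5, 0.9998, 31_400.0),   # Co-60
]

for energy, p_gamma, net_area in peaks:
    eff = net_area / (live_time * activity * p_gamma)
    # Counting statistics only; dead-layer and active-volume terms from the
    # uncertainty analysis above would be added on top of this.
    u_rel = 1.0 / np.sqrt(net_area)
    print(f"{energy:7.1f} keV: efficiency = {eff:.4e} ± {100*u_rel:.2f}% (stat.)")
```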

  10. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    Energy Technology Data Exchange (ETDEWEB)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose, E-mail: sergalbe@upv.es [Universitat Politecnica de Valencia, Valencia, (Spain). Instituto de Seguridad Industrial, Radiofisica y Medioambiental (ISIRYM); Ortiz, J. [Universitat Politecnica de Valencia, Valencia, (Spain). Servicio de Radiaciones. Lab. de Radiactividad Ambiental; Pereira, Claubia [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2013-07-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. To avoid errors in the net area calculation, the simulated PHD is treated using the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)

  11. Assessment of groundwater level estimation uncertainty using sequential Gaussian simulation and Bayesian bootstrapping

    Science.gov (United States)

    Varouchakis, Emmanouil; Hristopulos, Dionissios

    2015-04-01

    Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs

  12. Optimizing Grippers for Compensating Pose Uncertainties by Dynamic Simulation

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Kramberger, Aljaž; Gams, Andrej

    2017-01-01

    Gripper design is one of the interesting challenges in the context of grasping within industry. Typically, simple parallel-finger grippers, which are easy to install and maintain, are used in platforms for robotic grasping. The context switches in these platforms require frequent exchanges of the gripper fingers. In previous work, we presented a method to automatically compute the optimal finger shapes for defined task contexts in simulation. In this paper, we show the performance of our method in an industrial grasping scenario. We first analyze the uncertainties of the vision system used, which are the major source of pose uncertainty.

  13. Measurement uncertainty of dissolution test of acetaminophen immediate release tablets using Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Daniel Cancelli Romero

    2017-10-01

    ABSTRACT: Analytical results are widely used to assess batch-by-batch conformity and pharmaceutical equivalence, as well as in the development of drug products. Despite this, few papers describing the estimation of the measurement uncertainty associated with these results were found in the literature. Here, we describe a simple procedure used for estimating the measurement uncertainty associated with the dissolution test of acetaminophen tablets. A fractional factorial design was used to define a mathematical model that explains the amount of acetaminophen dissolved (%) as a function of dissolution time (from 20 to 40 minutes), volume of dissolution media (from 800 to 1000 mL), pH of dissolution media (from 2.0 to 6.8), and rotation speed (from 40 to 60 rpm). Using Monte Carlo simulations, we estimated the measurement uncertainty for the dissolution test of acetaminophen tablets (95.2 ± 1.0%), with a 95% confidence level. Rotation speed was the most important source of uncertainty, contributing about 96.2% of the overall uncertainty. Finally, it is important to note that the uncertainty calculated in this paper reflects the expected uncertainty of the dissolution test, and does not consider variations in the content of acetaminophen.
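    The Monte Carlo step is straightforward to sketch: sample each test condition from its assumed distribution, push the samples through the fitted response model, and read off the spread of the output. The model coefficients below are placeholders, not the fitted values from the paper's factorial design.

```python
# Monte Carlo propagation of test-condition uncertainty through a
# hypothetical first-order dissolution model (coefficients are fictitious).
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Assumed operating points and standard uncertainties of the test conditions.
time_  = rng.normal(30.0,  0.5,  n)   # dissolution time, min
volume = rng.normal(900.0, 5.0,  n)   # media volume, mL
pH     = rng.normal(5.8,   0.05, n)
rpm    = rng.normal(50.0,  1.0,  n)   # rotation speed

def dissolved_pct(t, v, ph, w):
    # Placeholder linear model standing in for the factorial-design fit.
    return 70.0 + 0.5 * t + 0.01 * v + 0.3 * ph + 0.02 * w

y = dissolved_pct(time_, volume, pH, rpm)
lo, hi = np.percentile(y, [2.5, 97.5])
print(f"dissolved = {y.mean():.1f}%, 95% interval [{lo:.1f}, {hi:.1f}]")
```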

  14. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    Science.gov (United States)

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  15. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    Science.gov (United States)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subjected to severe particulate matter pollution. Thus, improving the methods of determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China, were used as the experimental dataset. Based on the STSIS procedures, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance is obtained with the STSIS method.

  16. Socio-economic consequences of Chernobyl catastrophe. Social protection of the citizens, affected owing to Chernobyl catastrophe

    International Nuclear Information System (INIS)

    Kholosha, V.; Kovalchuk, V.

    2003-01-01

    The accident at the Chernobyl NPP has affected the destiny of 35 million people in Ukraine. The social protection of the population affected by the Chernobyl catastrophe is founded on the Law of Ukraine 'About the status and social protection of citizens affected owing to Chernobyl catastrophe' (hereafter, the 'Law'), and is the principal direction of activity and the subject of special state attention within the total complex of problems connected with the elimination of the consequences of the Chernobyl catastrophe. The current legislation stipulates partial compensation of material losses connected with the resettlement of the affected population. According to the current legislation in Ukraine, about 50 kinds of aid, privileges and compensations are provided to the affected citizens.

  17. An Indirect Simulation-Optimization Model for Determining Optimal TMDL Allocation under Uncertainty

    Directory of Open Access Journals (Sweden)

    Feng Zhou

    2015-11-01

    An indirect simulation-optimization model framework with enhanced computational efficiency and risk-based decision-making capability was developed to determine optimal total maximum daily load (TMDL) allocation under uncertainty. To convert the traditional direct simulation-optimization model into our indirect equivalent model framework, we proposed a two-step strategy: (1) application of interval regression equations derived by a Bayesian recursive regression tree (BRRT v2) algorithm, which approximates the original hydrodynamic and water-quality simulation models and accurately quantifies the inherent nonlinear relationship between nutrient load reductions and the credible interval of algal biomass with a given confidence interval; and (2) incorporation of the calibrated interval regression equations into an uncertain optimization framework, which is further converted to our indirect equivalent framework by the enhanced-interval linear programming (EILP) method and provides approximate-optimal solutions at various risk levels. The proposed strategy was applied to the Swift Creek Reservoir's nutrient TMDL allocation (Chesterfield County, VA) to identify the minimum nutrient load allocations required from eight sub-watersheds to ensure compliance with user-specified chlorophyll criteria. Our results indicated that the BRRT-EILP model could identify critical sub-watersheds faster than the traditional approach and requires lower reductions of nutrient loadings compared to traditional stochastic simulation and trial-and-error (TAE) approaches. This suggests that our proposed framework performs better in optimal TMDL development compared to traditional simulation-optimization models and provides extreme and non-extreme tradeoff analysis under uncertainty for risk-based decision making.

  18. Multiple Sclerosis and Catastrophic Health Expenditure in Iran.

    Science.gov (United States)

    Juyani, Yaser; Hamedi, Dorsa; Hosseini Jebeli, Seyede Sedighe; Qasham, Maryam

    2016-09-01

    There are many disabling medical conditions which can result in catastrophic health expenditure. Multiple sclerosis is one of the most costly medical conditions in the world, exposing families to catastrophic health expenditures. This study aims to investigate to what extent multiple sclerosis patients face catastrophic costs. The study was carried out in Ahvaz, Iran (2014). The study population included households in which at least one member suffers from MS. To analyze the data, a logit regression model was employed using the software Stata 12. 3.37% of families faced catastrophic costs. Important variables including brand of drug, housing, income and health insurance were significantly correlated with catastrophic expenditure. This study suggests that although a small proportion of MS patients met the catastrophic health expenditure threshold, mechanisms that pool risk and cost (e.g. health insurance) are required to protect them and improve financial and access equity in health care.

  19. Outcome and value uncertainties in global-change policy

    International Nuclear Information System (INIS)

    Hammitt, J.K.

    1995-01-01

    Choices among environmental policies can be informed by analysis of the potential physical, biological, and social outcomes of alternative choices, and analysis of social preferences among these outcomes. Frequently, however, the consequences of alternative policies cannot be accurately predicted because of substantial outcome uncertainties concerning physical, chemical, biological, and social processes linking policy choices to consequences. Similarly, assessments of social preferences among alternative outcomes are limited by value uncertainties arising from limitations of moral principles, the absence of economic markets for many environmental attributes, and other factors. Outcome and value uncertainties relevant to global-change policy are described and their magnitudes are examined for two cases: stratospheric-ozone depletion and global climate change. Analysis of information available in the mid 1980s, when international ozone regulations were adopted, suggests that contemporary uncertainties surrounding CFC emissions and the atmospheric response were so large that plausible ozone depletion, absent regulation, ranged from negligible to catastrophic, a range that exceeded the plausible effect of the regulations considered. Analysis of climate change suggests that, important as outcome uncertainties are, uncertainties about values may be even more important for policy choice. 53 refs., 3 figs., 3 tabs

  20. Hydrological simulation and uncertainty analysis using the improved TOPMODEL in the arid Manas River basin, China.

    Science.gov (United States)

    Xue, Lianqing; Yang, Fan; Yang, Changbing; Wei, Guanghui; Li, Wenqian; He, Xinlin

    2018-01-11

    Understanding the mechanisms of complicated hydrological processes is important for the sustainable management of water resources in arid areas. This paper presents simulations of water movement in the Manas River Basin (MRB) using an improved semi-distributed topographic hydrologic model (TOPMODEL) with a snowmelt model and a topographic index algorithm. A new algorithm is proposed to calculate the topographic index curve using an internal tangent circle on a conical surface. Building on the traditional model, an improved temperature indicator that accounts for solar radiation is used to calculate the amount of snowmelt. The uncertainty of the TOPMODEL parameters was analyzed using the generalized likelihood uncertainty estimation (GLUE) method. The proposed model shows that the distribution of the topographic index is concentrated in high mountains, and the accuracy of the runoff simulation is somewhat enhanced by considering radiation. Our results reveal that the performance of the improved TOPMODEL is acceptable for runoff simulation in the MRB. The uncertainty of the simulations results from the model parameters and structure, as well as from climatic and anthropogenic factors. This study is expected to serve as a valuable complement to the wide application of TOPMODEL and to help identify the mechanisms of hydrological processes in arid areas.
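    GLUE itself is simple to sketch: sample parameter sets from a prior range, score each simulation with a likelihood measure such as the Nash-Sutcliffe efficiency, keep the "behavioural" sets above a threshold, and read uncertainty bounds off the retained ensemble. A minimal sketch with a stand-in one-parameter model (not TOPMODEL); forcing, threshold, and ranges are invented:

```python
# A compact GLUE sketch: Monte Carlo parameter sampling, a Nash-Sutcliffe
# likelihood, a behavioural threshold, and ensemble uncertainty bounds.
import numpy as np

rng = np.random.default_rng(1)

def model(param, forcing):
    # Placeholder for the rainfall-runoff model (here: a simple linear gain).
    return param * forcing

forcing  = rng.gamma(2.0, 2.0, 365)                       # synthetic forcing
observed = model(0.6, forcing) + rng.normal(0, 0.5, 365)  # synthetic "truth"

def nse(sim, obs):
    # Nash-Sutcliffe efficiency, used as the informal GLUE likelihood.
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

params = rng.uniform(0.1, 1.5, 5_000)        # sample the prior range
sims   = np.array([model(p, forcing) for p in params])
scores = np.array([nse(s, observed) for s in sims])

behavioural = scores > 0.5                   # GLUE acceptance threshold
ens = sims[behavioural]

# 5-95% uncertainty band of the behavioural ensemble, per time step.
lower, upper = np.percentile(ens, [5, 95], axis=0)
print(f"{behavioural.sum()} behavioural sets; "
      f"band width on day 1: {upper[0] - lower[0]:.2f}")
```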

  1. DOWNWARD CATASTROPHE OF SOLAR MAGNETIC FLUX ROPES

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Quanhao; Wang, Yuming; Hu, Youqiu; Liu, Rui, E-mail: zhangqh@mail.ustc.edu.cn [CAS Key Laboratory of Geospace Environment, Department of Geophysics and Planetary Sciences, University of Science and Technology of China, Hefei 230026 (China)

    2016-07-10

    2.5-dimensional time-dependent ideal magnetohydrodynamic (MHD) models in Cartesian coordinates were used in previous studies to seek MHD equilibria involving a magnetic flux rope embedded in a bipolar, partially open background field. As demonstrated by these studies, the equilibrium solutions of the system are separated into two branches: the flux rope sticks to the photosphere for solutions at the lower branch but is suspended in the corona for those at the upper branch. Moreover, a solution originally at the lower branch jumps to the upper, as the related control parameter increases and reaches a critical value, and the associated jump is here referred to as an upward catastrophe. The present paper advances these studies in three aspects. First, the magnetic field is changed to be force-free; the system still experiences an upward catastrophe with an increase in each control parameter. Second, under the force-free approximation, there also exists a downward catastrophe, characterized by the jump of a solution from the upper branch to the lower. Both catastrophes are irreversible processes connecting the two branches of equilibrium solutions so as to form a cycle. Finally, the magnetic energy in the numerical domain is calculated. It is found that there exists a magnetic energy release for both catastrophes. The Ampère's force, which vanishes everywhere for force-free fields, appears only during the catastrophes and does positive work, which serves as a major mechanism for the energy release. The implications of the downward catastrophe and its relevance to solar activities are briefly discussed.

  2. DOWNWARD CATASTROPHE OF SOLAR MAGNETIC FLUX ROPES

    International Nuclear Information System (INIS)

    Zhang, Quanhao; Wang, Yuming; Hu, Youqiu; Liu, Rui

    2016-01-01

    2.5-dimensional time-dependent ideal magnetohydrodynamic (MHD) models in Cartesian coordinates were used in previous studies to seek MHD equilibria involving a magnetic flux rope embedded in a bipolar, partially open background field. As demonstrated by these studies, the equilibrium solutions of the system are separated into two branches: the flux rope sticks to the photosphere for solutions at the lower branch but is suspended in the corona for those at the upper branch. Moreover, a solution originally at the lower branch jumps to the upper, as the related control parameter increases and reaches a critical value, and the associated jump is here referred to as an upward catastrophe. The present paper advances these studies in three aspects. First, the magnetic field is changed to be force-free; the system still experiences an upward catastrophe with an increase in each control parameter. Second, under the force-free approximation, there also exists a downward catastrophe, characterized by the jump of a solution from the upper branch to the lower. Both catastrophes are irreversible processes connecting the two branches of equilibrium solutions so as to form a cycle. Finally, the magnetic energy in the numerical domain is calculated. It is found that there exists a magnetic energy release for both catastrophes. The Ampère's force, which vanishes everywhere for force-free fields, appears only during the catastrophes and does positive work, which serves as a major mechanism for the energy release. The implications of the downward catastrophe and its relevance to solar activities are briefly discussed.

  3. Treatment simulation approaches for the estimation of the distributions of treatment quality parameters generated by geometrical uncertainties

    International Nuclear Information System (INIS)

    Baum, C; Alber, M; Birkner, M; Nuesslin, F

    2004-01-01

    Geometric uncertainties arise during treatment planning and treatment, and mean that dose-dependent parameters such as EUD are random variables with a patient-specific probability distribution. Treatment planning with highly conformal treatment techniques such as intensity modulated radiation therapy requires new evaluation tools which allow us to estimate the influence of geometrical uncertainties on the probable treatment dose for a planned dose distribution. Monte Carlo simulations of treatment courses with recalculation of the dose according to the daily geometric errors are a gold standard for such an evaluation. Distribution histograms, which show the relative frequency of a treatment quality parameter in the treatment simulations, can be used to evaluate the potential risks and chances of a planned dose distribution. As treatment simulations with dose recalculation are very time consuming for sufficient statistical accuracy, it is proposed to perform treatment simulations in the dose parameter space, where the result is mainly determined by the systematic and random components of the geometrical uncertainties. Comparison of the parameter-space simulation method with the gold standard for prostate cases and a head and neck case shows good agreement, as long as the number of fractions is high enough and the influence of tissue inhomogeneities and surface curvature on the dose is small.
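    The core of such a treatment-course simulation is drawing a systematic setup error once per course and a random error per fraction. A toy one-dimensional version, with invented error magnitudes and a Gaussian stand-in for the planned dose profile (not the paper's dose model):

```python
# Toy Monte Carlo of treatment courses under setup errors: systematic error
# per course, random error per fraction, accumulated target dose per course.
import numpy as np

rng = np.random.default_rng(3)

def dose(x, width=25.0):
    # Planned 1-D dose profile (arbitrary units), target nominally at x = 0.
    return np.exp(-0.5 * (x / width) ** 2)

n_courses, n_fractions = 2_000, 30
sigma_systematic, sigma_random = 2.5, 3.0   # mm (assumed)

sys_err  = rng.normal(0.0, sigma_systematic, n_courses)
rand_err = rng.normal(0.0, sigma_random, (n_courses, n_fractions))

# Accumulated target dose per simulated course (mean over fractions).
course_dose = dose(sys_err[:, None] + rand_err).mean(axis=1)

print(f"mean target dose = {course_dose.mean():.3f} (planned 1.000), "
      f"5th percentile = {np.percentile(course_dose, 5):.3f}")
```

    The histogram of `course_dose` is exactly the kind of distribution histogram the abstract describes: it shows how likely a given level of dose degradation is under the assumed geometric errors.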

  4. Catastrophic Antiphospholipid Syndrome

    Directory of Open Access Journals (Sweden)

    Rawhya R. El-Shereef

    2016-01-01

    This paper reports the case of a successfully treated patient suffering from a rare entity, the catastrophic antiphospholipid syndrome (CAPS). The management of this patient is discussed in detail.

  5. 1.5 °C? - Solutions for avoiding catastrophic climate change in this century

    Science.gov (United States)

    Xu, Y.

    2017-12-01

    The historic Paris Agreement calls for limiting global temperature rise to "well below 2 °C." Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high-impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to achieve zero net emissions of CO2, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100, to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend. In addition to presenting this analysis, I will also share (1) a perspective on developed- and developing-world actions and interactions on climate solutions; and (2) Prof. V. Ramanathan's interactions with the Pontifical Academy of Sciences and other religious groups, which are highly valuable to the interdisciplinary audience.

  6. Catastrophe theory and its application status in mechanical engineering

    Directory of Open Access Journals (Sweden)

    Jinge LIU

    Catastrophe theory is a mathematical method which aims to describe and interpret discontinuous phenomena. Since its emergence, it has been widely used to explain a variety of emergent phenomena in the fields of natural science, social science, management science and other science and technology fields. Firstly, this paper introduces catastrophe theory in several aspects, such as its origin, underlying principle, basic characteristics and development. Secondly, it summarizes the main applications of catastrophe theory in the field of mechanical engineering, focusing on the research progress of catastrophe theory in revealing catastrophes of rotor vibration states, analyzing friction and wear failure, predicting metal fracture, and so on. Finally, it advises that the later development of catastrophe theory should pay more attention to combining it with other traditional nonlinear theories and methods. This paper provides a beneficial reference to guide the application of catastrophe theory in mechanical engineering and related fields for later research.
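    The canonical example behind most such applications is the cusp catastrophe, with potential V(x) = x⁴/4 + a·x²/2 + b·x: equilibria solve x³ + a·x + b = 0, and discontinuous jumps occur when the control parameters (a, b) cross the fold set 4a³ + 27b² = 0. A minimal numerical sketch (illustrative, not from the paper):

```python
# Locating the bistable region of the cusp catastrophe numerically.
# Equilibria are real roots of dV/dx = x^3 + a*x + b = 0; two stable states
# coexist exactly where 4a^3 + 27b^2 < 0.
import numpy as np

def equilibria(a, b):
    roots = np.roots([1.0, 0.0, a, b])
    return np.sort(roots[np.abs(roots.imag) < 1e-9].real)

a = -1.0  # a < 0 opens the cusp region
for b in (-0.5, -0.2, 0.0, 0.2, 0.5):
    bistable = 4 * a**3 + 27 * b**2 < 0
    print(f"b = {b:+.1f}: equilibria {equilibria(a, b)} "
          f"({'bistable' if bistable else 'single state'})")
```

    Sweeping b slowly through the fold set makes the state x jump discontinuously, which is the basic mechanism invoked when catastrophe theory is applied to, e.g., sudden changes in rotor vibration state.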

  7. Extensional rheometer based on viscoelastic catastrophes outline

    DEFF Research Database (Denmark)

    2014-01-01

    The present invention relates to a method and a device for determining the viscoelastic properties of a fluid. The invention resides inter alia in the generation of viscoelastic catastrophes in confined systems for use in the context of extensional rheology. The viscoelastic catastrophe is, according to the invention, generated in a bistable fluid system, and the flow conditions at which the catastrophe occurs can be used as a fingerprint of the fluid's viscoelastic properties in extensional flow.

  8. Sensitivity analysis and uncertainties simulation of the migration of radionuclide in the system of geological disposal-CRP-GEORC model

    International Nuclear Information System (INIS)

    Su Rui; Wang Ju; Chen Weiming; Zong Zihua; Zhao Honggang

    2008-01-01

    The CRP-GEORC conceptual model is an artificial geological disposal system for high-level radioactive waste. Sensitivity analysis and uncertainty simulation of the migration of the radionuclides Se-79 and I-129 in the far field of this system have been conducted using the GoldSim code. The simulation results show that the variables describing the geological features and the characteristics of groundwater flow are sensitive variables of the whole geological disposal system. The uncertainties of the parameters have a remarkable influence on the simulation results. (authors)

  9. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada

    Directory of Open Access Journals (Sweden)

    Magali Troin

    2015-11-01

    An analysis of the hydrological response of a multi-model approach based on an ensemble of seven snow models (SM; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HM) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961-2000 historical period. The multi-model approach also allows evaluating the uncertainties associated with the structure of the SM-HM ensemble to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results of specific hydrologic indicators show that the uncertainty related to the choice of the HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a low degree of uncertainty to the total uncertainty in hydrological modeling for snow hydrology studies.

  10. Parameterization and Uncertainty Analysis of SWAT model in Hydrological Simulation of Chaohe River Basin

    Science.gov (United States)

    Jie, M.; Zhang, J.; Guo, B. B.

    2017-12-01

    As a typical distributed hydrological model, the SWAT model faces challenges in parameter calibration and uncertainty analysis. This paper chooses the Chaohe River Basin, China, as the study area. Through the establishment of the SWAT model and the loading of DEM data of the Chaohe river basin, the watershed is automatically divided into several sub-basins. Land use, soil and slope are analyzed on the basis of the sub-basins, and the hydrological response units (HRUs) of the study area are calculated; after running the SWAT model, the simulated runoff values in the watershed are obtained. On this basis, weather data and the known daily runoff of three hydrological stations, combined with the SWAT-CUP automatic program and a manual adjustment method, are used for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. Through the sensitivity analysis, calibration and uncertainty study of SWAT, the results indicate that the parameterization of the hydrological characteristics of the Chaohe river is successful and feasible, and the model can be used to simulate the Chaohe river basin.

  11. Evaluation of global fine-resolution precipitation products and their uncertainty quantification in ensemble discharge simulations

    Science.gov (United States)

    Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.

    2016-02-01

    The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are the Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and that GSMAP-MVK+ shows a clear advantage over TRMM3B42 in terms of relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute to discharge uncertainty with a magnitude similar to that of the hydrological models.

  12. 48th Annual Meeting on Nuclear Technology (AMNT 2017). Key topic / Enhanced safety and operation excellence. Focus session: Uncertainty analyses in reactor core simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zwermann, Winfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany). Forschungszentrum

    2017-12-15

    The supplementation of reactor simulations with uncertainty analyses is becoming increasingly important internationally, since the reliability of simulation calculations can be significantly increased by the quantification of uncertainties, in comparison to the use of so-called conservative methods (BEPU, "Best-Estimate Plus Uncertainties"). While systematic uncertainty analyses for thermal-hydraulic calculations have been performed routinely for a long time, methods for taking into account uncertainties in nuclear data, which are the basis for neutron transport calculations, are under development. The Focus Session Uncertainty Analyses in Reactor Core Simulations was intended to provide an overview of international research and development on supplementing reactor core simulations with uncertainty and sensitivity analyses, in research institutes as well as within the nuclear industry. The presented analyses focused not only on light water reactors, but also on advanced reactor systems. Particular emphasis was put on international benchmarks in the field. The session was chaired by Winfried Zwermann (Gesellschaft fuer Anlagen- und Reaktorsicherheit).

  13. PARAMETRIC INSURANCE COVER FOR NATURAL CATASTROPHE RISKS

    Directory of Open Access Journals (Sweden)

    Serghei Margulescu

    2013-11-01

    With economic losses of over USD 370 bn caused by 325 catastrophic events, 2011 ranks as the worst ever year in terms of costs to society due to natural catastrophes and man-made disasters. At the same time, 2011 is the second most expensive year in history for the insurance industry, with insured losses from catastrophic events amounting to USD 116 bn. Both the high level of damages and insured losses, as well as the unprecedented gap between the two values, made insurers and reinsurers worldwide understand that some risks had so far been underestimated and have to be better integrated into catastrophe modelling. On the other hand, governments have to protect themselves against the financial impact of natural catastrophes, and new forms of cooperation between the public and private sectors can help countries finance disaster risks. Viewed in a country's wider risk management context, the purchase of parametric insurance cover, which transfers natural catastrophe risk to the private sector using an index-based trigger, is a necessary shift towards a pre-emptive risk management strategy. This kind of approach can be pursued by central governments or at the level of provincial or municipal governments, and a number of case studies included in the publication "Closing the financial gap" by Swiss Re (2011) illustrate how new forms of parametric insurance can help countries finance disaster risks.
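    An index-based trigger means the payout is a function of a measured hazard index alone, not of adjusted losses. A minimal sketch with invented attachment and exhaustion levels (an earthquake-magnitude index is assumed purely for illustration):

```python
# A sketch of an index-based (parametric) payout with a linear ramp between
# an attachment level and an exhaustion level. All thresholds are invented.
def parametric_payout(index_value, attachment=7.0, exhaustion=8.5,
                      limit=100e6):
    """Payout as a function of a measured hazard index.

    index_value: e.g. earthquake magnitude or peak wind speed at a station.
    attachment:  index level at which payments begin (assumed).
    exhaustion:  index level at which the full limit is paid (assumed).
    limit:       maximum payout in currency units.
    """
    if index_value <= attachment:
        return 0.0
    if index_value >= exhaustion:
        return limit
    return limit * (index_value - attachment) / (exhaustion - attachment)

for magnitude in (6.8, 7.4, 8.0, 8.7):
    print(f"M{magnitude}: payout = {parametric_payout(magnitude)/1e6:.1f} M")
```

    The design trade-off is basis risk: because the payout tracks the index rather than actual losses, settlement is fast and transparent, but the amount received may deviate from the damage actually suffered.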

  14. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    Science.gov (United States)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

    Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed, or will continue to change, in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy data through to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed from anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor

  15. Uncertainty quantification tools for multiphase gas-solid flow simulations using MFIX

    Energy Technology Data Exchange (ETDEWEB)

    Fox, Rodney O. [Iowa State Univ., Ames, IA (United States); Passalacqua, Alberto [Iowa State Univ., Ames, IA (United States)

    2016-02-01

    Computational fluid dynamics (CFD) has been widely studied and used in the scientific community and in industry. Various models have been proposed to solve problems in different areas. However, all models deviate from reality. The uncertainty quantification (UQ) process evaluates the overall uncertainties associated with the prediction of quantities of interest. In particular, it studies the propagation of input uncertainties to the outputs of the models, so that confidence intervals can be provided for the simulation results. In the present work, a non-intrusive quadrature-based uncertainty quantification (QBUQ) approach is proposed. The probability distribution function (PDF) of the system response can then be reconstructed using the extended quadrature method of moments (EQMOM) and the extended conditional quadrature method of moments (ECQMOM). The report first explains the theory of the QBUQ approach, including methods to generate samples for problems with single or multiple uncertain input parameters, low-order statistics, and the required number of samples. Then methods for univariate PDF reconstruction (EQMOM) and multivariate PDF reconstruction (ECQMOM) are explained. The implementation of the QBUQ approach into the open-source CFD code MFIX is discussed next. Finally, the QBUQ approach is demonstrated in several applications. The method is first applied to two examples: a developing flow in a channel with uncertain viscosity, and an oblique shock problem with uncertain upstream Mach number. The error in the prediction of the moment response is studied as a function of the number of samples, and the accuracy of the moments required to reconstruct the PDF of the system response is discussed. The QBUQ approach is then demonstrated by considering a bubbling fluidized bed as an example application. The mean particle size is assumed to be the uncertain input parameter. The system is simulated with a standard two-fluid model with kinetic theory closures for the particulate phase implemented into
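    The non-intrusive idea is that a handful of deterministic solver runs at quadrature nodes suffices to recover the response moments. A minimal sketch in that spirit, with one normally distributed uncertain parameter and a stand-in for the CFD run (the viscosity values are invented):

```python
# Non-intrusive quadrature-based moment propagation: run the model at
# Gauss-Hermite nodes of the uncertain input, combine with the weights.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss   # probabilists' Hermite

def model(viscosity):
    # Stand-in for a CFD run returning a scalar quantity of interest.
    return 1.0 / viscosity

mu, sigma = 1.0e-3, 1.0e-4     # uncertain viscosity: mean and std (assumed)

nodes, weights = hermegauss(5)           # 5-point Gauss-Hermite rule
weights = weights / weights.sum()        # normalize to a probability measure
samples = mu + sigma * nodes             # map standard nodes to the input

q = np.array([model(v) for v in samples])
mean = np.dot(weights, q)
var = np.dot(weights, (q - mean) ** 2)
print(f"E[Q] = {mean:.2f}, std[Q] = {np.sqrt(var):.2f}")
```

    The EQMOM/ECQMOM step described in the abstract goes further, reconstructing a full PDF from such moments; the sketch above only illustrates the quadrature-based sampling and moment recovery.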

  16. Applications of modelling historical catastrophic events with implications for catastrophe risk management

    Science.gov (United States)

    Sorby, A.; Grossi, P.; Pomonis, A.; Williams, C.; Nyst, M.; Onur, T.; Seneviratna, P.; Baca, A.

    2009-04-01

    The management of catastrophe risk is concerned with the quantification of financial losses, and their associated probabilities, for potential future catastrophes that might impact a region. Modelling historical catastrophe events and, in particular, the potential consequences if a similar event were to occur at the present day can provide insight to help bridge the gap between what we know can happen from historical experience and what potential losses might be out there in the "universe" of potential catastrophes. The 1908 Messina Earthquake (and accompanying local tsunami) was one of the most destructive earthquakes to have occurred in Europe and by most accounts remains Europe's most fatal, with over 70,000 casualties estimated. However, what would the potential consequences be, in terms of financial and human losses, if a similar earthquake were to occur at the present day? Exposures, building stock and populations change over time, and the consequences of a similar earthquake at the present day may therefore differ appreciably from those observed in 1908. The city of Messina has been reconstructed several times in its history, including following the 1908 earthquake and again following the Second World War. The 1908 earthquake prompted the introduction of the first seismic design regulations in Italy, and since 1909 parts of the Messina and Calabria regions have been in the zones of highest seismic coefficient. Utilizing commercial catastrophe loss modelling technology - which combines the modelling of hazard, vulnerability, and financial losses on a database of property exposures - a modelled earthquake scenario of M7.2 in the Messina Straits region of Southern Italy is considered. This modelled earthquake is used to assess the potential consequences in terms of financial losses that an earthquake similar to the 1908 earthquake might have if it were to occur at the present day. Loss results are discussed in the context of applications for the financial

  17. Uncertainty and sensitivity analysis in the neutronic parameters generation for BWR and PWR coupled thermal-hydraulic–neutronic simulations

    International Nuclear Information System (INIS)

    Ánchel, F.; Barrachina, T.; Miró, R.; Verdú, G.; Juanas, J.; Macián-Juan, R.

    2012-01-01

    Highlights: ► Best-estimate codes are affected by uncertainty in their methods and models. ► Influence of the uncertainty in the macroscopic cross sections on BWR and PWR RIA accident analyses. ► The fast diffusion coefficient, the scattering cross section and both fission cross sections are the most influential factors. ► The absorption cross sections have very little influence. ► Using a normal pdf, the results are more "conservative" in terms of the power peak reached than with uncertainty quantified with a uniform pdf. - Abstract: Best-estimate analysis consists of a coupled thermal-hydraulic and neutronic description of the nuclear system's behavior; uncertainties from both aspects should be included and jointly propagated. This paper presents a study of the influence of the uncertainty in the macroscopic neutronic information that describes a three-dimensional core model on the most relevant results of the simulation of a Reactivity Induced Accident (RIA). The analyses of a BWR-RIA and a PWR-RIA have been carried out with a three-dimensional thermal-hydraulic and neutronic model for the coupled systems TRACE-PARCS and RELAP-PARCS. The cross-section information has been generated by the SIMTAB methodology, based on the joint use of CASMO-SIMULATE. The statistically based methodology performs a Monte Carlo sampling of the uncertainty in the macroscopic cross sections. The size of the sample is determined by the characteristics of the tolerance intervals, applying the Noether-Wilks formulas. A number of simulations equal to the sample size have been carried out, in which the cross sections used by PARCS are directly modified with uncertainty, and non-parametric statistical methods are applied to the resulting sample of output variable values to determine their tolerance intervals.
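    The Wilks-type sample sizes behind such tolerance intervals follow from order statistics alone and are easy to verify. A small check of the standard first-order 95%/95% values (coverage γ, confidence β):

```python
# Sample sizes for non-parametric (Wilks) tolerance intervals, as used to
# size the Monte Carlo cross-section sampling in BEPU-style analyses.
import math

def wilks_one_sided(gamma=0.95, beta=0.95):
    # Smallest n with P(max of n runs bounds the gamma-quantile) >= beta,
    # i.e. 1 - gamma**n >= beta.
    return math.ceil(math.log(1.0 - beta) / math.log(gamma))

def wilks_two_sided(gamma=0.95, beta=0.95):
    # Smallest n with 1 - gamma**n - n*(1-gamma)*gamma**(n-1) >= beta.
    n = 2
    while 1.0 - gamma**n - n * (1.0 - gamma) * gamma ** (n - 1) < beta:
        n += 1
    return n

print(wilks_one_sided())  # 59 runs for a one-sided 95%/95% bound
print(wilks_two_sided())  # 93 runs for a two-sided 95%/95% interval
```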

  18. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Science.gov (United States)

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is always combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models, a result of their complexity, it is always infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
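    Latin hypercube sampling addresses exactly this cost problem by stratifying each parameter's range so that far fewer runs cover the input space evenly. A self-contained sketch of the sampler (the mapping to actual model parameters via inverse CDFs is left as a comment):

```python
# A self-contained Latin hypercube sampler: each of the d parameter ranges
# is split into n equal-probability strata, and each stratum is sampled
# exactly once, with the strata shuffled independently per dimension.
import numpy as np

def latin_hypercube(n, d, rng=None):
    rng = np.random.default_rng(rng)
    # One uniform draw inside each stratum...
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    # ...then shuffle the strata independently per dimension.
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u  # samples in [0, 1]^d; map through inverse CDFs as needed

samples = latin_hypercube(n=10, d=3, rng=42)
print(samples.round(2))
```

    With n runs, every marginal distribution is covered by construction, which is why a few dozen LHS runs can stand in for the hundreds of plain Monte Carlo runs a spatially explicit forest model cannot afford.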

  19. Academic Training: Predicting Natural Catastrophes

    CERN Multimedia

    Françoise Benz

    2005-01-01

    2005-2006 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 12, 13, 14, 15, 16 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 Predicting Natural Catastrophes E. OKAL / Northwestern University, Evanston, USA 1. Tsunamis -- Introduction Definition of phenomenon - basic properties of the waves Propagation and dispersion Interaction with coasts - Geological and societal effects Origin of tsunamis - natural sources Scientific activities in connection with tsunamis. Ideas about simulations 2. Tsunami generation The earthquake source - conventional theory The earthquake source - normal mode theory The landslide source Near-field observation - The Plafker index Far-field observation - Directivity 3. Tsunami warning General ideas - History of efforts Mantle magnitudes and TREMOR algorithms The challenge of 'tsunami earthquakes' Energy-moment ratios and slow earthquakes Implementation and the components of warning centers 4. Tsunami surveys Principles and methodologies Fifteen years of field surveys and re...

  20. Manipulation of pain catastrophizing: An experimental study of healthy participants

    Directory of Open Access Journals (Sweden)

    Joel E Bialosky

    2008-11-01

    Joel E Bialosky1*, Adam T Hirsh2,3, Michael E Robinson2,3, Steven Z George1,3*; 1Department of Physical Therapy; 2Department of Clinical and Health Psychology; 3Center for Pain Research and Behavioral Health, University of Florida, Gainesville, Florida, USA. Abstract: Pain catastrophizing is associated with the pain experience; however, causation has not been established. Studies which specifically manipulate catastrophizing are necessary to establish causation. The present study enrolled 100 healthy individuals. Participants were randomly assigned to repeat a positive, neutral, or one of three catastrophizing statements during a cold pressor task (CPT). Outcome measures of pain tolerance and pain intensity were recorded. No change was noted in catastrophizing immediately following the CPT (F(1,84) = 0.10, p = 0.75, partial η² < 0.01) independent of group assignment (F(4,84) = 0.78, p = 0.54, partial η² = 0.04). Pain tolerance (F(4) = 0.67, p = 0.62, partial η² = 0.03) and pain intensity (F(4) = 0.73, p = 0.58, partial η² = 0.03) did not differ by group. This study suggests catastrophizing may be difficult to manipulate through experimental pain procedures, and repetition of specific catastrophizing statements was not sufficient to change levels of catastrophizing. Additionally, pain tolerance and pain intensity did not differ by group assignment. This study has implications for future studies attempting to experimentally manipulate pain catastrophizing. Keywords: pain, catastrophizing, experimental, cold pressor task, pain catastrophizing scale

  1. Valuing Catastrophe Bonds Involving Credit Risks

    Directory of Open Access Journals (Sweden)

    Jian Liu

    2014-01-01

    Catastrophe bonds are the most important products in the catastrophe risk securitization market. Because of their operating mechanism, CAT bonds may carry credit risk, so in this paper we consider the influence of credit risk on CAT bond pricing, which distinguishes this work from the rest of the literature. We employ the Jarrow and Turnbull method to model the credit risk and derive a general pricing formula using Extreme Value Theory. Furthermore, we present an empirical pricing study of the Property Claim Services data, where the parameters of the loss distribution are estimated by the MLE method and the default probabilities are deduced from US financial market data. We then obtain the catastrophe bond value by the Monte Carlo method.
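
    The record names the ingredients (compound catastrophe losses, Jarrow-Turnbull-style default, Monte Carlo valuation) without giving the formulas; the toy sketch below prices a stylized CAT bond under assumed parameters - Poisson event arrivals, heavy-tailed severities, principal lost when aggregate losses breach a trigger, and an independent issuer default:

        import numpy as np

        rng = np.random.default_rng(1)
        n_paths, T = 100_000, 3.0        # assumed 3-year bond
        lam, trigger = 0.8, 40.0         # assumed event rate (per year) and loss trigger
        r, coupon, face = 0.03, 0.06, 1.0
        p_default = 0.02                 # assumed annual issuer default probability

        payoffs = np.empty(n_paths)
        for i in range(n_paths):
            n_events = rng.poisson(lam * T)
            losses = 5.0 * rng.pareto(2.5, n_events)          # heavy-tailed severities
            principal = 0.0 if losses.sum() > trigger else face
            survives = rng.random() < (1.0 - p_default) ** T  # simplistic default model
            payoffs[i] = (coupon * T * face + principal) * survives
        print(f"CAT bond price: {np.exp(-r * T) * payoffs.mean():.4f}")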

  2. Does catastrophic thinking enhance oesophageal pain sensitivity?

    DEFF Research Database (Denmark)

    Martel, M O; Olesen, A E; Jørgensen, D

    2016-01-01

    that catastrophic thinking exerts an influence on oesophageal pain sensitivity, but not necessarily on the magnitude of acid-induced oesophageal sensitization. WHAT DOES THIS STUDY ADD?: Catastrophizing is associated with heightened pain sensitivity in the oesophagus. This was substantiated by assessing responses...

  3. Assessing the impact of model and climate uncertainty in malaria simulations for the Kenyan Highlands.

    Science.gov (United States)

    Tompkins, A. M.; Thomson, M. C.

    2017-12-01

    Simulations of the impact of climate variations on a vector-borne disease such as malaria are subject to a number of sources of uncertainty. These include the model structure and parameter settings in addition to errors in the climate data and the neglect of their spatial heterogeneity, especially over complex terrain. We use a constrained genetic algorithm to confront these two sources of uncertainty for malaria transmission in the highlands of Kenya. The technique calibrates the parameter settings of a process-based, mathematical model of malaria transmission to vary within their assessed level of uncertainty and also allows the calibration of the driving climate data. The simulations show that in highland settings close to the threshold for sustained transmission, the uncertainty in climate is more important to address than the malaria model uncertainty. Applications of the coupled climate-malaria modelling system are briefly presented.

  4. Attributing uncertainty in streamflow simulations due to variable inputs via the Quantile Flow Deviation metric

    Science.gov (United States)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2018-06-01

    Every model used to characterise a real-world process is affected by uncertainty, and selecting a suitable model is a vital aspect of engineering planning and design. Observation and input errors make the prediction of modelled responses more uncertain. Using a recently developed attribution metric, this study develops a method for analysing variability in model inputs together with model structure variability, to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments are used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
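
    The record does not reproduce the QFD formula itself; as a hedged illustration of the underlying idea - comparing ensembles of simulated flows quantile by quantile - the following sketch measures the ensemble spread at each flow quantile, where ensemble members would differ in rainfall input or model structure:

        import numpy as np

        def quantile_flow_spread(flows, qs=np.linspace(0.05, 0.95, 19)):
            """Per-quantile spread of an ensemble of simulated streamflows.

            flows: (n_members, n_timesteps); members differ in input data or
            model structure. A simplified stand-in for the QFD attribution."""
            fq = np.quantile(flows, qs, axis=1)           # (n_q, n_members)
            return qs, fq.max(axis=1) - fq.min(axis=1)

        rng = np.random.default_rng(2)
        members = np.exp(rng.normal(0.0, 1.0, (3, 365)))  # three input scenarios
        qs, spread = quantile_flow_spread(members)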

  5. CATASTROPHIC EVENTS MODELING

    Directory of Open Access Journals (Sweden)

    Ciumas Cristina

    2013-07-01

    This paper presents the emergence and evolution of catastrophe models (cat models). Starting from the present context of extreme weather events and the features of catastrophic risk (cat risk), we make a chronological illustration, from a theoretical point of view, of the main steps taken in building such models; in this way the importance of interdisciplinarity can be observed. The first cat model considered contains three modules. For each of the identified modules - hazard, vulnerability and financial losses - a detailed overview is provided, along with an exemplification of a potential earthquake measuring more than 7 on the Richter scale occurring nowadays in Bucharest. The key areas exposed to earthquakes in Romania are identified. Then, based on past catastrophe data and taking into account the present conditions of the housing stock, insurance coverage and the population of Bucharest, the impact is quantified by determining potential losses. To accomplish this, we consider a scenario with data representing average values for the dwelling's surface, location and finishing works. At each step we refer to the earthquake of March 4, 1977 to see what would happen today if a similar event occurred. The value of the Bucharest housing stock is determined taking first the market value, then the replacement value and ultimately the real value, in order to quantify potential damages. Through this approach we can find the insurance coverage of potential losses and also the uncovered gap. A solution that may be taken into account by public authorities, for example by Bucharest City Hall, is offered: should such an event occur, the impossibility of paying compensations to insured people, rebuilding infrastructure and public buildings, and helping the suffering persons should be avoided. An active public-private partnership should be created between government authorities, the Natural Disaster Insurance Pool, private
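
    The hazard-vulnerability-loss pipeline described above can be made concrete with a deliberately toy sketch; every function and number below is invented for illustration and stands in for the calibrated modules of a real cat model:

        def hazard_intensity(magnitude, distance_km):
            """Assumed toy attenuation: intensity decays with distance."""
            return max(magnitude - 1.5 * (distance_km / 10.0), 0.0)

        def mean_damage_ratio(intensity):
            """Assumed toy vulnerability curve: intensity -> damage ratio."""
            return min(max((intensity - 4.0) / 6.0, 0.0), 1.0)

        def financial_loss(exposure_value, intensity, insured_share=0.2):
            """Financial module: ground-up loss and its insured part."""
            ground_up = exposure_value * mean_damage_ratio(intensity)
            return ground_up, ground_up * insured_share

        # M7 event 5 km from a housing stock worth 30 bn (illustrative only)
        gu, ins = financial_loss(30e9, hazard_intensity(7.0, 5.0))
        print(f"ground-up {gu/1e9:.1f} bn, insured {ins/1e9:.1f} bn, "
              f"uncovered gap {(gu - ins)/1e9:.1f} bn")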

  6. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
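
    A minimal sketch of the kind of ensemble Kalman analysis step such a framework iterates, written for a generic parameter vector and observation map rather than the paper's Reynolds-stress parameterization:

        import numpy as np

        def enkf_update(X, y_obs, h, R, seed=3):
            """One ensemble Kalman analysis step (X: n_params x n_members)."""
            rng = np.random.default_rng(seed)
            Y = np.column_stack([h(x) for x in X.T])   # predicted observations
            Xc = X - X.mean(axis=1, keepdims=True)
            Yc = Y - Y.mean(axis=1, keepdims=True)
            n = X.shape[1]
            K = (Xc @ Yc.T / (n - 1)) @ np.linalg.inv(Yc @ Yc.T / (n - 1) + R)
            y_pert = y_obs[:, None] + rng.multivariate_normal(
                np.zeros(len(y_obs)), R, size=n).T      # perturbed observations
            return X + K @ (y_pert - Y)                 # posterior ensemble

        # e.g. calibrate 2 parameters of a toy map against 3 noisy observations
        h = lambda x: np.array([x[0], x[0] + x[1], x[1] ** 2])
        X = np.random.default_rng(0).normal(0.0, 1.0, (2, 50))
        X_post = enkf_update(X, np.array([1.0, 2.0, 1.0]), h, 0.01 * np.eye(3))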

  7. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach

  8. Kalman filter application to mitigate the errors in the trajectory simulations due to the lunar gravitational model uncertainty

    International Nuclear Information System (INIS)

    Gonçalves, L D; Rocco, E M; De Moraes, R V; Kuga, H K

    2015-01-01

    This paper simulates part of the orbital trajectory of the Lunar Prospector mission to analyze the relevance of using a Kalman filter to estimate the trajectory. The study considers the disturbance due to the lunar gravitational potential using one of the most recent models, the LP100K model, which is based on spherical harmonics and considers degree and order up to the value 100. In order to simplify the expression of the gravitational potential and, consequently, to reduce the computational effort required in the simulation, lower values of degree and order are used in some cases. An analysis is then made of the error introduced into the simulations when such values of degree and order are used to propagate the spacecraft trajectory and control. This analysis uses the standard deviation that characterizes the uncertainty for each of the values of degree and order used in the LP100K model for the satellite orbit. With the uncertainty of the adopted gravity model known, lunar orbital trajectory simulations may be carried out considering these values of uncertainty. Furthermore, a Kalman filter is also used, which takes into account the sensor uncertainty that defines the satellite position at each step of the simulation and the uncertainty of the model, by means of the characteristic variance of the truncated gravity model. This procedure thus represents an effort to bring the results obtained using lower values of degree and order of the spherical harmonics closer to the results that would be attained if the maximum accuracy of the LP100K model were adopted. A comparison is also made between the error in the satellite position when the Kalman filter is used and when it is not. The data for the comparison were obtained from the standard deviation in the velocity increment of the space vehicle. (paper)
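
    A one-dimensional toy sketch of the filtering idea described here - the variance of the truncated gravity model enters as process noise Q while the sensor variance enters as measurement noise R (the real application is the full orbital state, not a scalar):

        import numpy as np

        def kalman_1d(z_measurements, q_model, r_sensor, x0=0.0, p0=1.0):
            """Scalar Kalman filter: q_model plays the role of the truncated
            gravity model's variance, r_sensor the position-sensor variance."""
            x, p, estimates = x0, p0, []
            for z in z_measurements:
                p = p + q_model            # predict: model uncertainty inflates P
                k = p / (p + r_sensor)     # Kalman gain
                x = x + k * (z - x)        # update with the sensor reading
                p = (1 - k) * p
                estimates.append(x)
            return np.array(estimates)

        rng = np.random.default_rng(4)
        truth = np.linspace(0.0, 1.0, 50)
        est = kalman_1d(truth + rng.normal(0, 0.1, 50), q_model=1e-3, r_sensor=0.01)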

  9. An overview of uncertainty quantification techniques with application to oceanic and oil-spill simulations

    KAUST Repository

    Iskandarani, Mohamed; Wang, Shitao; Srinivasan, Ashwanth; Carlisle Thacker, W.; Winokur, Justin; Knio, Omar

    2016-01-01

    We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
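
    A minimal sketch of the regression route to a polynomial chaos proxy, for a scalar output of a noisy black-box model with one uniform uncertain input (Legendre basis); the `model` below is a stand-in, not the integral plume model:

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(5)
        model = lambda xi: np.exp(0.7 * xi) + 0.05 * rng.normal(size=np.shape(xi))

        # Fit PC coefficients by least squares on a small training ensemble
        xi_train = rng.uniform(-1.0, 1.0, 40)
        V = legendre.legvander(xi_train, deg=4)         # Legendre design matrix
        coeffs, *_ = np.linalg.lstsq(V, model(xi_train), rcond=None)

        # Explore the input space cheaply through the proxy, not the model
        xi_big = rng.uniform(-1.0, 1.0, 50_000)
        proxy = legendre.legvander(xi_big, deg=4) @ coeffs
        print(proxy.mean(), proxy.std())                # proxy-based statistics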

  10. An overview of uncertainty quantification techniques with application to oceanic and oil-spill simulations

    KAUST Repository

    Iskandarani, Mohamed

    2016-04-22

    We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.

  11. Simulation of corn yields and parameters uncertainties analysis in Hebei and Sichuang, China

    Science.gov (United States)

    Fu, A.; Xue, Y.; Hartman, M. D.; Chandran, A.; Qiu, B.; Liu, Y.

    2016-12-01

    Corn is one of the most important agricultural products in China. Research on the impacts of climate change and human activities on corn yields is important for understanding and mitigating the negative effects of environmental factors on corn yields and maintaining stable corn production. Using climatic data, including daily temperature, precipitation, and solar radiation from 1948 to 2010, soil properties, observed corn yields, and farmland management information, corn yields in Sichuang and Hebei Provinces of China over the past 63 years were simulated using the Daycent model, and the results were evaluated using root mean square errors, bias, simulation efficiency, and standard deviation. The primary climatic factors influencing corn yields were examined, the uncertainties of climatic factors were analyzed, and the uncertainties of human activity parameters were also studied by changing fertilization levels and cultivation practices. The results showed that: (1) The Daycent model is capable of simulating corn yields in Sichuang and Hebei provinces of China; observed and simulated corn yields show a similar increasing trend with time. (2) The minimum daily temperature is the primary factor influencing corn yields in Sichuang; in Hebei Province, daily temperature, precipitation and wind speed significantly affect corn yields. (3) When the global warming trend was removed from the original data, simulated corn yields were lower than before, decreasing by about 687 kg/hm2 from 1992 to 2010. When the fertilization level and the cultivation practice were changed by 50% and 75%, respectively, in the Schedule file of the Daycent model, the simulated corn yields increased by 1206 kg/hm2 and 776 kg/hm2, respectively, with the enhancement of the fertilization level and the improvement of the cultivation practice. This study provides a scientific basis for selecting a suitable fertilization level and cultivation practice in corn fields in China.

  12. Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations

    International Nuclear Information System (INIS)

    Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.

    2017-01-01

    Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of the LFR. • For a 10% change in the core inlet flow, the response in thermal power is a 0.58% change. • For a 2.5% change in the inlet lead temperature, the response in power is 1.87%. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations of sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variations. The Advanced Lead Fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters due to effects of the mass flow and temperature of the liquid lead. The ALFRED core mathematical model developed in this work is fully transient and takes into account the heat transfer in an annular fuel pellet design, the thermo-fluid dynamics in the core, and the neutronic processes, which are modeled with point kinetics including fuel temperature and expansion feedback effects. The sensitivity evaluated in terms of the relative standard deviation (RSD) showed that for a 10% change in the core inlet flow, the response in thermal power is a 0.58% change, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with the mass flow rate as the predictor variable showed statistically valid cubic correlations for the neutron flux, and a linear relationship for the neutron flux as a function of the lead temperature. No statistically valid correlation was observed for the reactivity as a function of the mass flow rate or the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.

  13. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  14. From a Catastrophe Itself to Cata/strophic Reading. The Poetry of Charles Baudelaire in the Account of Jorge Semprún's L’écriture ou la vie

    Directory of Open Access Journals (Sweden)

    Judith Kasper

    2015-01-01

    This essay addresses the unstable meaning of the term catastrophe over the course of history. The first part takes leave of the “tiny fissures” in the continuous catastrophe noted by Walter Benjamin to develop a philology of the cata/strophe. This philology does not only register a given meaning (for instance, of the catastrophe), but intervenes actively as disruption. It insists on the strophe in the catastrophe, transforming catastrophe into cata/strophe, which, in fatal situations, permits the poetic potential to become a dynamic force that can, at least on the linguistic level, open toward other dimensions without denying the catastrophe itself. The second part is dedicated to a reading of Jorge Semprún’s autobiographical novel L’écriture ou la vie from the perspective of this philological concept. It seeks to show how Semprún’s citing and reciting of Baudelaire’s strophes in the putrid atmosphere of the Buchenwald concentration camp literally produce, on the level of the signifiers, fresh air to breathe.

  15. Numerical solution of dynamic equilibrium models under Poisson uncertainty

    DEFF Research Database (Denmark)

    Posch, Olaf; Trimborn, Timo

    2013-01-01

    We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations of the retar...... solution to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households....

  16. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    Science.gov (United States)

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption, and extended the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
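
    The propagation idea can be sketched generically: jitter both the input settings and the measured responses within assumed error distributions, refit the DOE regression each time, and read the coefficient uncertainty off the resulting sample (the linear model and error magnitudes below are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(6)
        x_nominal = np.linspace(0.5, 2.0, 8)            # nominal input settings
        y_nominal = 3.0 + 1.5 * x_nominal               # nominal linear DOE model

        coef_samples = []
        for _ in range(5000):
            x = x_nominal + rng.normal(0, 0.02, x_nominal.shape)  # input variation
            y = y_nominal + rng.normal(0, 0.10, y_nominal.shape)  # response noise
            A = np.column_stack([np.ones_like(x), x])
            coef_samples.append(np.linalg.lstsq(A, y, rcond=None)[0])

        sd = np.std(coef_samples, axis=0)
        print(f"intercept sd {sd[0]:.3f}, slope sd {sd[1]:.3f}")  # coefficient spread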

  17. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the ‘Uncertainty and Sensitivity’ platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were identified by means of a sensitivity analysis.

  18. Evaluating Uncertainty of Runoff Simulation using SWAT model of the Feilaixia Watershed in China Based on the GLUE Method

    Science.gov (United States)

    Chen, X.; Huang, G.

    2017-12-01

    In recent years, distributed hydrological models have been widely used in storm water management, water resources protection and related fields, so how to evaluate model uncertainty reasonably and efficiently has become a topic of great interest. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for the study area of China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Taking the initial parameter range of the GLUE method as the core of the investigation, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results show that for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is less than that with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In the calibration and the validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. Therefore, it can be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is an effective way to capture and evaluate the simulation uncertainty.
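
    A compact GLUE-style sketch of the sampling loop at issue: draw parameters uniformly from the chosen initial range, retain 'behavioural' sets whose Nash-Sutcliffe efficiency clears a threshold, and form prediction bounds from the retained ensemble (a toy model stands in for SWAT):

        import numpy as np

        def nse(sim, obs):
            """Nash-Sutcliffe efficiency used as the informal likelihood."""
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        rng = np.random.default_rng(7)
        t = np.arange(100.0)
        obs = 5.0 + 3.0 * np.sin(t / 8.0) + rng.normal(0.0, 0.3, t.size)
        model = lambda a, b: a + b * np.sin(t / 8.0)    # toy stand-in for SWAT

        kept = []
        for _ in range(10_000):
            a, b = rng.uniform(0.0, 10.0), rng.uniform(0.0, 6.0)  # initial range
            sim = model(a, b)
            if nse(sim, obs) > 0.5:                     # behavioural threshold
                kept.append(sim)

        sims = np.array(kept)
        # Full GLUE weights quantiles by likelihood; plain quantiles keep it short
        lower, upper = np.quantile(sims, [0.05, 0.95], axis=0)
        print(f"{len(kept)} behavioural sets; mean 90% band width "
              f"{np.mean(upper - lower):.2f}")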

  19. The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2013-09-01

    Aerosol–cloud interaction effects are a major source of uncertainty in climate models, so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN). Twenty-eight model parameters covering essentially all important aerosol processes, emissions and the representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally between about ±30% over some marine regions and ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud-processing. The results lead to several recommendations for research that would result in improved modelling of cloud-active aerosol on a global scale.
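
    Setting aside the expert elicitation and per-grid-cell emulators, the variance-based machinery can be illustrated with a plain Saltelli-style estimator of first-order sensitivity indices on a toy three-parameter function:

        import numpy as np

        def first_order_sobol(model, n, d, seed=8):
            """Saltelli-style estimate of first-order indices S_i for d inputs."""
            rng = np.random.default_rng(seed)
            A, B = rng.random((n, d)), rng.random((n, d))
            yA, yB = model(A), model(B)
            var = np.var(np.concatenate([yA, yB]))
            S = []
            for i in range(d):
                ABi = A.copy()
                ABi[:, i] = B[:, i]                     # resample only input i
                S.append(np.mean(yB * (model(ABi) - yA)) / var)
            return np.array(S)

        toy = lambda X: np.sin(X[:, 0]) + 5.0 * X[:, 1] ** 2 + 0.1 * X[:, 2]
        print(first_order_sobol(toy, 20_000, 3))        # per-input variance shares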

  20. Estimation of balance uncertainty using Direct Monte Carlo Simulation (DSMC) on a CPU-GPU architecture

    CSIR Research Space (South Africa)

    Bidgood, Peter M

    2017-01-01

    The estimation of balance uncertainty using conventional statistical and error propagation methods has been found to be both approximate and laborious to the point of being untenable. Direct Simulation by Monte Carlo (DSMC) has been shown...

  1. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  2. Catastrophic primary antiphospholipid syndrome

    International Nuclear Information System (INIS)

    Kim, Dong Hun; Byun, Joo Nam; Ryu, Sang Wan

    2006-01-01

    Catastrophic antiphospholipid syndrome (CAPLS) was diagnosed in a 64-year-old male who was admitted to our hospital with dyspnea. The clinical and radiological examinations showed pulmonary thromboembolism, and so thromboembolectomy was performed. Abdominal distension rapidly developed several days later, and abdominal computed tomography (CT) revealed thrombus within the superior mesenteric artery with small bowel and gall bladder distension. Cholecystectomy and jejunoileostomy were performed, and gall bladder necrosis and small bowel infarction were confirmed. The anticardiolipin antibody was positive. Anticoagulant agents and steroids were administered, but the patient expired 4 weeks after surgery due to acute respiratory distress syndrome (ARDS). We report here on a case of catastrophic APLS with manifestations of pulmonary thromboembolism, rapidly progressing gall bladder necrosis and bowel infarction.

  3. Catastrophic primary antiphospholipid syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong Hun; Byun, Joo Nam [Chosun University Hospital, Gwangju (Korea, Republic of); Ryu, Sang Wan [Miraero21 Medical Center, Gwangju (Korea, Republic of)

    2006-09-15

    Catastrophic antiphospholipid syndrome (CAPLS) was diagnosed in a 64-year-old male who was admitted to our hospital with dyspnea. The clinical and radiological examinations showed pulmonary thromboembolism, and so thromboembolectomy was performed. Abdominal distension rapidly developed several days later, and abdominal computed tomography (CT) revealed thrombus within the superior mesenteric artery with small bowel and gall bladder distension. Cholecystectomy and jejunoileostomy were performed, and gall bladder necrosis and small bowel infarction were confirmed. The anticardiolipin antibody was positive. Anticoagulant agents and steroids were administered, but the patient expired 4 weeks after surgery due to acute respiratory distress syndrome (ARDS). We report here on a case of catastrophic APLS with manifestations of pulmonary thromboembolism, rapidly progressing gall bladder necrosis and bowel infarction.

  4. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models need to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for

  5. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvements in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
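
    The control-variate idea behind these multi-level/multi-fidelity estimators fits in a few lines: many cheap low-fidelity samples set the baseline, and a few coupled high-fidelity evaluations correct its bias (toy functions stand in for the flow solvers):

        import numpy as np

        rng = np.random.default_rng(9)
        hf = lambda u: np.exp(u)              # 'high-fidelity' quantity of interest
        lf = lambda u: 1 + u + 0.5 * u ** 2   # cheap, correlated low-fidelity model

        u_many = rng.normal(0.0, 1.0, 200_000)  # plentiful cheap samples
        u_few = rng.normal(0.0, 1.0, 200)       # scarce expensive samples (coupled)

        estimate = lf(u_many).mean() + (hf(u_few) - lf(u_few)).mean()
        print(estimate, np.exp(0.5))            # compare with the exact E[exp(U)]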

  6. Vaginismus: heightened harm avoidance and pain catastrophizing cognitions.

    Science.gov (United States)

    Borg, Charmaine; Peters, Madelon L; Schultz, Willibrord Weijmar; de Jong, Peter J

    2012-02-01

    Catastrophic appraisal of experienced pain may promote hypervigilance and intense pain, while the personality trait of harm avoidance (HA) might prevent such experiences from being corrected. Women afflicted with vaginismus may enter a self-perpetuating downward spiral of increasing avoidance of (anticipated) pain. In vaginismus the anticipation of pain may give rise to catastrophic pain ideation. This may establish hypervigilance toward painful sexual stimuli, which consequently results in negative appraisal of sexual cues. This process could impair genital and sexual responding, intensify pain and trigger avoidance, which in turn may contribute to the onset and persistence of symptoms in vaginismus and, to a certain extent, also in dyspareunia. The aim was to investigate whether women suffering from vaginismus are characterized by heightened levels of habitual catastrophic pain cognitions, together with higher levels of HA. This study consisted of three groups: a lifelong vaginismus group (N = 35, mean age = 28.4; standard deviation [SD] = 5.8), a dyspareunia group (N = 33, mean age = 26.7; SD = 6.8), and women without sexual complaints (N = 54, mean age = 26.5; SD = 6.7). Measures were the HA scale of Cloninger's tridimensional personality questionnaire and the pain catastrophizing scale. Specifically, women afflicted with vaginismus showed significantly heightened levels of catastrophic pain cognitions compared with the other two groups, as well as significantly enhanced HA vs. the control group, and a trend vs. the dyspareunia group. Both traits were shown to have cumulative predictive validity for the presence of vaginismus. This study focused on the personality trait of HA and on catastrophizing pain cognitions in women with lifelong vaginismus. Our findings showed that women suffering from vaginismus are indeed characterized by the trait of HA interwoven with habitual pain catastrophizing cognitions. This study could help in the refinement of the current conceptualization and might shed

  7. Socioeconomic inequality in catastrophic health expenditure in Brazil.

    Science.gov (United States)

    Boing, Alexandra Crispim; Bertoldi, Andréa Dâmaso; Barros, Aluísio Jardim Dornellas de; Posenato, Leila Garcia; Peres, Karen Glazer

    2014-08-01

    To analyze the evolution of catastrophic health expenditure and the inequalities in such expenses, according to the socioeconomic characteristics of Brazilian families. Data from the National Household Budget 2002-2003 (48,470 households) and 2008-2009 (55,970 households) were analyzed. Catastrophic health expenditure was defined as excess expenditure, considering different methods of calculation: 10.0% and 20.0% of total consumption and 40.0% of the family's capacity to pay. The National Economic Indicator and schooling were considered as socioeconomic characteristics. The inequality measures used were the relative difference between rates, the ratio of rates, and the concentration index. Catastrophic health expenditure varied between 0.7% and 21.0%, depending on the calculation method. The lowest prevalences were noted in relation to the capacity to pay, and the highest in relation to total consumption. The prevalence of catastrophic health expenditure increased by 25.0% from 2002-2003 to 2008-2009 when the cut-off point of 20.0% of total consumption was considered, and by 100% when 40.0% or more of the capacity to pay was applied as the cut-off point. Socioeconomic inequalities in catastrophic health expenditure in Brazil between 2002-2003 and 2008-2009 increased significantly, becoming 5.20 times higher among the poorest and 4.17 times higher among the least educated. There was an increase in catastrophic health expenditure among Brazilian families, principally among the poorest and those headed by the least-educated individuals, contributing to an increase in social inequality.
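
    The three definitions reduce to a simple per-household rule; a sketch with invented household numbers:

        def catastrophic(health_exp, total_consumption, capacity_to_pay,
                         method="40pct_ctp"):
            """Flag catastrophic health expenditure under the three cut-offs used."""
            if method == "10pct_total":
                return health_exp > 0.10 * total_consumption
            if method == "20pct_total":
                return health_exp > 0.20 * total_consumption
            if method == "40pct_ctp":   # share of capacity to pay
                return health_exp > 0.40 * capacity_to_pay
            raise ValueError(method)

        # Illustrative household: 300 on health out of 2000 total, 900 above subsistence
        print(catastrophic(300, 2000, 900, "10pct_total"),   # True  (15% of total)
              catastrophic(300, 2000, 900, "20pct_total"),   # False (15% < 20%)
              catastrophic(300, 2000, 900, "40pct_ctp"))     # False (33% < 40%)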

  8. Understanding catastrophizing from a misdirected problem-solving perspective.

    Science.gov (United States)

    Flink, Ida K; Boersma, Katja; MacDonald, Shane; Linton, Steven J

    2012-05-01

    The aim is to explore pain catastrophizing from a problem-solving perspective. The links between catastrophizing, problem framing, and problem-solving behaviour are examined through two possible models of mediation as inferred by two contemporary and complementary theoretical models, the misdirected problem solving model (Eccleston & Crombez, 2007) and the fear-anxiety-avoidance model (Asmundson, Norton, & Vlaeyen, 2004). In this prospective study, a general population sample (n = 173) with perceived problems with spinal pain filled out questionnaires twice; catastrophizing and problem framing were assessed on the first occasion and health care seeking (as a proxy for medically oriented problem solving) was assessed 7 months later. Two different approaches were used to explore whether the data supported any of the proposed models of mediation. First, multiple regressions were used according to traditional recommendations for mediation analyses. Second, a bootstrapping method (n = 1000 bootstrap resamples) was used to explore the significance of the indirect effects in both possible models of mediation. The results verified the concepts included in the misdirected problem solving model. However, the direction of the relations was more in line with the fear-anxiety-avoidance model. More specifically, the mediation analyses provided support for viewing catastrophizing as a mediator of the relation between biomedical problem framing and medically oriented problem-solving behaviour. These findings provide support for viewing catastrophizing from a problem-solving perspective and imply a need to examine and address problem framing and catastrophizing in back pain patients. © 2011 The British Psychological Society.

  9. Nuclear war and other catastrophes. Civil and catastrophe protection in the Federal Republic of Germany and the United Kingdom after 1945

    International Nuclear Information System (INIS)

    Diebel, Martin

    2017-01-01

    The book 'Civil and catastrophe protection in the Federal Republic of Germany and the United Kingdom after 1945' discusses the following issues: aerial defense and the atomic bomb (1945-1968); crises and catastrophes in the shadow of the bomb (1962-1978); civil defense and the comeback of (nuclear) war (1976-1979); civil defense and the second ''Cold War'' (1979-1986); Chernobyl and the end of the Cold War (1979-1990); and war, catastrophe and safety in the 20th century - a conclusion.

  10. muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations

    KAUST Repository

    Rosen, Paul; Burton, Brett; Potter, Kristin; Johnson, Chris R.

    2016-01-01

    In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthening our collaboration with the biomedical engineers attempting to understand the phenomenon.

  11. muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations

    KAUST Repository

    Rosen, Paul

    2016-05-23

    In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthening our collaboration with the biomedical engineers attempting to understand the phenomenon.

  12. Clinical Pain Catastrophizing in Women With Migraine and Obesity.

    Science.gov (United States)

    Bond, Dale S; Buse, Dawn C; Lipton, Richard B; Thomas, J Graham; Rathier, Lucille; Roth, Julie; Pavlovic, Jelena M; Evans, E Whitney; Wing, Rena R

    2015-01-01

    Obesity is related to migraine. Maladaptive pain coping strategies (eg, pain catastrophizing) may provide insight into this relationship. In women with migraine and obesity, we cross-sectionally assessed: (1) prevalence of clinical catastrophizing; (2) characteristics of those with and without clinical catastrophizing; and (3) associations of catastrophizing with headache features. Obese women migraineurs seeking weight loss treatment (n = 105) recorded daily migraine activity for 1 month via smartphone and completed the Pain Catastrophizing Scale (PCS). Clinical catastrophizing was defined as total PCS score ≥30. The six-item Headache Impact Test (HIT-6), 12-item Allodynia Symptom Checklist (ASC-12), Headache Management Self-Efficacy Scale (HMSE), and assessments for depression (Centers for Epidemiologic Studies Depression Scale) and anxiety (seven-item Generalized Anxiety Disorder Scale) were also administered. Using PCS scores and body mass index (BMI) as predictors in linear regression, we modeled a series of headache features (ie, headache days, HIT-6, etc) as outcomes. One quarter (25.7%; 95% confidence interval [CI] = 17.2-34.1%) of participants met criteria for clinical catastrophizing: they had higher BMI (37.9 ± 7.5 vs 34.4 ± 5.7 kg/m², P = .035); longer migraine attack duration (160.8 ± 145.0 vs 97.5 ± 75.2 hours/month, P = .038); and higher HIT-6 scores (68.7 ± 4.6 vs 64.5 ± 3.9). Higher PCS scores were associated with longer attack duration (β = 0.390), higher pain sensitivity, greater headache impact, and lower headache management self-efficacy. In all participants, PCS scores were related to several migraine characteristics, above and beyond the effects of obesity. Prospective studies are needed to determine sequence and mechanisms of relationships between catastrophizing, obesity, and migraine. © 2015 American Headache Society.

  13. Catastrophic events and older adults.

    Science.gov (United States)

    Cloyd, Elizabeth; Dyer, Carmel B

    2010-12-01

    The plight of older adults during catastrophic events is a societal concern. Older persons have an increased prevalence of cognitive disorders, chronic illnesses, and mobility problems that limit their ability to cope. These disorders may result in a lack of mental capacity and the ability to discern when they should evacuate or resolve problems encountered during a catastrophe. Some older persons may have limited transportation options, and many of the elderly survivors are at increased risk for abuse, neglect, and exploitation. Recommendations for future catastrophic events include the development of a federal tracking system for elders and other vulnerable adults, the designation of separate shelter areas for elders and other vulnerable adults, and involvement of gerontological professionals in all aspects of emergency preparedness and care delivery, including training of frontline workers. Preparation through preevent planning that includes region-specific social services, medical and public health resources, volunteers, and facilities for elders and vulnerable adults is critical. Elders need to be protected from abuse and fraud during catastrophic events. A public health triage system for elders and other vulnerable populations in pre- and postdisaster situations is useful, and disaster preparedness is paramount. Communities and members of safety and rescue teams must address ethical issues before an event. When older adults are involved, consideration needs to be given to triage decision making, transporting those who are immobile, the care of older adults who receive palliative care, and the equitable distribution of resources. Nurses are perfectly equipped with the skills, knowledge, and training needed to plan and implement disaster preparedness programs. In keeping with the tradition of Florence Nightingale, nurses can assume several crucial roles in disaster preparedness for older adults. Nurses possess the ability to participate and lead community

  14. Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions

    Directory of Open Access Journals (Sweden)

    Jindrová Pavla

    2017-01-01

    Catastrophic events affect various regions of the world with increasing frequency and intensity, and the number of catastrophic events and the amount of economic losses vary across world regions. Part of these losses is covered by insurance, and catastrophe events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions, using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short-term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered to be relevant. In this article we apply these models to real data on the number of catastrophic events and the total economic and insured catastrophe losses in seven regions of the world over the period 2009-2015. The estimated credible premiums by world region indicate how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
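
    A minimal sketch of the classical Bühlmann estimator applied here: the credibility factor Z = n/(n + k) blends each region's own experience with the collective mean, with k estimated from within-region and between-region variability (toy losses, not the Sigma data):

        import numpy as np

        def buhlmann_premiums(losses):
            """losses: (n_regions, n_years) past catastrophe losses per region."""
            n_regions, n_years = losses.shape
            region_means = losses.mean(axis=1)
            overall_mean = region_means.mean()
            s2 = losses.var(axis=1, ddof=1).mean()              # process variance
            a = region_means.var(ddof=1) - s2 / n_years         # between-region variance
            a = max(a, 0.0)
            z = n_years / (n_years + s2 / a) if a > 0 else 0.0  # credibility factor
            return z * region_means + (1 - z) * overall_mean    # credibility premiums

        losses = np.array([[12.0, 30.0, 18.0],    # region A, 3 years of losses
                           [55.0, 80.0, 60.0],    # region B
                           [ 5.0,  9.0,  7.0]])   # region C
        print(buhlmann_premiums(losses))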

  15. Catastrophe Theory and Caustics

    DEFF Research Database (Denmark)

    Gravesen, Jens

    1983-01-01

    It is shown by elementary methods that in codimension two and under the assumption that light rays are straight lines, a caustic is the catastrophe set for a time function. The general case is also discussed....

  16. Calibration and validation of earthquake catastrophe models. Case study: Impact Forecasting Earthquake Model for Algeria

    Science.gov (United States)

    Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.

    2012-04-01

    Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry, in order to assure a model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and those which could happen in the future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) calibration of the seismic intensity attenuation model using macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers using past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to vulnerability indexes 10% to 40% larger than average European indexes for different building types. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by means of 'as-if' historical scenario simulations of three past earthquakes in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli events. The calculated return periods of the losses for client market portfolio align with the

  17. A non-linear and stochastic response surface method for Bayesian estimation of uncertainty in soil moisture simulation from a land surface model

    Directory of Open Access Journals (Sweden)

    F. Hossain

    2004-01-01

    This study presents a simple and efficient scheme for Bayesian estimation of uncertainty in soil moisture simulation by a Land Surface Model (LSM). The scheme is assessed within a Monte Carlo (MC) simulation framework based on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. A primary limitation of using the GLUE method is the prohibitive computational burden imposed by uniform random sampling of the model's parameter distributions. Sampling is improved in the proposed scheme by stochastic modeling of the parameters' response surface that recognizes the non-linear deterministic behavior between soil moisture and land surface parameters. Uncertainty in soil moisture simulation (model output) is approximated through a Hermite polynomial chaos expansion of normal random variables that represent the model's parameter (model input) uncertainty. The unknown coefficients of the polynomial are calculated using a limited number of model simulation runs. The calibrated polynomial is then used as a fast-running proxy to the slower-running LSM to predict the degree of representativeness of a randomly sampled model parameter set. An evaluation of the scheme's sampling efficiency is made through comparison with the fully random MC sampling (the norm for GLUE) and the nearest-neighborhood sampling technique. The scheme was able to reduce the computational burden of random MC sampling for GLUE by 10%-70%. The scheme was also found to be about 10% more efficient than the nearest-neighborhood sampling method in predicting a sampled parameter set's degree of representativeness. GLUE based on the proposed sampling scheme did not alter the essential features of the uncertainty structure in soil moisture simulation. The scheme can potentially make GLUE uncertainty estimation for any LSM more efficient, as it does not impose any additional structural or distributional assumptions.

  18. The catastrophic antiphospholipid syndrome in children.

    Science.gov (United States)

    Go, Ellen J L; O'Neil, Kathleen M

    2017-09-01

    To review the difficult syndrome of catastrophic antiphospholipid syndrome, emphasizing new developments in the diagnosis, pathogenesis and treatment. Few recent publications directly address pediatric catastrophic antiphospholipid syndrome (CAPS). Most articles are case reports or data from adult and pediatric registries. The major factors contributing to most pediatric catastrophic antiphospholipid syndrome include infection and the presence of antiphospholipid antibodies, but complement activation also is important in creating diffuse thrombosis in the microcirculation. Treatment of the acute emergency requires anticoagulation, suppression of the hyperinflammatory state and elimination of the triggering infection. Inhibition of complement activation appears to improve outcome in limited studies, and suppression of antiphospholipid antibody formation may be important in long-term management. CAPS, an antibody-mediated diffuse thrombotic disease of the microvasculature, is rare in childhood but has high mortality (33-50%). It requires prompt recognition and aggressive multimodality treatment, including anticoagulation, anti-inflammatory therapy and elimination of inciting infection and pathogenic autoantibodies.

  19. Orthogonality catastrophe and fractional exclusion statistics

    Science.gov (United States)

    Ares, Filiberto; Gupta, Kumar S.; de Queiroz, Amilcar R.

    2018-02-01

    We show that the N-particle Sutherland model with inverse-square and harmonic interactions exhibits orthogonality catastrophe. For a fixed value of the harmonic coupling, the overlap of the N-body ground state wave functions with two different values of the inverse-square interaction term goes to zero in the thermodynamic limit. When the two values of the inverse-square coupling differ by an infinitesimal amount, the wave function overlap shows an exponential suppression. This is qualitatively different from the usual power law suppression observed in Anderson's orthogonality catastrophe. We also obtain an analytic expression for the wave function overlaps for an arbitrary set of couplings, whose properties are analyzed numerically. The quasiparticles constituting the ground state wave functions of the Sutherland model are known to obey fractional exclusion statistics. Our analysis indicates that the orthogonality catastrophe may be valid in systems with more general kinds of statistics than just the fermionic type.

  20. Derivative-free optimization under uncertainty applied to costly simulators

    International Nuclear Information System (INIS)

    Pauwels, Benoit

    2016-01-01

    The modeling of complex phenomena encountered in industrial issues can lead to the study of numerical simulation codes. These simulators may require extensive execution time (from hours to days), involve uncertain parameters and even be intrinsically stochastic. Importantly, within the context of simulation-based optimization, the derivatives of the outputs with respect to the inputs may be nonexistent, inaccessible or too costly to approximate reasonably. This thesis is organized in four chapters. The first chapter discusses the state of the art in derivative-free optimization and uncertainty modeling. The next three chapters introduce three independent, although connected, contributions to the field of derivative-free optimization in the presence of uncertainty. The second chapter addresses the emulation of costly stochastic simulation codes, stochastic in the sense that simulations run with the same input parameters may lead to distinct outputs. Such was the matter of the CODESTOCH project carried out at the Summer mathematical research center on scientific computing and its applications (CEMRACS) during the summer of 2013, together with two Ph.D. students from Electricity of France (EDF) and the Atomic Energy and Alternative Energies Commission (CEA). We designed four methods to build emulators for functions whose values are probability density functions. These methods were tested on two toy functions and applied to industrial simulation codes concerned with three complex phenomena: the spatial distribution of molecules in a hydrocarbon system (IFPEN), the life cycle of large electric transformers (EDF) and the repercussions of a hypothetical accident in a nuclear plant (CEA). Emulation was a preliminary process towards optimization in the first two cases. In the third chapter we consider the influence of inaccurate objective function evaluations on direct search, a classical derivative-free optimization method. In real settings inaccuracy may never vanish

  1. Paraboles et catastrophes

    CERN Document Server

    Thom, René

    1983-01-01

    René Thom, a French mathematician and member of the Académie des Sciences, was awarded the Fields Medal, the equivalent of the Nobel Prize in mathematics, in 1958 for his intellectual creations, notably "catastrophe theory", a new way of looking at all the transformations that occur in an abrupt, unpredictable, dramatic manner. In these conversations, which range from mathematics to embryology, from linguistics to anthropology and history, René Thom sets out the broad outlines of catastrophe theory and reviews, with a spirit both critical and passionate, the great scientific themes of our time, from atomic physics to molecular biology, from scientific and technological "progress" to the complex connections between society and science. "This little book is an extraordinary achievement in popularization." (Jean Largeault)

  2. Catastrophe, Gender and Urban Experience, 1648–1920

    DEFF Research Database (Denmark)

    Employing a broad definition of catastrophe, this book examines how urban communities conceived, adapted to and were transformed by catastrophes. Competing views of gender figure in the telling and retelling of these tragedies, which are mediated by myth and memory. This is a nuanced account...

  3. Catastrophizing Interferes with Cognitive Modulation of Pain in Women with Fibromyalgia.

    Science.gov (United States)

    Ellingson, Laura D; Stegner, Aaron J; Schwabacher, Isaac J; Lindheimer, Jacob B; Cook, Dane B

    2018-02-21

    Pain modulation is a critical function of the nociceptive system that includes the ability to engage descending pain control systems to maintain a functional balance between facilitation and inhibition of incoming sensory stimuli. Dysfunctional pain modulation is associated with increased risk for chronic pain and is characteristic of fibromyalgia (FM). Catastrophizing is also common in FM. However, its influence on pain modulation is poorly understood. The aim was to determine the role of catastrophizing in central nervous system processing during pain modulation in FM by examining brain responses and pain sensitivity during an attention-distraction paradigm. Twenty FM patients and 18 healthy controls (CO) underwent functional magnetic resonance imaging while receiving pain stimuli, administered alone and during distracting cognitive tasks. Pain ratings were assessed after each stimulus. Catastrophizing was assessed with the Pain Catastrophizing Scale (PCS). The ability to modulate pain during distraction varied among FM patients and was associated with catastrophizing. This was demonstrated by significant positive relationships between PCS scores and pain ratings (P < 0.05); at the group level, pain modulation did not differ between FM and CO (P > 0.05). FM patients with higher levels of catastrophizing were less able to distract themselves from pain, indicative of catastrophizing-related impairments in pain modulation. These results suggest that the tendency to catastrophize interacts with attention-resource allocation and may represent a mechanism of chronic pain exacerbation and/or maintenance. Reducing catastrophizing may improve FM symptoms via improving central nervous system regulation of pain.

  4. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described, as are methods taking into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical ...

  5. Uncertainty and Sensitivity of Neutron Kinetic Parameters in the Dynamic Response of a PWR Rod Ejection Accident Coupled Simulation

    Directory of Open Access Journals (Sweden)

    C. Mesado

    2012-01-01

    In nuclear safety analysis, it is very important to be able to simulate the different transients that can occur in a nuclear power plant with very high accuracy. Although best estimate codes can simulate the transients and provide realistic system responses, the use of non-exact models, together with assumptions and estimations, is a source of uncertainties which must be properly evaluated. This paper describes a Rod Ejection Accident (REA) simulated using the coupled code RELAP5/PARCSv2.7 with a perturbation on the cross-sectional sets in order to determine the uncertainties in the macroscopic neutronic information. The procedure to perform the uncertainty and sensitivity (U&S) analysis is a sampling-based method which is easy to implement and allows different procedures for the sensitivity analyses despite its high computational time. The DAKOTA-Jaguar software package is the selected toolkit for the U&S analysis presented in this paper. The size of the sampling is determined by applying Wilks’ formula for double tolerance limits with 95% uncertainty coverage and 95% statistical confidence for the output variables. Each sample has a corresponding set of perturbations that modify the cross-sectional sets used by PARCS. Finally, the tolerance intervals of the output variables are obtained by the use of nonparametric statistical methods.
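
    The Wilks sample sizes used in such analyses are easy to reproduce. The sketch below searches for the smallest number of code runs n such that the sample extremes bound at least a fraction beta of the output population with confidence gamma; the familiar 95%/95% answers are 59 runs (one-sided) and 93 runs (double, i.e. two-sided, tolerance limits).

    # Smallest Monte Carlo sample size n satisfying Wilks' formula.
    def wilks_n(beta=0.95, gamma=0.95, two_sided=True):
        n = 2 if two_sided else 1
        while True:
            if two_sided:
                # P[sample min/max bracket >= beta of the population] >= gamma
                confidence = 1 - beta**n - n * (1 - beta) * beta ** (n - 1)
            else:
                # P[sample max exceeds the beta-quantile] >= gamma
                confidence = 1 - beta**n
            if confidence >= gamma:
                return n
            n += 1

    print(wilks_n(two_sided=False))  # 59 code runs for one-sided 95%/95%
    print(wilks_n(two_sided=True))   # 93 code runs for two-sided 95%/95%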

  6. Uncertainty in reactive transport geochemical modelling

    International Nuclear Information System (INIS)

    Oedegaard-Jensen, A.; Ekberg, C.

    2005-01-01

    Geochemical modelling is one way of predicting the transport of e.g. radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations on the mobility or transport of different species one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity and which minerals the rock is composed of. The problem with simulations on rocks is that the rock itself is not uniform, i.e. it has larger fractures in some areas and smaller ones in others, which can give different water flows. The rock composition can also differ between areas. In addition to this variance in the rock there are also problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock, and this perturbation will result in more or less correct values of the interesting parameters. The analytical methods used are also encumbered with uncertainties, which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations the effect of these uncertainties must be taken into account. As computers are getting faster and faster, the complexity of simulated systems increases, which also increases the uncertainty in the results from the simulations. In this paper we show how uncertainty in the different parameters affects the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)

  7. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, A.; Moore, J.C.; Cui, Xingquan; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D.M.; McGuire, A.D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-01-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly, and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future

  8. Pragmatic aspects of uncertainty propagation: A conceptual review

    KAUST Repository

    Thacker, W. Carlisle; Iskandarani, Mohamad; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar

    2015-01-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.

  9. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. due to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared to the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors of the order of 100. (authors)

  10. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make, including model and criteria selection, can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
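
    The decomposition itself is plain bookkeeping on a crossed design. In the illustrative sketch below, stochastic rainfall forcings are crossed with behavioural parameter sets for a stand-in model, and the total variance of simulated flow is split using the two-way ANOVA identity; with one run per cell, the interaction term is confounded with residual noise. None of the numbers correspond to the study's catchments.

    import numpy as np

    rng = np.random.default_rng(1)
    F, P = 20, 30
    forcings = rng.gamma(2.0, 5.0, size=F)    # stochastic rainfall volumes
    params = rng.uniform(0.2, 0.8, size=P)    # behavioural runoff coefficients

    # Crossed design: flow[i, j] = simulated flow for forcing i, parameter set j.
    flow = np.outer(forcings, params) + rng.normal(0.0, 0.1, size=(F, P))

    grand = flow.mean()
    ss_forcing = P * ((flow.mean(axis=1) - grand) ** 2).sum()
    ss_param = F * ((flow.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((flow - grand) ** 2).sum()
    ss_inter = ss_total - ss_forcing - ss_param   # interaction (+ residual)

    for name, ss in (("forcing", ss_forcing), ("parameters", ss_param),
                     ("interaction", ss_inter)):
        print(f"{name:12s} {100 * ss / ss_total:5.1f}% of total variance")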

  11. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored or discarded in model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
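
    The two-loop structure described above can be written down directly: draw the uncertain parameters once per replicate in the outer loop, and draw the temporal (environmental) variation every time step in the inner loop. The toy projection below uses hypothetical growth-rate values and a hypothetical quasi-extinction threshold, not the piping plover estimates.

    import numpy as np

    rng = np.random.default_rng(42)
    n_reps, n_years, n0, quasi_ext = 1000, 50, 200, 20

    extinct = 0
    for _ in range(n_reps):
        # Outer loop: one draw of the uncertain mean growth rate per replicate.
        mean_r = rng.normal(-0.02, 0.03)       # parametric uncertainty
        n = float(n0)
        for _ in range(n_years):
            # Inner loop: temporal (environmental) variance each year.
            r = rng.normal(mean_r, 0.10)
            n *= np.exp(r)
            if n < quasi_ext:
                extinct += 1
                break

    print(f"quasi-extinction risk: {extinct / n_reps:.2f}")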

  12. Task Force on Catastrophic Antiphospholipid Syndrome (APS) and Non-criteria APS Manifestations (I): catastrophic APS, APS nephropathy and heart valve lesions.

    Science.gov (United States)

    Cervera, R; Tektonidou, M G; Espinosa, G; Cabral, A R; González, E B; Erkan, D; Vadya, S; Adrogué, H E; Solomon, M; Zandman-Goddard, G; Shoenfeld, Y

    2011-02-01

    The objectives of the 'Task Force on Catastrophic Antiphospholipid Syndrome (APS) and Non-criteria APS Manifestations' were to assess the clinical utility of the international consensus statement on classification criteria and treatment guidelines for the catastrophic APS, to identify and grade the studies that analyse the relationship between the antiphospholipid antibodies and the non-criteria APS manifestations and to present the current evidence regarding the accuracy of these non-criteria APS manifestations for the detection of patients with APS. This article summarizes the studies analysed on the catastrophic APS, APS nephropathy and heart valve lesions, and presents the recommendations elaborated by the Task Force after this analysis.

  13. Inverse uncertainty quantification of reactor simulations under the Bayesian framework using surrogate models constructed by polynomial chaos expansion

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Xu, E-mail: xuwu2@illinois.edu; Kozlowski, Tomasz

    2017-03-15

    Modeling and simulations are naturally augmented by extensive Uncertainty Quantification (UQ) and sensitivity analysis requirements in nuclear reactor system design, in which uncertainties must be quantified in order to prove that the investigated design stays within acceptance criteria. Historically, expert judgment has been used to specify the nominal values, probability density functions and upper and lower bounds of the simulation code random input parameters for the forward UQ process. The purpose of this paper is to replace such ad-hoc expert judgment of the statistical properties of input model parameters with an inverse UQ process. Inverse UQ seeks statistical descriptions of the model random input parameters that are consistent with the experimental data. Bayesian analysis is used to establish the inverse UQ problems based on experimental data, with systematic and rigorously derived surrogate models based on Polynomial Chaos Expansion (PCE). The methods developed here are demonstrated with the Point Reactor Kinetics Equation (PRKE) coupled with a lumped parameter thermal-hydraulics feedback model. Three input parameters, external reactivity, Doppler reactivity coefficient and coolant temperature coefficient, are modeled as uncertain input parameters. Their uncertainties are inversely quantified based on synthetic experimental data. Compared with direct numerical simulation, the surrogate model by PC expansion shows high efficiency and accuracy. In addition, inverse UQ with Bayesian analysis can calibrate the random input parameters such that the simulation results are in better agreement with the experimental data.
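
    The workflow (fit a cheap surrogate to a limited number of code runs, then sample a Bayesian posterior for the uncertain inputs against data) can be illustrated on a scalar toy problem. Everything below is an illustrative assumption: a stand-in simulator, an ordinary polynomial fit in place of a full PCE surrogate, a random-walk Metropolis sampler, and synthetic data; none of it reproduces the paper's PRKE setup.

    import numpy as np

    rng = np.random.default_rng(7)

    def simulator(theta):                 # stand-in for the expensive code
        return np.sin(theta) + 0.5 * theta

    # 1) Surrogate fitted to a limited design of code runs.
    design = np.linspace(-2, 2, 9)
    surrogate = np.polynomial.Polynomial.fit(design, simulator(design), deg=4)

    # 2) Synthetic experimental data generated at a "true" input value.
    theta_true, sigma = 0.8, 0.05
    data = simulator(theta_true) + rng.normal(0, sigma, 20)

    def log_post(theta):                  # flat prior on [-2, 2]
        if not -2 <= theta <= 2:
            return -np.inf
        return -0.5 * np.sum((data - surrogate(theta)) ** 2) / sigma**2

    # 3) Random-walk Metropolis, evaluating only the cheap surrogate.
    theta, lp, chain = 0.0, log_post(0.0), []
    for _ in range(20_000):
        prop = theta + rng.normal(0, 0.1)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)

    post = np.array(chain[5000:])         # discard burn-in
    print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f}")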

  14. Catastrophic risks and insurance in farm-level decision making

    NARCIS (Netherlands)

    Ogurtsov, V.

    2008-01-01

    Keywords: risk perception, risk attitude, catastrophic risk, insurance, farm characteristics, farmer personal characteristics, utility-efficient programming, arable farming, dairy farming

    Catastrophic risks can cause severe cash flow problems for farmers or even result in their

  15. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar

    2016-03-02

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.

  16. Phenomena-based Uncertainty Quantification in Predictive Coupled-Physics Reactor Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Marvin [Texas A & M Univ., College Station, TX (United States)]

    2017-06-12

    This project has sought to develop methodologies, tailored to phenomena that govern nuclear reactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.

  17. Predicting catastrophes of non-autonomous networks with visibility graphs and horizontal visibility

    Science.gov (United States)

    Zhang, Haicheng; Xu, Daolin; Wu, Yousheng

    2018-05-01

    Prediction of potential catastrophes in engineering systems is a challenging problem. We first attempt to construct a complex network to predict catastrophes of a multi-modular floating system in advance of their occurrence. Response time series of the system can be mapped into a virtual network by using the visibility graph or horizontal visibility algorithm. The topology characteristics of the networks can then be used to forecast catastrophes of the system. Numerical results show an obvious correspondence between the variation of topology characteristics and the onset of catastrophes. A Catastrophe Index (CI) is proposed as a numerical indicator to measure a qualitative change from a stable state to a catastrophic state. The two approaches, the visibility graph and horizontal visibility algorithms, are compared by using the index in a reliability analysis with different data lengths and sampling frequencies. The virtual network technique is potentially extendable to catastrophe prediction in other engineering systems.
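
    The horizontal visibility mapping mentioned above has a one-line definition: samples i and j are connected whenever every value strictly between them lies below both. A minimal sketch follows; the test series and the mean-degree statistic are illustrative choices, not the paper's floating-system data or its Catastrophe Index.

    import numpy as np

    def horizontal_visibility_edges(x):
        """Edges (i, j) of the horizontal visibility graph of series x."""
        n, edges = len(x), []
        for i in range(n - 1):
            edges.append((i, i + 1))        # adjacent samples always connect
            blocker = x[i + 1]              # running max of values between i and j
            for j in range(i + 2, n):
                if blocker < min(x[i], x[j]):
                    edges.append((i, j))
                blocker = max(blocker, x[j])
                if blocker >= x[i]:         # everything beyond is hidden from i
                    break
        return edges

    rng = np.random.default_rng(3)
    series = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.2 * rng.standard_normal(200)
    degree = np.zeros(len(series))
    for i, j in horizontal_visibility_edges(series):
        degree[i] += 1
        degree[j] += 1
    print(f"mean degree: {degree.mean():.2f}")  # one candidate topology feature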

  1. The use of Monte-Carlo simulation and order statistics for uncertainty analysis of a LBLOCA transient (LOFT-L2-5)

    International Nuclear Information System (INIS)

    Chojnacki, E.; Benoit, J.P.

    2007-01-01

    Best estimate computer codes are increasingly used in the nuclear industry for accident management procedures and are planned to be used for licensing procedures. Contrary to conservative codes, which are supposed to give penalizing results, best estimate codes attempt to calculate accidental transients in a realistic way. It is therefore of prime importance, in particular for a technical organization such as IRSN, in charge of safety assessment, to know the uncertainty on the results of such codes. Thus, CSNI sponsored a few years ago (published in 1998) the Uncertainty Methods Study (UMS) program on uncertainty methodologies used for a SBLOCA transient (LSTF-CL-18) and is now supporting the BEMUSE program for a LBLOCA transient (LOFT-L2-5). The large majority of BEMUSE participants (9 out of 10) use uncertainty methodologies based on probabilistic modelling, and all of them use Monte-Carlo simulations to propagate the uncertainties through their computer codes. All of the 'probabilistic participants' also intend to use order statistics to determine the sampling size of the Monte-Carlo simulation and to derive the uncertainty ranges associated with their computer calculations. The first aim of this paper is to recall the advantages and also the assumptions of probabilistic modelling, and more specifically of order statistics (such as Wilks' formula), in uncertainty methodologies. Indeed, Monte-Carlo methods provide flexible and extremely powerful techniques for solving many of the uncertainty propagation problems encountered in nuclear safety analysis. However, it is important to keep in mind that probabilistic methods are data intensive. That means probabilistic methods cannot produce robust results unless a considerable body of information has been collected. A main interest of order statistics results is that they allow an unlimited number of uncertain parameters to be taken into account and, from a restricted number of code calculations, to provide statistical

  2. Catastrophic avalanches and methods of their control

    Directory of Open Access Journals (Sweden)

    N. A. Volodicheva

    2014-01-01

    A definition of the phenomenon of the "catastrophic avalanche" is presented in this article. Several situations involving releases of catastrophic avalanches in the mountains of the Caucasus, the Alps, and Central Asia are investigated. Materials of snow-avalanche observations performed since the 1960s at the Elbrus station of the Lomonosov Moscow State University (Central Caucasus) were used for this work. Combined engineering protection measures demonstrating different efficiencies are considered.

  3. Pricing catastrophic bonds for earthquakes in Mexico

    OpenAIRE

    Cabrera, Brenda López

    2006-01-01

    After the occurrence of a natural disaster, the reconstruction can be financed with catastrophic bonds (CAT bonds) or reinsurance. For insurers, reinsurers and other corporations CAT bonds provide multi-year protection without the credit risk present in reinsurance. For investors CAT bonds offer attractive returns and a reduction of portfolio risk, since CAT bond defaults are uncorrelated with defaults of other securities. As the study of natural catastrophe models plays an important role in t...

  4. Stagewise cognitive development: an application of catastrophe theory.

    Science.gov (United States)

    van der Maas, H L; Molenaar, P C

    1992-07-01

    In this article an overview is given of traditional methodological approaches to stagewise cognitive developmental research. These approaches are evaluated and integrated on the basis of catastrophe theory. In particular, catastrophe theory specifies a set of common criteria for testing the discontinuity hypothesis proposed by Piaget. Separate criteria correspond to distinct methods used in cognitive developmental research. Such criteria are, for instance, the detection of spurts in development, bimodality of test scores, and increased variability of responses during transitional periods. When a genuine stage transition is present, these criteria are expected to be satisfied. A revised catastrophe model accommodating these criteria is proposed for the stage transition in cognitive development from the preoperational to the concrete operational stage.
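
    For readers unfamiliar with the formalism: the cusp, the catastrophe most often invoked for stage transitions, has potential V(x) = x^4/4 + a*x^2/2 + b*x, so its equilibria solve x^3 + a*x + b = 0. The sketch below, with illustrative control values, shows how sweeping one control parameter switches the system between one and two stable states, which is the origin of the bimodality criterion.

    import numpy as np

    def stable_equilibria(a, b):
        roots = np.roots([1.0, 0.0, a, b])           # x^3 + a x + b = 0
        real = roots[np.abs(roots.imag) < 1e-9].real
        # Stable states are minima of V(x) = x^4/4 + a x^2/2 + b x,
        # i.e. roots where V''(x) = 3 x^2 + a > 0.
        return sorted(x for x in real if 3 * x**2 + a > 0)

    for a in (1.0, -1.0):                            # sweep one control factor
        states = stable_equilibria(a, b=0.1)
        label = "bimodal" if len(states) == 2 else "unimodal"
        print(f"a={a:+.1f}: {label}, stable states {np.round(states, 2)}")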

  5. Should catastrophic risks be included in a regulated competitive health insurance market?

    Science.gov (United States)

    van de Ven, W P; Schut, F T

    1994-11-01

    In 1988 the Dutch government launched a proposal for a national health insurance based on regulated competition. The mandatory benefits package should be offered by competing insurers and should cover both non-catastrophic risks (like hospital care, physician services and drugs) and catastrophic risks (like several forms of expensive long-term care). However, there are two arguments for excluding some of the catastrophic risks from the competitive insurance market, at least during the implementation process of the reforms. Firstly, the prospects for a workable system of risk-adjusted payments to the insurers that should take away the incentives for cream skimming are, at least during the next 5 years, more favorable for the non-catastrophic risks than for the catastrophic risks. Secondly, even if a workable system of risk-adjusted payments can be developed, the problem of quality skimping may be relevant for some of the catastrophic risks, but not for non-catastrophic risks. By 'quality skimping' we mean the reduction of the quality of care to a level which is below the minimum level that is acceptable to society. After 5 years of health care reforms in the Netherlands, new insights have resulted in growing support for confining the implementation of the reforms to the non-catastrophic risks. In drawing (and redrawing) the exact boundaries between different regulatory regimes for catastrophic and non-catastrophic risks, the expected benefits of a cost-effective substitution of care have to be weighed against the potential harm caused by cream skimming and quality skimping.

  6. Indirect Catastrophic Injuries in Olympic Styles of Wrestling in Iran

    OpenAIRE

    Kordi, Ramin; Ziaee, Vahid; Rostami, Mohsen; Wallace, W. Angus

    2011-01-01

    Background: Data on indirect catastrophic injuries in wrestling are scarce. Objectives: To develop a profile of indirect catastrophic injuries in international styles of wrestling and to describe possible risk factors. Study Design: Retrospective case series; Level of evidence, 3. Methods: Indirect catastrophic injuries that occurred in wrestling clubs in Iran from July 1998 to June 2005 were identified by contacting several sources. The cases were retrospectively reviewed. Results: The injur...

  7. Navigating catastrophes: Local but not global optimisation allows for macro-economic navigation of crises

    Science.gov (United States)

    Harré, Michael S.

    2013-02-01

    Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.

  8. Reactor accidents and nuclear catastrophes

    International Nuclear Information System (INIS)

    Kirchhoff, R.; Linde, H.J.

    1979-01-01

    Assuming some preliminary knowledge of the fundamentals of atomic physics, the book describes the effects of ionizing radiation on the human organism. In order to assess the potential hazards of reactor accidents and the extent of a nuclear catastrophe, the technology of power generation in nuclear power stations is presented together with its potential dangers, as well as the physical and medical processes occurring during a nuclear weapons explosion. The special medical aspects are presented, ranging from first aid in the case of a catastrophe to the acute radiation syndrome, and from the treatment of burns to the therapy of late radiolesions. Finally, it is confirmed that the treatment of radiation-injured persons does not give rise to basically new medical problems. (orig./HP) [de]

  9. Valuation of Indonesian catastrophic earthquake bonds with generalized extreme value (GEV) distribution and Cox-Ingersoll-Ross (CIR) interest rate model

    Science.gov (United States)

    Gunardi, Setiawan, Ezra Putranda

    2015-12-01

    Indonesia is a country with high earthquake risk, because of its position on the border of the earth's tectonic plates. An earthquake can cause a very large amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring earthquake risk away from the government or the (re)insurance company, so that enough money can be collected for implementing the rehabilitation and reconstruction program. One such mechanism is issuing a catastrophe bond, 'act-of-God bond', or simply CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is combined with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investors is reduced or stopped, and is instead paid to the sponsor company to compensate its loss from the catastrophe event. When we consider earthquakes only, the amount by which the cash flow is reduced can be determined based on the earthquake's magnitude. A case study with Indonesian earthquake magnitude data shows that the probability distribution of the maximum magnitude can be modelled by the generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) model. We develop formulas for pricing three types of catastrophe bond, namely zero coupon bonds, 'coupon only at risk' bonds, and 'principal and coupon at risk' bonds. The relationship between the price of the catastrophe bond and the CIR model's parameters, the GEV parameters, the percentage of coupon, and the discounted cash flow rule is then explored via Monte Carlo simulation.
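
    A zero-coupon version of such a pricing scheme fits in a short Monte Carlo sketch: simulate the CIR short rate for discounting and draw annual maximum magnitudes from a GEV to decide whether the principal is written down. All parameter values below (CIR coefficients, GEV fit, trigger magnitude, recovery fraction) are hypothetical, not the paper's Indonesian estimates.

    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(11)
    T, steps, n_paths = 3, 36, 10_000                 # 3-year bond, monthly steps
    dt = T / steps
    kappa, theta, sigma, r0 = 0.5, 0.06, 0.08, 0.05   # CIR: dr = kappa(theta - r)dt + sigma*sqrt(r)dW
    trigger_mag, face, recovery = 7.0, 100.0, 0.4     # payoff rule if triggered

    # GEV for the annual maximum magnitude (note scipy's shape c = -xi).
    annual_max = genextreme(c=0.1, loc=6.0, scale=0.4)

    payoffs = np.empty(n_paths)
    for p in range(n_paths):
        r, integral = r0, 0.0
        for _ in range(steps):                        # Euler step, reflected at 0
            dr = kappa * (theta - r) * dt + sigma * np.sqrt(r * dt) * rng.standard_normal()
            r = abs(r + dr)
            integral += r * dt
        # Principal is written down if any year's maximum magnitude hits the trigger.
        triggered = (annual_max.rvs(size=T, random_state=rng) > trigger_mag).any()
        payoffs[p] = np.exp(-integral) * face * (recovery if triggered else 1.0)

    print(f"zero-coupon CAT bond price: {payoffs.mean():.2f}")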

  10. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain. Under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and aquifer recharge are considered as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of

  11. Microtubule catastrophe and rescue.

    Science.gov (United States)

    Gardner, Melissa K; Zanic, Marija; Howard, Jonathon

    2013-02-01

    Microtubules are long cylindrical polymers composed of tubulin subunits. In cells, microtubules play an essential role in architecture and motility. For example, microtubules give shape to cells, serve as intracellular transport tracks, and act as key elements in important cellular structures such as axonemes and mitotic spindles. To accomplish these varied functions, networks of microtubules in cells are very dynamic, continuously remodeling through stochastic length fluctuations at the ends of individual microtubules. The dynamic behavior at the end of an individual microtubule is termed 'dynamic instability'. This behavior manifests itself by periods of persistent microtubule growth interrupted by occasional switching to rapid shrinkage (called microtubule 'catastrophe'), and then by switching back from shrinkage to growth (called microtubule 'rescue'). In this review, we summarize recent findings which provide new insights into the mechanisms of microtubule catastrophe and rescue, and discuss the impact of these findings in regards to the role of microtubule dynamics inside of cells. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Uncertainty in parameterisation and model structure affect simulation results in coupled ecohydrological models

    Directory of Open Access Journals (Sweden)

    S. Arnold

    2009-10-01

    In this paper we develop and apply a conceptual ecohydrological model to investigate the effects of model structure and parameter uncertainty on the simulation of vegetation structure and hydrological dynamics. The model is applied to a typical water-limited riparian ecosystem along an ephemeral river: the middle section of the Kuiseb River in Namibia. We modelled this system by coupling an ecological model with a conceptual hydrological model. The hydrological model is storage based, with stochastic forcing from the flood. The ecosystem is modelled with a population model representing the three dominant riparian plant populations. To reflect uncertainty about population dynamics, we applied three model versions with increasing complexity. Population parameters were found by Latin hypercube sampling of the parameter space, with the constraint that the three species should coexist as observed. Two of the three models were able to reproduce the observed coexistence. However, the two models relied on different coexistence mechanisms and reacted differently to changes in the long-term memory of the flood forcing. The coexistence requirement strongly constrained the parameter space for both successful models. Only very few parameter sets (0.5% of 150,000 samples) allowed for coexistence in a representative number of repeated simulations (at least 10 out of 100), and the success of the coexistence mechanism was controlled by the combination of population parameters. The ensemble statistics of average values of hydrologic variables like transpiration and depth to groundwater were similar for both models, suggesting that they were mainly controlled by the applied hydrological model. The ensemble statistics of the fluctuations of depth to groundwater and transpiration, however, differed significantly, suggesting that they were controlled by the applied ecological model and coexistence mechanisms. Our study emphasizes that uncertainty about ecosystem

  13. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with physically motivated parameters that are tuned to cover many application domains. To study the uncertainties associated with the Geant4 physics models, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. a flexible run-time configurable workflow, comprehensive bookkeeping, and an easy to expand collection of analytical components. The design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  14. Pain catastrophizing predicts verbal expression among children with chronic pain and their mothers

    Directory of Open Access Journals (Sweden)

    Shelby L Langer

    2016-03-01

    This study examined intra- and inter-personal associations between pain catastrophizing and verbal expression in 70 children with recurrent abdominal pain and their mothers. Participants independently completed the Pain Catastrophizing Scale. Mothers and children then talked about the child’s pain. Speech was categorized using a linguistic analysis program. Catastrophizing was positively associated with the use of negative emotion words by both mothers and children. In addition, mothers’ catastrophizing was positively associated with both mothers’ and children’s anger word usage, whereas children’s catastrophizing was inversely associated with mothers’ anger word usage. Findings extend the literature on behavioral and interpersonal aspects of catastrophizing.

  15. Multi-Scale Fusion of Information for Uncertainty Quantification and Management in Large-Scale Simulations

    Science.gov (United States)

    2015-12-02

    ... of completely new nonlinear Malliavin calculus. This type of calculus is important for the analysis and simulation of stationary and/or "causal" ... been limited by the fact that it requires the solution of an optimization problem with noisy gradients. When using deterministic optimization schemes ... under uncertainty. We tested new developments on nonlinear Malliavin calculus, combining reduced basis methods with ANOVA, model validation, on ...

  16. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.

  17. Informal uncertainty analysis (GLUE) of continuous flow simulation in a hybrid sewer system with infiltration inflow - Consistency of containment ratios in calibration and validation?

    DEFF Research Database (Denmark)

    Breinholt, Anders; Grum, Morten; Madsen, Henrik

    2013-01-01

    to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction ... rain inputs and more accurate flow observations to reduce parameter and model simulation uncertainty. © Author(s) 2013.

  18. Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire.

    Energy Technology Data Exchange (ETDEWEB)

    Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine

    2009-12-01

    The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples (TCs). The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative TC locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

  19. 'Performative narrativity': Palestinian identity and the performance of catastrophe

    NARCIS (Netherlands)

    Saloul, I.

    2008-01-01

    The day Israel annually celebrates as its "Day of Independence" Palestinians commemorate as their day of catastrophe (al-nakba). To most Palestinians, the catastrophic loss of Palestine in 1948 represents the climactic formative event of their lives. In the aftermath of this loss, the Palestinian

  20. Propagation of radar rainfall uncertainty in urban flood simulations

    Science.gov (United States)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km2. The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is out of the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41 761 Doi: 10.1088/0034-4885/41/5/003 [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. [3] Villarini G, Krajewski WF

  1. The Catalan version of the Pain Catastrophizing Scale: a useful instrument to assess catastrophic thinking in whiplash patients.

    Science.gov (United States)

    Miró, Jordi; Nieto, Rubén; Huguet, Anna

    2008-05-01

    The main aims of this work were to test the psychometric properties of the Catalan version of the Pain Catastrophizing Scale (PCS) and to assess the usefulness of the scale when used with whiplash patients. This article reports results from 2 complementary studies. In the first one, the PCS was administered to 280 students and 146 chronic pain patients to examine the psychometric properties of a new Catalan version of the instrument. A confirmatory factor analysis supported a second-order structure, in which 3 second-order factors (ie, rumination, helplessness, and magnification) load in a higher-order factor (ie, catastrophizing). The reliability of the Catalan version was supported by an acceptable internal consistency and test-retest values. Validity was supported by the correlations found among the PCS and pain intensity, pain interference, and depression. The objective of the second study was to evaluate the PCS when used with whiplash patients. In this second study, 141 patients with whiplash disorders participated. In general, the psychometric properties of the PCS were found appropriate, with factor analysis supporting the structure described in patients with chronic pain. Our data suggest that the PCS is a good instrument to assess catastrophic thinking in whiplash patients. The usefulness of the PCS in whiplash disorders has been explored in this study. Results of our work show that the PCS can be a very useful tool to assess catastrophic thinking about pain in whiplash patients.

  2. Value at risk (VaR) in uncertainty: Analysis with parametric method and Black & Scholes simulations

    Directory of Open Access Journals (Sweden)

    Humberto Banda Ortiz

    2014-07-01

    Full Text Available VaR is the most accepted risk measure worldwide and the leading reference in any risk management assessment. However, its methodology has important limitations which make it unreliable in contexts of crisis or high uncertainty. For this reason, the aim of this work is to test the accuracy of VaR when it is employed in contexts of volatility, for which we compare the VaR outcomes in scenarios of both stability and uncertainty, using the parametric method and a historical simulation based on data generated with the Black & Scholes model. VaR's main objective is the prediction of the highest expected loss for any given portfolio, but even though it is considered a useful tool for risk management under conditions of market stability, we found that it is substantially inaccurate in contexts of crisis or high uncertainty. In addition, we found that the Black & Scholes simulations lead to underestimation of the expected losses, in comparison with the parametric method, and that those disparities increase substantially in times of crisis. In the first section of this work we present a brief context of risk management in finance. In Section II we present the existing literature on the VaR concept, its methods and applications. In Section III we describe the methodology and assumptions used in this work. Section IV is dedicated to exposing the findings. And finally, in Section V we present our conclusions.
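
    As a rough illustration of the comparison this abstract describes, the following sketch contrasts a parametric (variance-covariance) VaR with a Monte Carlo VaR built on Black & Scholes (geometric Brownian motion) dynamics; the portfolio value, drift, and volatility are invented, not taken from the paper:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical single-asset portfolio: compare parametric VaR with a
# Monte Carlo VaR based on Black & Scholes (GBM) price dynamics.
rng = np.random.default_rng(42)
value = 1_000_000.0      # current portfolio value
mu, sigma = 0.05, 0.35   # annualized drift and volatility (crisis-level vol)
horizon = 10 / 252       # 10 trading days
alpha = 0.99

# Parametric VaR: assumes normally distributed returns over the horizon.
var_param = value * (norm.ppf(alpha) * sigma * np.sqrt(horizon) - mu * horizon)

# Monte Carlo VaR: simulate lognormal terminal values under GBM.
z = rng.standard_normal(100_000)
terminal = value * np.exp((mu - 0.5 * sigma**2) * horizon
                          + sigma * np.sqrt(horizon) * z)
losses = value - terminal
var_mc = np.quantile(losses, alpha)

print(f"parametric 99% VaR : {var_param:,.0f}")
print(f"Monte Carlo 99% VaR: {var_mc:,.0f}")
# Under high volatility the two estimates diverge: the lognormal model caps
# losses below the portfolio value, so the simulated VaR comes out lower,
# mirroring the underestimation the paper reports for B&S simulations.
```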

  3. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  4. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
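
    Both copies of the guide above describe the same propagation machinery: sample uncertain inputs, push them through the model, and cross-check the Monte Carlo spread against a first-order series approximation. A minimal sketch follows; the model and all numbers are invented for illustration only:

```python
import numpy as np

# Random-uncertainty propagation through a toy model: inputs with stated
# standard uncertainties are sampled, and the spread of the output reported.
rng = np.random.default_rng(1)
n = 50_000

# Uncertain inputs: thermal conductivity k and wall thickness d (invented).
k = rng.normal(0.60, 0.05, n)      # W/(m K), mean +/- standard uncertainty
d = rng.normal(0.020, 0.001, n)    # m
dT = 25.0                          # K, assumed exactly known

q = k * dT / d                     # model output: heat flux, W/m^2

mean, std = q.mean(), q.std(ddof=1)
lo, hi = np.quantile(q, [0.025, 0.975])
print(f"q = {mean:.0f} +/- {std:.0f} W/m^2 (95% interval: {lo:.0f}..{hi:.0f})")

# First-order series approximation as a quick cross-check, combining the
# partial-derivative contributions of k and d in quadrature.
std_lin = np.sqrt((dT / 0.020 * 0.05) ** 2
                  + (0.60 * dT / 0.020 ** 2 * 0.001) ** 2)
print(f"linearized propagation: +/- {std_lin:.0f} W/m^2")
```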

  5. Make or buy decision considering uncertainty based on fuzzy logic using simulation and multiple criteria decision making

    Directory of Open Access Journals (Sweden)

    Ali Mohtashami

    2013-01-01

    Full Text Available Decision making on the make-or-buy problem has always been a challenge to decision makers. In this paper a methodology has been proposed to resolve this challenge. This methodology is capable of evaluating make-or-buy decisions under uncertainty. For uncertainty, the fuzzy logic and simulation approaches have been used. The proposed methodology can be applied to parts with multi-stage manufacturing processes and different suppliers. Therefore this methodology provides a scale for decision making ranging from full outsourcing to full manufacturing, along with selecting the appropriate supplier.

  6. A critical look at catastrophe risk assessments

    CERN Document Server

    Kent, A

    2004-01-01

    Recent papers by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish bounds on the risk of a catastrophe in forthcoming collider experiments. The safety case set out by BJSW does not rely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling. However, DDH and other commentators (initially including BJSW) have suggested that the astrophysical bounds alone do give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expected cost. For example, DDH's main bound, $p_{\rm catastrophe} < 2 \times 10^{-8}$, implies only that the expectation value of the number of deaths is bounded by 120. We thus reappraise the DDH and BJSW risk bounds by comparing risk policy in other areas. We find that requiring a catastrophe risk of no higher than $10^{-15}$ is necessary to be consistent with established policy for risk optimisation from radiation hazards, even if highly risk tolerant assumptions are made. A respec...

  7. [Bioethics in catastrophe situations such as earthquakes].

    Science.gov (United States)

    León, C Francisco Javier

    2012-01-01

    A catastrophe of the magnitude of the earthquake and tsunami that hit Chile not long ago forces us to raise some questions that we will try to answer from philosophical, ethical, and responsibility viewpoints. An analysis of the basic principles of bioethics is also justified. A natural catastrophe is not, by itself, moral or immoral, fair or unfair. However, its consequences could certainly be regarded as such, depending on whether they could have been prevented or mitigated. We will identify those individuals who have the ethical responsibility to attend to the victims, and the ethical principles that must guide the tasks of healthcare and psychological support teams. The minimal indispensable actions to obtain adequate social and legal protection of vulnerable people must be defined according to international guidelines. These reflections are intended to improve the responsibility of the State and the whole community, to efficiently prevent and repair the material and psychological consequences of such a catastrophe.

  8. Pricing the property claim service (PCS) catastrophe insurance options using gamma distribution

    Science.gov (United States)

    Noviyanti, Lienda; Soleh, Achmad Zanbar; Setyanto, Gatot R.

    2017-03-01

    Catastrophic events like earthquakes, hurricanes, or flooding are characteristic of some areas, where a properly calculated annual premium would be nearly as high as the loss insured. From an actuarial perspective, such events constitute risks that are not insurable. On the other hand, people living in such areas need protection. In order to securitize the catastrophe risk, futures or options based on a loss index could be considered. The Chicago Board of Trade launched a new class of catastrophe insurance options based on new indices provided by Property Claim Services (PCS). The PCS option is based on the Property Claim Services index (PCS index), which is used to determine the payout in writing index-based insurance derivatives. The objective of this paper is to price a PCS catastrophe insurance option based on the PCS catastrophe index. A gamma distribution is used to estimate the distribution of the PCS catastrophe index.
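
    A minimal sketch of the approach named in the title, fitting a gamma distribution to loss-index values and pricing a European-style call on the index as a discounted expected payoff, is given below; the data are synthetic stand-ins, not actual PCS records:

```python
import numpy as np
from scipy.stats import gamma

# Fit a gamma distribution to (synthetic) loss-index values and price a call
# on the index as the discounted expected payoff under the fitted law.
rng = np.random.default_rng(7)
index_history = rng.gamma(shape=2.0, scale=60.0, size=300)  # stand-in for PCS data

# Fit the gamma distribution with the location fixed at zero.
a_hat, _, scale_hat = gamma.fit(index_history, floc=0.0)

strike, r, T = 200.0, 0.03, 1.0   # illustrative strike, rate, maturity
sims = gamma.rvs(a_hat, scale=scale_hat, size=200_000, random_state=rng)
price = np.exp(-r * T) * np.maximum(sims - strike, 0.0).mean()

print(f"fitted gamma: shape={a_hat:.2f}, scale={scale_hat:.1f}")
print(f"call on loss index, strike {strike}: price ~ {price:.2f}")
```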

  9. Pain Catastrophizing Correlates with Early Mild Traumatic Brain Injury Outcome

    Directory of Open Access Journals (Sweden)

    Geneviève Chaput

    2016-01-01

    Full Text Available Background. Identifying which patients are most likely to be at risk of chronic pain and other postconcussion symptoms following mild traumatic brain injury (MTBI) is a difficult clinical challenge. Objectives. To examine the relationship between pain catastrophizing, defined as the exaggerated negative appraisal of a pain experience, and early MTBI outcome. Methods. This cross-sectional design included 58 patients diagnosed with an MTBI. In addition to medical chart review, postconcussion symptoms were assessed by self-report at 1 month (Time 1) and 8 weeks (Time 2) after MTBI. Pain severity, psychological distress, level of functionality, and pain catastrophizing were measured by self-report at Time 2. Results. The pain catastrophizing subscales of rumination, magnification, and helplessness were significantly correlated with pain severity (r = .31 to .44), number of postconcussion symptoms reported (r = .35 to .45), psychological distress (r = .57 to .67), and level of functionality (r = -.43 to -.29). Pain catastrophizing scores were significantly higher for patients deemed to be at high risk of postconcussion syndrome (6 or more symptoms reported at both Time 1 and Time 2). Conclusions. Higher levels of pain catastrophizing were related to adverse early MTBI outcomes. The early detection of pain catastrophizing may facilitate goal-oriented interventions to prevent or minimize the development of chronic pain and other postconcussion symptoms.

  10. Effects of Input Data Content on the Uncertainty of Simulating Water Resources

    Directory of Open Access Journals (Sweden)

    Carla Camargos

    2018-05-01

    Full Text Available The widely used, partly-deterministic Soil and Water Assessment Tool (SWAT) requires a large amount of spatial input data, such as a digital elevation model (DEM), land use, and soil maps. Modelers make an effort to apply the most specific data possible for the study area to reflect the heterogeneous characteristics of landscapes. Regional data, especially with fine resolution, is often preferred. However, such data is not always available and can be computationally demanding. Despite being coarser, global data are usually free and available to the public. Previous studies revealed the importance of different input maps for single investigations. However, it remains unknown whether higher-resolution data lead to more reliable results. This study investigates how global and regional input datasets affect parameter uncertainty when estimating river discharges. We analyze eight different setups of the SWAT model for a catchment in Luxembourg, combining different land-use, elevation, and soil input data. The Metropolis-Hastings Markov chain Monte Carlo (MCMC) algorithm is used to infer posterior model parameter uncertainty. We conclude that our higher-resolution DEM improves the general model performance in reproducing low flows by 10%. The less detailed soil map improved the fit of low flows by 25%. In addition, more detailed land-use maps reduce the bias of the model discharge simulations by 50%. Also, despite presenting similar parameter uncertainty (P-factor ranging from 0.34 to 0.41 and R-factor from 0.41 to 0.45) for all setups, the results show a disparate parameter posterior distribution. This indicates that the missing assessment of all sources of uncertainty simultaneously is compensated for by the fitted parameter values. We conclude that our results can give some guidance for future SWAT applications in the selection of the degree of detail for input data.
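
    The inference step named in the abstract can be illustrated with a bare-bones Metropolis-Hastings sampler. The toy rainfall-runoff model, data, and prior below are invented; they only show how an MCMC chain yields a parameter posterior:

```python
import numpy as np

# Minimal Metropolis-Hastings sketch: infer the posterior of one runoff
# parameter theta of a toy model against noisy "observed" discharge.
rng = np.random.default_rng(3)

rain = rng.gamma(2.0, 5.0, 200)                    # synthetic forcing
theta_true, noise = 0.35, 1.0
obs = theta_true * rain + rng.normal(0, noise, rain.size)

def log_post(theta):
    if not 0.0 < theta < 1.0:                      # uniform prior on (0, 1)
        return -np.inf
    resid = obs - theta * rain
    return -0.5 * np.sum((resid / noise) ** 2)     # Gaussian log-likelihood

theta, lp = 0.5, log_post(0.5)
chain = []
for _ in range(20_000):
    prop = theta + 0.02 * rng.standard_normal()    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)

post = np.array(chain[5_000:])                     # discard burn-in
print(f"posterior mean {post.mean():.3f} +/- {post.std():.3f} (true {theta_true})")
```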

  11. Uncertainty analysis in Titan ionospheric simulated ion mass spectra: unveiling a set of issues for models accuracy improvement

    Science.gov (United States)

    Hébrard, Eric; Carrasco, Nathalie; Dobrijevic, Michel; Pernot, Pascal

    The Ion Neutral Mass Spectrometer (INMS) aboard Cassini revealed a rich coupled ion-neutral chemistry in the ionosphere, producing heavy hydrocarbon and nitrile ions. The modeling of such a complex environment is challenging, as it requires a detailed and accurate description of the different relevant processes such as photodissociation cross sections and neutral-neutral reaction rates on one hand, and ionisation cross sections, ion-molecule and recombination reaction rates on the other hand. Underpinning model calculations, each of these processes is parameterized by kinetic constants which, when known, have been studied experimentally and/or theoretically over a range of temperatures and pressures that are most often not representative of Titan's atmosphere. The sizeable experimental and theoretical uncertainties reported in the literature merge therefore with the uncertainties resulting subsequently from the unavoidable estimations or extrapolations to Titan's atmospheric conditions. Such large overall uncertainties have to be accounted for in all resulting inferences, most of all to evaluate the quality of the model definition. We have undertaken a systematic study of the uncertainty sources in the simulation of ion mass spectra as recorded by Cassini/INMS in Titan's ionosphere during the T5 flyby at 1200 km. Our simulated spectra seem much less affected by the uncertainties on ion-molecule reactions than on neutral-neutral reactions. Photochemical models of Titan's atmosphere are indeed so poorly predictive at high altitudes, in the sense that their computed predictions display such large uncertainties, that we found them to give rise to bimodal and hypersensitive abundance distributions for some major compounds like acetylene (C2H2) and ethylene (C2H4). We will show to what extent global uncertainty and sensitivity analysis enabled us to identify the causes of this bimodality and to pinpoint the key processes that mostly contribute to limit the accuracy of the

  12. Valuation of Indonesian catastrophic earthquake bonds with generalized extreme value (GEV) distribution and Cox-Ingersoll-Ross (CIR) interest rate model

    International Nuclear Information System (INIS)

    Gunardi,; Setiawan, Ezra Putranda

    2015-01-01

    Indonesia is a country with high risk of earthquake, because of its position on the border of the earth’s tectonic plates. An earthquake can cause a very high amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring the risk of earthquake from the government or the (reinsurance company, so that it can collect enough money for implementing the rehabilitation and reconstruction program. One such mechanism is issuing a catastrophe bond, ‘act-of-God bond’, or simply CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is combined with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investor is discounted or stopped, and the cash flow is paid to the sponsor company to compensate its loss from the catastrophe event. When we consider earthquakes only, the amount of the discounted cash flow can be determined based on the earthquake’s magnitude. A case study with Indonesian earthquake magnitude data shows that the probability distribution of maximum magnitude can be modeled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bond, namely zero-coupon bonds, ‘coupon only at risk’ bonds, and ‘principal and coupon at risk’ bonds. The relationship between the price of the catastrophe bond and the CIR model’s parameters, the GEV parameters, the percentage of coupon, and the discounted cash flow rule is then explored via Monte Carlo simulation.

  13. Valuation of Indonesian catastrophic earthquake bonds with generalized extreme value (GEV) distribution and Cox-Ingersoll-Ross (CIR) interest rate model

    Energy Technology Data Exchange (ETDEWEB)

    Gunardi,; Setiawan, Ezra Putranda [Mathematics Department, Gadjah Mada University (Indonesia)

    2015-12-22

    Indonesia is a country with high risk of earthquake, because of its position on the border of the earth’s tectonic plates. An earthquake can cause a very high amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring the risk of earthquake from the government or the (reinsurance company, so that it can collect enough money for implementing the rehabilitation and reconstruction program. One such mechanism is issuing a catastrophe bond, ‘act-of-God bond’, or simply CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is combined with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investor is discounted or stopped, and the cash flow is paid to the sponsor company to compensate its loss from the catastrophe event. When we consider earthquakes only, the amount of the discounted cash flow can be determined based on the earthquake’s magnitude. A case study with Indonesian earthquake magnitude data shows that the probability distribution of maximum magnitude can be modeled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bond, namely zero-coupon bonds, ‘coupon only at risk’ bonds, and ‘principal and coupon at risk’ bonds. The relationship between the price of the catastrophe bond and the CIR model’s parameters, the GEV parameters, the percentage of coupon, and the discounted cash flow rule is then explored via Monte Carlo simulation.
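
    Records 12 and 13 describe the same pricing scheme, which can be sketched end to end with Monte Carlo: simulate CIR short rates for discounting, draw annual maximum magnitudes from a GEV distribution, and zero the principal when the trigger is exceeded. All parameter values below are illustrative assumptions, not the paper's calibration:

```python
import numpy as np
from scipy.stats import genextreme

# Monte Carlo pricing sketch for a zero-coupon CAT bond: principal is lost
# if the annual maximum earthquake magnitude exceeds a trigger before
# maturity; discounting uses path-wise CIR short rates.
rng = np.random.default_rng(11)
n_paths, T, dt = 50_000, 3, 1 / 12                 # 3-year bond, monthly steps
kappa, mu_r, sig_r, r0 = 0.5, 0.05, 0.08, 0.04     # CIR: dr = kappa(mu-r)dt + sig*sqrt(r)dW
trigger, face = 7.5, 100.0

steps = int(T / dt)
r = np.full(n_paths, r0)
integral_r = np.zeros(n_paths)
for _ in range(steps):                             # Euler scheme, reflected at zero
    integral_r += r * dt
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    r = np.abs(r + kappa * (mu_r - r) * dt
               + sig_r * np.sqrt(np.maximum(r, 0)) * dW)

# Annual maximum magnitudes ~ GEV (note scipy's c = -xi sign convention).
max_mag = genextreme.rvs(c=-0.1, loc=6.0, scale=0.4,
                         size=(n_paths, T), random_state=rng)
triggered = (max_mag > trigger).any(axis=1)

payoff = np.where(triggered, 0.0, face)            # principal fully at risk
price = np.mean(np.exp(-integral_r) * payoff)
print(f"P(trigger) = {triggered.mean():.3f}, zero-coupon CAT bond price = {price:.2f}")
```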

  14. Uncertainty analysis of atmospheric deposition simulation of radiocesium and radioiodine from Fukushima Daiichi Nuclear Power Plant

    Science.gov (United States)

    Morino, Yu; Ohara, Toshimasa; Yumimoto, Keiya

    2014-05-01

    Chemical transport models (CTMs) played key roles in understanding the atmospheric behavior and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant (FDNPP) after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. In this study, we assessed uncertainties of atmospheric simulation by comparing observed and simulated deposition of radiocesium (137Cs) and radioiodine (131I). Airborne monitoring survey data were used to assess the model performance for 137Cs deposition patterns. We found that simulation using emissions estimated with a regional-scale (~500 km) CTM better reproduced the observed 137Cs deposition pattern in eastern Japan than simulation using emissions estimated with a local-scale (~50 km) or global-scale CTM. In addition, we estimated the emission amount of 137Cs from FDNPP by combining a CTM, an a priori source term, and observed deposition data. This is the first use of airborne survey data of 137Cs deposition (more than 16,000 data points) as the observational constraints in inverse modeling. The model simulation driven by the a posteriori source term achieved better agreement with 137Cs deposition measured by aircraft survey and at in-situ stations over eastern Japan. The wet deposition module was also evaluated. Simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed 137Cs deposition rates in high-deposition areas (≥10 kBq m-2) within one order of magnitude. Recently, a 131I deposition map was released and helped to evaluate the model performance for 131I deposition patterns. The observed 131I/137Cs deposition ratio is higher in areas southwest of FDNPP than northwest of FDNPP, and this behavior was roughly reproduced by a CTM if we assume that released 131I is more in gas phase
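
    The inverse-modeling step, estimating a posteriori release rates so that simulated deposition matches the survey data, can be caricatured with a linear source-receptor model. The matrix and data below are synthetic; in the study they would come from CTM runs and the 137Cs observations:

```python
import numpy as np
from scipy.optimize import nnls

# Toy inverse source estimation: the source-receptor matrix M gives the
# deposition at each observation point per unit emission in each release
# interval; release rates are recovered from noisy "observed" deposition.
rng = np.random.default_rng(15)

n_obs, n_src = 500, 10
M = rng.gamma(1.0, 1.0, (n_obs, n_src))              # source-receptor sensitivities
x_true = rng.uniform(0.5, 2.0, n_src)                # "true" 137Cs release rates
obs = (M @ x_true) * rng.lognormal(0.0, 0.3, n_obs)  # deposition with noise

# Non-negative least squares keeps the a posteriori release rates physical.
x_post, _ = nnls(M, obs)

for k in range(n_src):
    print(f"interval {k}: estimated {x_post[k]:.2f}, true {x_true[k]:.2f}")
```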

  15. Catastrophic household expenditure on health in Nepal: a cross-sectional survey.

    Science.gov (United States)

    Saito, Eiko; Gilmour, Stuart; Rahman, Md Mizanur; Gautam, Ghan Shyam; Shrestha, Pradeep Krishna; Shibuya, Kenji

    2014-10-01

    To determine the incidence of - and illnesses commonly associated with - catastrophic household expenditure on health in Nepal. We did a cross-sectional population-based survey in five municipalities of Kathmandu Valley between November 2011 and January 2012. For each household surveyed, out-of-pocket spending on health in the previous 30 days that exceeded 10% of the household's total expenditure over the same period was considered to be catastrophic. We estimated the incidence and intensity of catastrophic health expenditure. We identified the illnesses most commonly associated with such expenditure using a Poisson regression model and assessed the distribution of expenditure by economic quintile of households using the concentration index. Overall, 284 of the 1997 households studied in Kathmandu, i.e. 13.8% after adjustment by sampling weight, reported catastrophic health expenditure in the 30 days before the survey. After adjusting for confounders, this expenditure was found to be associated with injuries, particularly those resulting from road traffic accidents. Catastrophic expenditure by households in the poorest quintile was associated with at least one episode of diabetes, asthma or heart disease. In an urban area of Nepal, catastrophic household expenditure on health was mostly associated with injuries and noncommunicable diseases such as diabetes and asthma. Throughout Nepal, interventions for the control and management of noncommunicable diseases and the prevention of road traffic accidents should be promoted. A phased introduction of health insurance should also reduce the incidence of catastrophic household expenditure.

  16. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    Science.gov (United States)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic modeling oriented analysis of the gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications on drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing dataset to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of the observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a higher appeal for anthropogenic water storage and irrigation systems.

  17. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Science.gov (United States)

    Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred

    2018-01-01

    Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.

  18. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Directory of Open Access Journals (Sweden)

    R. Raj

    2018-01-01

    Full Text Available Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash–Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.

  19. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    Science.gov (United States)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize, on the fly, where exploration is performed and the degree of temperature acceleration, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
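
    One way to make the Bayesian uncertainty quantification concrete is a Dirichlet posterior over jump probabilities estimated from transition counts, propagated to a derived quantity such as the expected residence time before absorption. The counts and rates below are invented; this is a sketch of the idea, not the paper's estimator:

```python
import numpy as np

# Dirichlet-posterior sketch for a 3-state Markov jump model built from
# (synthetic) transition counts; uncertainty in the counts propagates to
# error bars on the expected residence time before reaching state 2.
rng = np.random.default_rng(5)

counts = np.array([[0, 40, 2],          # counts[i, j]: observed jumps i -> j
                   [35, 0, 5],
                   [1, 6, 0]])
rates_out = np.array([1.0, 0.8, 0.1])   # assumed escape rates per state (1/ns)

n_draws, tau_abs = 2_000, []
for _ in range(n_draws):
    # Posterior sample of jump probabilities: Dirichlet(counts + 1) per row.
    P = np.array([rng.dirichlet(row + 1.0) for row in counts])
    # Mean time to absorption in state 2 starting from state 0, solving
    # (I - Q) tau = dwell over the transient states {0, 1}.
    Q = P[:2, :2]
    dwell = 1.0 / rates_out[:2]         # mean dwell time per visit
    tau = np.linalg.solve(np.eye(2) - Q, dwell)
    tau_abs.append(tau[0])

tau_abs = np.array(tau_abs)
print(f"residence time before reaching state 2: "
      f"{tau_abs.mean():.2f} +/- {tau_abs.std():.2f} ns")
```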

  20. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Science.gov (United States)

    Rivera, Diego; Rivas, Yessica; Godoy, Alex

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs from a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. Then, we analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range for the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as the lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
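
    A compact GLUE sketch, using an invented one-parameter runoff model, shows the mechanics referred to above: sample parameters from a prior, score simulations with a likelihood measure (here Nash-Sutcliffe efficiency), retain behavioural sets above a subjective threshold, and report prediction bounds from the behavioural ensemble:

```python
import numpy as np

# GLUE sketch on an invented one-parameter runoff model: keep "behavioural"
# parameter sets above a likelihood threshold and form uncertainty bounds
# from the behavioural ensemble of simulations.
rng = np.random.default_rng(9)

precip = rng.gamma(2.0, 10.0, 120)            # synthetic monthly precipitation

def model(p, k):                              # toy runoff model: Q = k * P
    return k * p

obs = model(precip, 0.45) + rng.normal(0, 4.0, precip.size)

k_samples = rng.uniform(0.0, 1.0, 5_000)      # sample the prior parameter range
sims = np.array([model(precip, k) for k in k_samples])
nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()

behavioural = nse > 0.6                       # subjective GLUE acceptance threshold
print(f"{behavioural.sum()} behavioural sets, k in "
      f"[{k_samples[behavioural].min():.2f}, {k_samples[behavioural].max():.2f}]")

# 5-95% prediction bounds at each time step from the behavioural ensemble.
lo, hi = np.quantile(sims[behavioural], [0.05, 0.95], axis=0)
print(f"mean uncertainty bound width: {np.mean(hi - lo):.1f}")
```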

  1. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    Science.gov (United States)

    Wells, J. R.; Kim, J. B.

    2011-12-01

    Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not do a full exploration of the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from published literature, and often a parameter value is estimated from a single published value. Further, the parameters are "tuned" using somewhat arbitrary, "trial-and-error" methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from published literature, and where those were not available, using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
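
    The search strategy itself is easy to sketch. Below, simulated annealing explores a bounded three-parameter space against a stand-in objective; in the study the objective would be the accuracy of the simulated vegetation distribution map, which a toy quadratic score replaces here:

```python
import numpy as np

# Simulated annealing over a bounded parameter vector with a stand-in
# objective; a DGVM run and map-accuracy score would take its place.
rng = np.random.default_rng(13)

lower, upper = np.array([0.0, 0.1, 10.0]), np.array([1.0, 5.0, 90.0])
target = np.array([0.3, 2.0, 55.0])                 # pretend "true" parameters

def accuracy(theta):                                # higher is better
    return -np.sum(((theta - target) / (upper - lower)) ** 2)

theta = lower + rng.uniform(size=3) * (upper - lower)
best, best_score = theta.copy(), accuracy(theta)
T = 1.0
for step in range(5_000):
    cand = np.clip(theta + 0.05 * (upper - lower) * rng.standard_normal(3),
                   lower, upper)                    # bounded random move
    d = accuracy(cand) - accuracy(theta)
    if d > 0 or rng.uniform() < np.exp(d / T):      # Metropolis acceptance
        theta = cand
    if accuracy(theta) > best_score:
        best, best_score = theta.copy(), accuracy(theta)
    T *= 0.999                                      # geometric cooling schedule

print("best parameters:", np.round(best, 2), "score:", round(best_score, 4))
```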

  2. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty? How do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, then explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received
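
    The first question's two metrics, the expected value of (perfect) information and the expected cost of ignoring uncertainty, can be illustrated on a toy two-stage capacity problem with invented numbers, by comparing the recourse solution (RP), the plan optimized for the mean scenario (EEV), and the wait-and-see solution (WS):

```python
import numpy as np

# Toy capacity planning under demand uncertainty, illustrating EVPI and the
# expected cost of ignoring uncertainty (ECIU, a.k.a. value of the
# stochastic solution). cost(x, d) = build cost + shortage penalty.
scenarios = np.array([80.0, 100.0, 140.0])     # demand outcomes
probs = np.array([0.3, 0.5, 0.2])
build, penalty = 1.0, 4.0

def cost(x, d):
    return build * x + penalty * np.maximum(d - x, 0.0)

exp_cost = lambda x: np.dot(probs, cost(x, scenarios))
grid = np.linspace(0, 200, 2001)

# RP: recourse problem, hedging against all scenarios at once.
x_rp = grid[np.argmin([exp_cost(x) for x in grid])]
# EEV: optimize for the mean scenario, then face the true distribution.
x_ev = scenarios @ probs
# WS: wait-and-see, perfect foresight in every scenario.
ws = np.dot(probs, [cost(d, d) for d in scenarios])

print(f"RP (plan under uncertainty): x={x_rp:.0f}, cost={exp_cost(x_rp):.1f}")
print(f"ECIU (cost of ignoring uncertainty): {exp_cost(x_ev) - exp_cost(x_rp):.1f}")
print(f"EVPI (value of perfect information): {exp_cost(x_rp) - ws:.1f}")
```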

  3. Analyzing climate change impacts on water resources under uncertainty using an integrated simulation-optimization approach

    Science.gov (United States)

    Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.

    2018-01-01

    An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of meteorological projections disclose that incremental trends of temperature (e.g., minimum and maximum values) and precipitation exist. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including target and shortage); (ii) water shortage would be enhanced from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates are. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and helping identify appropriate water resources management strategies hedging against drought.

  4. The Effectiveness of Catastrophe Bonds in Portfolio Diversification

    OpenAIRE

    Mariani, Massimo; Amoruso, Paola

    2016-01-01

    The rapid growth of catastrophe bonds in financial markets is due to increasing environmental disasters and the consequent economic losses, barely covered by insurance and reinsurance companies. These securities represent an effective solution, allowing the transfer of risk to the capital market. The objective of this paper is to demonstrate the real advantages for an investor who operates in this market segment, in terms of portfolio diversification. The present work indeed shows how investing in catastrophe...

  5. Purchase of Catastrophe Insurance by Dutch Dairy and Arable Farmers

    NARCIS (Netherlands)

    Ogurtsov, V.; Asseldonk, van M.A.P.M.; Huirne, R.B.M.

    2009-01-01

    This article analyzed the impact of risk perception, risk attitude, and other farmer personal and farm characteristics on the actual purchase of catastrophe insurance by Dutch dairy and arable farmers. The specific catastrophe insurance types considered were hail–fire–storm insurance for buildings,

  6. Effects of Uncertainties in Electric Field Boundary Conditions for Ring Current Simulations

    Science.gov (United States)

    Chen, Margaret W.; O'Brien, T. Paul; Lemon, Colby L.; Guild, Timothy B.

    2018-01-01

    Physics-based simulation results can vary widely depending on the applied boundary conditions. As a first step toward assessing the effect of boundary conditions on ring current simulations, we analyze the uncertainty of cross-polar cap potentials (CPCP) in the electric field boundary conditions applied to the Rice Convection Model-Equilibrium (RCM-E). The empirical Weimer model of CPCP is chosen as the reference model and Defense Meteorological Satellite Program CPCP measurements as the reference data. Using temporal correlations from a statistical analysis of the "errors" between the reference model and data, we construct a Monte Carlo CPCP discrete time series model that can be generalized to other model boundary conditions. RCM-E simulations using electric field boundary conditions from the reference model and from 20 randomly generated Monte Carlo discrete time series of CPCP are performed for two large storms. During the 10 August 2000 storm main phase, the proton density at 10 RE at midnight was observed to be low (...), and the observed Dst index is bounded by the simulated Dst values. In contrast, the simulated Dst values during the recovery phases of the 10 August 2000 and 31 August 2005 storms tend to systematically underestimate the observed late Dst recovery. This suggests a need to improve the accuracy of particle loss calculations in the RCM-E model. Application of this technique can help modelers make efficient choices between investing more effort in improving the specification of boundary conditions and improving descriptions of physical processes.
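
    The core device, a temporally correlated Monte Carlo time series for a boundary-condition quantity, can be sketched with a first-order autoregressive error model added to a reference CPCP series. The storm shape, variance, and lag-1 correlation below are invented placeholders for the statistics estimated from the model-data "errors":

```python
import numpy as np

# AR(1) Monte Carlo perturbations of a reference cross-polar cap potential
# (CPCP) series; each member would drive one ring current simulation, so the
# spread of simulated outputs reflects boundary-condition uncertainty.
rng = np.random.default_rng(17)

t = np.arange(0, 48, 0.25)                           # 48 h at 15-min cadence
cpcp_ref = 60 + 40 * np.exp(-((t - 20) / 6) ** 2)    # invented storm-time CPCP (kV)

phi = 0.9                                            # lag-1 autocorrelation of errors
sigma = 15.0                                         # stationary error std dev (kV)
n_members = 20

eps = np.zeros((n_members, t.size))
for k in range(1, t.size):
    eps[:, k] = (phi * eps[:, k - 1]
                 + np.sqrt(1 - phi**2) * sigma * rng.standard_normal(n_members))

ensemble = np.clip(cpcp_ref + eps, 0.0, None)        # CPCP cannot be negative
print("ensemble spread at storm peak: "
      f"{ensemble[:, np.argmax(cpcp_ref)].std():.1f} kV")
```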

  7. On the predictivity of pore-scale simulations: estimating uncertainties with multilevel Monte Carlo

    KAUST Repository

    Icardi, Matteo

    2016-02-08

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can be, in fact, hindered by many factors including sample heterogeneity, computational and imaging limitations, model inadequacy and not perfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). The stochastic nature can arise due to the multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can reduce drastically the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest, under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A totally automatic workflow is developed in an open-source code [2015. https://bitbucket.org/micardi/porescalemc.] that includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers
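
    A stripped-down multilevel Monte Carlo estimator conveys the cost-reduction idea: coarse levels absorb most of the sampling work, while coupled coarse/fine pairs estimate small corrections. The example estimates a simple SDE expectation rather than a pore-scale quantity, and the per-level sample sizes are fixed for brevity (in practice they would be chosen from estimated level variances):

```python
import numpy as np

# Multilevel Monte Carlo for E[S_T] of a geometric Brownian motion via
# Euler-Maruyama; coarse and fine paths share the same noise so that the
# level corrections have small variance (telescoping-sum estimator).
rng = np.random.default_rng(21)
S0, mu, sig, T = 1.0, 0.05, 0.2, 1.0

def euler_pair(level, n):
    """Return level-0 payoffs, or (fine - coarse) corrections for level > 0."""
    nf = 2 ** level
    dt = T / nf
    dW = np.sqrt(dt) * rng.standard_normal((n, nf))
    Sf = np.full(n, S0)
    for k in range(nf):                           # fine path, step dt
        Sf = Sf * (1 + mu * dt + sig * dW[:, k])
    if level == 0:
        return Sf                                 # plain Monte Carlo term
    Sc = np.full(n, S0)
    for k in range(nf // 2):                      # coarse path, step 2*dt,
        Sc = Sc * (1 + mu * 2 * dt               # driven by the summed noise
                   + sig * (dW[:, 2 * k] + dW[:, 2 * k + 1]))
    return Sf - Sc

n_per_level = [40_000, 20_000, 10_000, 5_000, 2_500]
estimate = sum(euler_pair(l, n).mean() for l, n in enumerate(n_per_level))
print(f"MLMC estimate of E[S_T]: {estimate:.4f} (exact {S0 * np.exp(mu * T):.4f})")
```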

  8. Uncertainty estimation and risk prediction in air quality

    International Nuclear Information System (INIS)

    Garaud, Damien

    2011-01-01

    This work is about uncertainty estimation and risk prediction in air quality. Firstly, we build a multi-model ensemble of air quality simulations which can take into account all uncertainty sources related to air quality modeling. Ensembles of photochemical simulations at continental and regional scales are automatically generated. Then, these ensembles are calibrated with a combinatorial optimization method. It selects a sub-ensemble which is representative of the uncertainty or shows good resolution and reliability for probabilistic forecasting. This work shows that it is possible to estimate and forecast uncertainty fields related to ozone and nitrogen dioxide concentrations, and to improve the reliability of threshold exceedance predictions. The approach is compared with Monte Carlo simulations, calibrated or not. The Monte Carlo approach appears to be less representative of the uncertainties than the multi-model approach. Finally, we quantify the observational error, the representativeness error and the modeling errors. The work is applied to the impact of thermal power plants, in order to quantify the uncertainty on the impact estimates. (author) [fr]

  9. Pediatric catastrophic antiphospholipid syndrome: descriptive analysis of 45 patients from the "CAPS Registry".

    Science.gov (United States)

    Berman, Horacio; Rodríguez-Pintó, Ignasi; Cervera, Ricard; Gregory, Simone; de Meis, Ernesto; Rodrigues, Carlos Ewerton Maia; Aikawa, Nádia Emi; de Carvalho, Jozélio Freire; Springer, Janusz; Niedzwiecki, Maciej; Espinosa, Gerard

    2014-02-01

    Given the lack of information about catastrophic antiphospholipid syndrome (APS) in pediatric patients, the objective of the current study was to describe the clinical characteristics, laboratory features, treatment, and outcome of pediatric patients with catastrophic APS and compare them with adult patients with catastrophic APS. We identified patients who were under 18 years of age at the time of catastrophic APS diagnosis included in the international registry of patients with catastrophic APS (CAPS Registry). Their main demographic and clinical characteristics, laboratory features, treatment, and outcome were described and compared with those of adult patients with catastrophic APS. Of the 446 patients included in the CAPS Registry as of May 2013, 45 (10.3%) patients developed 46 catastrophic events before 18 years of age (one patient presented two episodes). Overall, 32 (71.1%) patients were female and the mean age was 11.5 ± 4.6 years (range, 3 months-18 years). A total of 31 (68.9%) patients suffered from primary APS and 13 (28.9%) from systemic lupus erythematosus (SLE). The main differences between the two groups of patients were the higher prevalence of infections as a precipitating factor for the catastrophic event in the pediatric population (60.9% versus 26.8% in the adult population, p<0.001) and of peripheral vessel thrombosis (52.2% versus 34.3%, p=0.017). In addition, catastrophic APS was the first manifestation of APS more frequently in pediatric patients (86.6% versus 45.2%, p<0.001). Interestingly, pediatric patients showed a trend toward lower mortality, although the difference was not statistically significant (26.1% versus 40.2%; odds ratio, 1.9; 95% confidence interval, 0.96-3.79; p=0.063). No differences were found in either the laboratory features or the isolated or combination treatments between groups. Catastrophic APS in pediatric patients is a rare disease. There are minimal differences in the clinical and laboratory features, treatment, and

  10. Protocols of a catastrophe

    International Nuclear Information System (INIS)

    Stscherbak, J.

    1988-01-01

    In unusually frank terms the author, a journalist and epidemiologist, describes the catastrophe of Chernobyl as the 'most pathetic and important' experience of the Soviet people after World War II. Documents, interviews and statements of persons concerned trace the disaster of those days that surpasses imagination and describe how individual persons witnessed the coming true of visions of terror. (orig./HSCH) [de]

  11. Cosmic Catastrophes

    Science.gov (United States)

    Wheeler, J. Craig

    2014-08-01

    Preface; 1. Setting the stage: star formation and hydrogen burning in single stars; 2. Stellar death: the inexorable grip of gravity; 3. Dancing with stars: binary stellar evolution; 4. Accretion disks: flat stars; 5. White Dwarfs: quantum dots; 6. Supernovae: stellar catastrophes; 7. Supernova 1987A: lessons and enigmas; 8. Neutron stars: atoms with attitude; 9. Black holes in theory: into the abyss; 10. Black holes in fact: exploring the reality; 11. Gamma-ray bursts, black holes and the universe: long, long ago and far, far away; 12. Supernovae and the universe; 13. Worm holes and time machines: tunnels in space and time; 14. Beyond: the frontiers; Index.

  12. Understanding Climate Uncertainty with an Ocean Focus

    Science.gov (United States)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth's climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent errors in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today's models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in

  13. A Study on Data Base for the Pyroprocessing Material Flow and MUF Uncertainty Simulation

    International Nuclear Information System (INIS)

    Sitompul, Yos Panagaman; Shin, Heesung; Han, Boyoung; Kim, Hodong

    2011-01-01

    The data base for the pyroprocessing material flow and MUF uncertainty simulation has been implemented well. There are no errors in the data base processing, and it is relatively fast using OLEDB and MySQL. The important issue is the data base size. In OLEDB the data base size is limited to 2 Gb. To reduce the data base size, we give users an option to filter the input nuclides based on their masses and activities. A simulation program called PYMUS has been developed to study the pyroprocessing material flow and MUF. In the program, there is a data base system that controls the data processing in the simulation. The data base system consists of an input data base, data processing, and an output data base. The data base system has been designed in such a way as to be efficient. One example is the use of OLEDB and MySQL. The data base system is explained in detail in this paper. The results show that the data base system works well in the simulation.

  14. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...

  15. Simulation and uncertainties of the heat transfer from a heat-generating DEBRIS bed in the lower plenum

    International Nuclear Information System (INIS)

    Schaaf, K.; Trambauer, K.

    1999-01-01

    The findings of the TMI-2 post-accident analyses indicated that internal cooling mechanisms may have a considerable potential to sustain the vessel integrity after a relocation of core material to the lower plenum, provided that water is continuously available in the RPV. Numerous analytical and experimental research activities are currently underway in this respect. This paper illustrates some major findings of the experimental work on internal cooling mechanisms and describes the limitations and the uncertainties in the simulation of the heat transfer processes. Reference is made especially to the joint German DEBRIS/RPV research program, which encompasses the experimental investigation of the thermal-hydraulics in gaps, of the heat transfer within a particulate debris bed, and of the high temperature performance of vessel steel, as well as the development of simulation models for the heat transfer in the lower head and the structural response of the RPV. In particular, the results of uncertainty and sensitivity analyses are presented, which have been carried out at GRS using an integral model that describes the major phenomena governing the long-term integrity of the reactor vessel. The investigation of a large-scale relocation indicated that the verification of a gap cooling mechanism as an inherent mechanism is questionable in terms of a stringent probabilistic uncertainty criterion, as long as the formation of a large molten pool cannot be excluded. (author)

  16. An exploration of the option space in student design projects for uncertainty and sensitivity analysis with performance simulation

    NARCIS (Netherlands)

    Struck, C.; Wilde, de P.J.C.J.; Hopfe, C.J.; Hensen, J.L.M.

    2008-01-01

    This paper describes research conducted to gather empirical evidence on extent, character and content of the option space in building design projects, from the perspective of a climate engineer using building performance simulation for concept evaluation. The goal is to support uncertainty analysis

  17. Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.

    Science.gov (United States)

    Kelly, David; Majda, Andrew J; Tong, Xin T

    2015-08-25

    The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
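
    For readers unfamiliar with the method, the sketch below shows one analysis step of a stochastic (perturbed-observation) ensemble Kalman filter in its generic textbook form; it is not the authors' forecast model, and all dimensions are illustrative.

```python
import numpy as np

def enkf_update(ensemble, y_obs, H, R, rng):
    """One stochastic EnKF analysis step.

    ensemble : (n_state, n_members) forecast ensemble
    y_obs    : (n_obs,) observation vector
    H        : (n_obs, n_state) linear observation operator
    R        : (n_obs, n_obs) observation-error covariance
    """
    n_members = ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    P = X @ X.T / (n_members - 1)                       # sample covariance
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                      # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent.
    perturbed = y_obs[:, None] + rng.multivariate_normal(
        np.zeros(len(y_obs)), R, n_members).T
    return ensemble + K @ (perturbed - H @ ensemble)

rng = np.random.default_rng(0)
ens = rng.normal(size=(3, 50))              # 3 state variables, 50 members
H = np.array([[1.0, 0.0, 0.0]])             # observe the first variable only
ens = enkf_update(ens, np.array([0.5]), H, np.array([[0.04]]), rng)
```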

  18. Uncertainty in simulating wheat yields under climate change: Letter

    NARCIS (Netherlands)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Supit, I.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic

  19. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research, an investigation is undertaken of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically, the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research, surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems.
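
    A minimal sketch of the belief and plausibility measures from evidence (Dempster-Shafer) theory, computed over interval-valued focal elements; the body of evidence below is hypothetical, not one of the Sandia challenge problems.

```python
def belief_plausibility(bpa, event):
    """bpa: dict mapping focal intervals (lo, hi) to basic probability mass.
    Returns (belief, plausibility) of the interval-valued event."""
    lo, hi = event
    bel = sum(m for (a, b), m in bpa.items() if lo <= a and b <= hi)  # subsets
    pl = sum(m for (a, b), m in bpa.items() if a <= hi and b >= lo)   # overlaps
    return bel, pl

# Expert evidence on an uncertain input, as interval-valued masses (sum to 1).
bpa = {(0.0, 0.4): 0.3, (0.2, 0.6): 0.5, (0.5, 1.0): 0.2}
print(belief_plausibility(bpa, (0.0, 0.6)))   # -> (0.8, 1.0)
```

    Belief counts only the evidence that necessarily supports the event, plausibility everything not contradicting it; the gap between the two is the epistemic uncertainty that makes these measures discontinuous in the design variables.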

  20. Community resilience and decision theory challenges for catastrophic events.

    Science.gov (United States)

    Cox, Louis Anthony

    2012-11-01

    Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact, and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance of catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events. © 2012 Society for Risk Analysis.

  1. Dynamical systems V bifurcation theory and catastrophe theory

    CERN Document Server

    1994-01-01

    Bifurcation theory and catastrophe theory are two of the best known areas within the field of dynamical systems. Both are studies of smooth systems, focusing on properties that seem to be manifestly non-smooth. Bifurcation theory is concerned with the sudden changes that occur in a system when one or more parameters are varied. Examples of such are familiar to students of differential equations, from phase portraits. Moreover, understanding the bifurcations of the differential equations that describe real physical systems provides important information about the behavior of the systems. Catastrophe theory became quite famous during the 1970's, mostly because of the sensation caused by the usually less than rigorous applications of its principal ideas to "hot topics", such as the characterization of personalities and the difference between a "genius" and a "maniac". Catastrophe theory is accurately described as singularity theory and its (genuine) applications. The authors of this book, the first printing of w...
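
    For concreteness, the canonical object shared by the two theories is the cusp catastrophe. In one standard formulation (a generic textbook form, not specific to this volume), the potential, its equilibrium condition and the bifurcation set are:

```latex
% Cusp catastrophe with control parameters (a, b) and state x
V(x;a,b) = \tfrac{1}{4}x^{4} + \tfrac{1}{2}\,a\,x^{2} + b\,x, \qquad
\frac{\partial V}{\partial x} = x^{3} + a x + b = 0, \qquad
\text{bifurcation set: } 4a^{3} + 27b^{2} = 0 .
```

    Crossing the bifurcation set, two equilibria merge and disappear, so the state jumps discontinuously to the remaining minimum; this jump is the elementary "catastrophe", and the same cubic arises as the normal form of the fold and cusp bifurcations studied in dynamical systems.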

  2. Effects of Cognitive-Behavioral Therapy (CBT) on Brain Connectivity Supporting Catastrophizing in Fibromyalgia.

    Science.gov (United States)

    Lazaridou, Asimina; Kim, Jieun; Cahalan, Christine M; Loggia, Marco L; Franceschelli, Olivia; Berna, Chantal; Schur, Peter; Napadow, Vitaly; Edwards, Robert R

    2017-03-01

    Fibromyalgia (FM) is a chronic, common pain disorder characterized by hyperalgesia. A key mechanism by which cognitive-behavioral therapy (CBT) fosters improvement in pain outcomes is via reductions in hyperalgesia and pain-related catastrophizing, a dysfunctional set of cognitive-emotional processes. However, the neural underpinnings of these CBT effects are unclear. Our aim was to assess CBT's effects on the brain circuitry underlying hyperalgesia in FM patients, and to explore the role of treatment-associated reduction in catastrophizing as a contributor to normalization of pain-relevant brain circuitry and clinical improvement. In total, 16 high-catastrophizing FM patients were enrolled in the study and randomized to 4 weeks of individual treatment with either CBT or a Fibromyalgia Education (control) condition. Resting-state functional magnetic resonance imaging scans evaluated functional connectivity between key pain-processing brain regions at baseline and posttreatment. Clinical outcomes were assessed at baseline, posttreatment, and 6-month follow-up. Catastrophizing correlated with increased resting-state functional connectivity between S1 and anterior insula. The CBT group showed larger reductions (compared with the education group) in catastrophizing at posttreatment, and CBT produced significant reductions in both pain and catastrophizing at the 6-month follow-up. The CBT group also showed reduced resting-state connectivity between S1 and anterior/medial insula at posttreatment; these reductions in resting-state connectivity were associated with concurrent treatment-related reductions in catastrophizing. The results add to the growing support for the clinically important associations between S1-insula connectivity, clinical pain, and catastrophizing, and suggest that CBT may, in part via reductions in catastrophizing, help to normalize pain-related brain responses in FM.

  3. Catastrophic antiphospholipid syndrome mimicking a malignant pancreatic tumour--a case report

    NARCIS (Netherlands)

    van Wissen, S.; Bastiaansen, B. A. J.; Stroobants, A. K.; van den Dool, E. J.; Idu, M. M.; Levi, M. [=Marcel M.; Stroes, E. S. G.

    2008-01-01

    The catastrophic antiphospholipid syndrome is characterised by rapid onset thromboses, often resistant to conventional anticoagulant treatment, and resulting in life threatening multiple organ dysfunction. The diagnosis of catastrophic antiphospholipid syndrome may be difficult, predominantly due to

  4. How model and input uncertainty impact maize yield simulations in West Africa

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli

    2015-02-01

    Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties however exist, not only in the model design and model parameters, but also, and maybe even more importantly, in soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models' response to different levels of input information, from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models' ability to represent the observed spatial variability (between locations) and temporal variability (between years) in crop yields. We found that the resolution of different soil, climate and management information influences the simulated crop yields in both models. However, the difference between models is larger than that between input datasets, and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well by both models, even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors, e.g. investment in agricultural research and development between 1987 and 1991 in Burkina Faso, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.

  5. Icarus's discovery: Acting on global climate change in the face of uncertainty

    International Nuclear Information System (INIS)

    Brooks, D.G.; Maracas, K.B.; Hayslip, R.M.

    1994-01-01

    The mythological character Icarus had the misfortune of learning the consequences of his decision to fly too near the sun at the same moment he carried that decision out. Although Daedalus tried to reduce the uncertainties of his son's decision by warning Icarus of the possible outcome, Icarus had no empirical knowledge of what would actually happen until his waxen wings melted and he fell to the sea. Like Icarus, man has no empirical knowledge or conclusive evidence today of the possible effects of global climate change. And though the consequences of policy decisions toward global climate change may not be as catastrophic as falling into the sea, the social and economic impacts of those decisions will be substantial. There are broad uncertainties related to the scientific and ecological aspects of global climate change. But clearly the ''politics'' of global climate change issues are moving at a faster rate than the science. There is a public outcry for action now, in the face of uncertainty. This paper profiles a case study of a southwestern utility's use of multi-attribute preference theory to reduce uncertainties and analyze its options for addressing global climate change issues.

  6. Tackling The Global Challenge: Humanitarian Catastrophes

    Directory of Open Access Journals (Sweden)

    Kenneth V. Iserson

    2014-03-01

    “Humanitarian catastrophes,” conflicts and calamities generating both widespread human suffering and destructive events, require a wide range of emergency resources. This paper answers a number of questions that humanitarian catastrophes generate: Why and how do the most-developed countries (those with the resources, capabilities, and willingness to help) intervene in specific types of disasters? What ethical and legal guidelines shape our interventions? How well do we achieve our goals? It then suggests a number of changes to improve humanitarian responses, including better NGO-government cooperation, increased research on the best disaster response methods, clarification of the criteria and roles for humanitarian (military) interventions, and development of post-2015 Millennium Development Goals with more accurate progress measures. [West J Emerg Med. 2014;15(2):231–240.]

  7. Predictive uncertainty in auditory sequence processing

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Pearce, Marcus T

    2014-01-01

    in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models...

  8. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances of 232Th. Simulation, quadrature and polynomial chaos methods are used, and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
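
    The simulation route described can be sketched as follows: sample beta-distributed line widths and push them through an escape-probability model. The surrogate formula, nominal widths and +/-10% sampling range below are illustrative stand-ins, not the paper's actual resonance treatment.

```python
import numpy as np

rng = np.random.default_rng(2)

gamma_n0, gamma_g0 = 2.0e-3, 25.0e-3    # nominal widths in eV (illustrative)

def escape_probability(gamma_n, gamma_g):
    # Toy surrogate: escape probability falls as the total width grows.
    return np.exp(-40.0 * (gamma_n + gamma_g))

n = 50_000
def sample(x0):
    # Beta(2, 2) distribution rescaled onto [0.9, 1.1] * nominal value.
    return x0 * (0.9 + 0.2 * rng.beta(2.0, 2.0, n))

p = escape_probability(sample(gamma_n0), sample(gamma_g0))
print(f"p mean = {p.mean():.5f}, std = {p.std():.2e}")
```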

  9. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performance for complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with the possibility of modeling stochastic dependence via copula theory and stochastic processes), through the uncertainty propagation step (with innovative simulation algorithms such as the ziggurat method for normal variables), to the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also enables the construction of response surfaces that can include the stochastic modeling (for example, with the polynomial chaos method). Generic wrappers to link OpenTURNS to modeling software are provided. Finally, OpenTURNS is extensively documented, providing guidance for both use and contribution.
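
    A minimal quantification-and-propagation example using the OpenTURNS Python API is sketched below. The two-input model is a placeholder, and while ComposedDistribution, PythonFunction and the sample methods are standard OpenTURNS calls, class names can differ between library versions.

```python
import openturns as ot

# Quantification step: joint distribution of the two uncertain inputs.
X = ot.ComposedDistribution([ot.Normal(0.0, 1.0), ot.Uniform(1.0, 2.0)])

# Wrap the simulation code (here a toy expression) as a function of the inputs.
model = ot.PythonFunction(2, 1, lambda x: [x[0] ** 2 + x[1]])

# Propagation step: plain Monte Carlo sampling through the model.
sample_out = model(X.getSample(10_000))
print("mean:", sample_out.computeMean())
print("95% quantile:", sample_out.computeQuantile(0.95))
```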

  10. Developing an Agent-Based Simulation System for Post-Earthquake Operations in Uncertainty Conditions: A Proposed Method for Collaboration among Agents

    Directory of Open Access Journals (Sweden)

    Navid Hooshangi

    2018-01-01

    Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model for post-earthquake USAR operations using geospatial information systems (GIS) and multi-agent systems (MAS). We also propose an approach for dynamic task allocation and for establishing collaboration among agents, based on the contract net protocol (CNP) and the interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which considers uncertainty in natural-hazards information during agents' decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, an earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran's District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and complete vulnerability classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the plain CNP method in terms of USAR operational time (at least a 13% decrease) and the number of human fatalities (at least a 9% decrease). In an interval uncertainty analysis of the proposed simulated system, the lower and upper bounds of the uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in a disaster environment. Such systems can be used to manage and prepare for natural hazards.
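
    A crisp (point-valued) TOPSIS ranking is sketched below as a simplification of the interval-based variant used in the paper; the teams, criteria and AHP-style weights are hypothetical.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns)."""
    v = matrix / np.linalg.norm(matrix, axis=0) * weights  # weighted, normalized
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)            # closeness to ideal, in [0, 1]

# Hypothetical USAR task allocation: three rescue teams scored on
# travel time (cost), capability (benefit) and fatigue (cost).
scores = np.array([[10.0, 0.8, 0.3],
                   [15.0, 0.9, 0.5],
                   [ 8.0, 0.6, 0.2]])
weights = np.array([0.5, 0.3, 0.2])           # e.g. derived via AHP
print(topsis(scores, weights, benefit=np.array([False, True, False])))
```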

  11. Financing Losses from Catastrophic Risks

    Science.gov (United States)

    2008-11-01

    ... often held in the form of bonds, the interest on which is subject to corporate income tax, which reduces the net earnings to each insurer’s shareholders ... course; it is a basic feature of the corporate income tax. But, as explained above, catastrophe insurance is distinguished from other types of ...

  12. Combining historical eyewitness accounts on tsunami-induced waves and numerical simulations for getting insights in uncertainty of source parameters

    Science.gov (United States)

    Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; Benki, Aalae

    2017-04-01

    Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and much damage to structures. Advances in the numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be done by taking advantage of observations either of tsunami waves (using networks of water-level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like 1755 Lisbon. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea-bottom pressure gauges, GPS-mounted buoys), tide gauges can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., the maximum wave height reached at the coast) instead of the full observed waveforms. 2) Tsunami phenomena involve a large span of spatial scales (from ocean-basin scales to local coastal wave interactions), which can make the modelling very demanding: the computational cost of tsunami simulation can be prohibitive, often reaching several hours. This often limits the number of affordable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both of the aforementioned difficulties with a view to combining historical observations of past tsunami-induced waves

  13. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2 O emissions.

    Science.gov (United States)

    Ehrhardt, Fiona; Soussana, Jean-François; Bellocchi, Gianni; Grace, Peter; McAuliffe, Russel; Recous, Sylvie; Sándor, Renáta; Smith, Pete; Snow, Val; de Antoni Migliorati, Massimiliano; Basso, Bruno; Bhatia, Arti; Brilli, Lorenzo; Doltra, Jordi; Dorich, Christopher D; Doro, Luca; Fitton, Nuala; Giacomini, Sandro J; Grant, Brian; Harrison, Matthew T; Jones, Stephanie K; Kirschbaum, Miko U F; Klumpp, Katja; Laville, Patricia; Léonard, Joël; Liebig, Mark; Lieffering, Mark; Martin, Raphaël; Massad, Raia S; Meier, Elizabeth; Merbold, Lutz; Moore, Andrew D; Myrgiotis, Vasileios; Newton, Paul; Pattey, Elizabeth; Rolinski, Susanne; Sharp, Joanna; Smith, Ward N; Wu, Lianhai; Zhang, Qing

    2018-02-01

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas emissions. However, the uncertainties of (reduced) model-ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation within multi-species agricultural contexts. We report an international model comparison and benchmarking exercise, showing the potential of multi-model ensembles to predict productivity and nitrous oxide (N2O) emissions for wheat, maize, rice and temperate grasslands. Using a multi-stage modelling protocol, from blind simulations (stage 1) to partial (stages 2-4) and full calibration (stage 5), 24 process-based biogeochemical models were assessed individually or as an ensemble against long-term experimental data from four temperate grassland and five arable crop rotation sites spanning four continents. Comparisons were performed by reference to the experimental uncertainties of observed yields and N2O emissions. Results showed that across sites and crop/grassland types, 23%-40% of the uncalibrated individual models were within two standard deviations (SD) of observed yields, while 42% (rice) to 96% (grasslands) of the models were within 1 SD of observed N2O emissions. At stage 1, ensembles formed by the three models with the lowest prediction errors predicted both yields and N2O emissions within experimental uncertainties for 44% and 33% of the crop and grassland growth cycles, respectively. Partial model calibration (stages 2-4) markedly reduced prediction errors of the full model ensemble E-median for crop grain yields (from 36% at stage 1 down to 4% on average) and grassland productivity (from 44% to 27%), and to a lesser and more variable extent for N2O emissions. Yield-scaled N2O emissions (N2O emissions divided by crop yields) were ranked accurately by three-model ensembles across crop species and field sites. The potential of using process-based model ensembles to predict jointly

  14. Catastrophizing and Depressive Symptoms as Prospective Predictors of Outcomes Following Total Knee Replacement

    Directory of Open Access Journals (Sweden)

    Robert R Edwards

    2009-01-01

    Several recent reports suggest that pain-related catastrophizing is a risk factor for poor acute pain outcomes following surgical interventions. However, it has been less clear whether levels of catastrophizing influence longer-term postoperative outcomes. Data were analyzed from a relatively small number (n=43) of patients who underwent total knee replacement and were followed for 12 months after their surgery. Previous research has suggested that high levels of both catastrophizing and depression are associated with elevated acute postoperative pain complaints among patients undergoing knee surgery. In this sample, catastrophizing and depression at each of the assessment points were studied as prospective predictors of pain (both global pain ratings and pain at night) at the subsequent assessment point over the course of one year. The predictive patterns differed somewhat across measures of pain reporting; depressive symptoms were unique predictors of greater global pain complaints, while catastrophizing was a specific and unique predictor of elevated nighttime pain. While surgical outcomes following total knee replacement are, on average, quite good, a significant minority of patients continue to experience long-term pain. The present findings suggest that high levels of catastrophizing and depression may promote enhanced pain levels, indicating that interventions designed to reduce catastrophizing and depressive symptoms may have the potential to further improve joint replacement outcomes.

  15. Future Simulated Intensification of Precipitation Extremes, CMIP5 Model Uncertainties and Dependencies

    Science.gov (United States)

    Bador, M.; Donat, M.; Geoffroy, O.; Alexander, L. V.

    2017-12-01

    Precipitation intensity during extreme events is expected to increase with climate change. Throughout the 21st century, CMIP5 climate models project a general increase in annual extreme precipitation in most regions. We investigate how robust this future increase is across different models, regions and seasons. We find that there is strong similarity in extreme precipitation changes between models that share atmospheric physics, reducing the ensemble of 27 models to 14 independent projections. We find that future simulated extreme precipitation increases in most models in the majority of land grid cells located in the dry, intermediate and wet regions according to each model's precipitation climatology. These increases significantly exceed the range of natural variability estimated from long equilibrium control runs. The intensification of extreme precipitation across the entire spectrum of dry to wet regions is particularly robust in the extra-tropics in both wet and dry season, whereas uncertainties are larger in the tropics. The CMIP5 ensemble therefore indicates robust future intensification of annual extreme rainfall in particular in extra-tropical regions. Generally, the CMIP5 robustness is higher during the dry season compared to the wet season and the annual scale, but inter-model uncertainties in the tropics remain important.

  16. The critical catastrophe revisited

    International Nuclear Information System (INIS)

    De Mulatier, Clélia; Rosso, Alberto; Dumonteil, Eric; Zoia, Andrea

    2015-01-01

    The neutron population in a prototype model of nuclear reactor can be described in terms of a collection of particles confined in a box and undergoing three key random mechanisms: diffusion, reproduction due to fissions, and death due to absorption events. When the reactor is operated at the critical point, and fissions are exactly compensated by absorptions, the whole neutron population might in principle go to extinction because of the wild fluctuations induced by births and deaths. This phenomenon, which has been named critical catastrophe, is nonetheless never observed in practice: feedback mechanisms acting on the total population, such as human intervention, have a stabilizing effect. In this work, we revisit the critical catastrophe by investigating the spatial behaviour of the fluctuations in a confined geometry. When the system is free to evolve, the neutrons may display a wild patchiness (clustering). On the contrary, imposing a population control on the total population acts also against the local fluctuations, and may thus inhibit the spatial clustering. The effectiveness of population control in quenching spatial fluctuations will be shown to depend on the competition between the mixing time of the neutrons (i.e. the average time taken for a particle to explore the finite viable space) and the extinction time
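
    A minimal sketch of the prototype model described (diffusion, branching and death at criticality, plus a population control that resamples to a fixed size); all rates and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

N, steps, box = 200, 1000, 1.0
x = rng.uniform(0.0, box, N)                  # initial neutron positions

for _ in range(steps):
    x = np.clip(x + 0.01 * rng.standard_normal(x.size), 0.0, box)  # diffusion
    u = rng.random(x.size)
    # With prob 0.1 branch (kept twice), prob 0.1 die, else survive: critical.
    x = np.concatenate([x[u < 0.1], x[u < 0.1], x[u >= 0.2]])
    # Population control: resample back to N particles, quenching the global
    # fluctuations that would otherwise drive the critical catastrophe.
    x = rng.choice(x, size=N, replace=True)

# Spatial clustering shows up as uneven occupancy of equal-width bins.
print(np.histogram(x, bins=10, range=(0.0, box))[0])
```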

  17. A comparison of regional and global catastrophic hazards associated with energy technologies

    International Nuclear Information System (INIS)

    Heising, C.D.; Inhaber, H.

    1983-01-01

    This paper reviews some of what is known about the relative catastrophic hazards, on both a regional and global level, of energy technologies, and proposes a logical framework for their comparison. A review of the Inhaber study results is made, indicating the relative position of overall nuclear-power-related risks. The paper then concentrates on describing the catastrophic and global hazards of energy technologies. Regionally catastrophic hazards include sabotage and other malicious human activities, in addition to severe accidents caused inadvertently by man, such as fires, reactor core damage events, chemical and poisonous gas releases, and fuel storage fires and explosions, among others. Global risks include such hazards as nuclear proliferation, CO2 build-up, oil shortages and possible national conflicts over dwindling energy fuels. The conclusion is drawn that consideration of both regional and global catastrophic risks must be made in making energy decisions, and that further study is necessary to better quantify and compare these risks. A simple decision-analytic framework for making energy decisions inclusive of catastrophic risk is proposed.

  18. Thermal catastrophe in the plasma sheet boundary layer

    International Nuclear Information System (INIS)

    Smith, R.A.; Goertz, C.K.; Grossmann, W.

    1986-01-01

    This letter presents a first step towards a substorm model including particle heating and transport in the plasma sheet boundary layer (PSBL). The heating mechanism discussed is resonant absorption of Alfven waves. For some assumed MHD perturbation incident from the tail lobes onto the plasma sheet, the local heating rate in the PSBL has the form of a resonance function of the one-fluid plasma temperature. Balancing the local heating by convective transport of the heated plasma toward the central plasma sheet, an ''equation of state'' is found for the steady-state PSBL whose solution has the form of a mathematical catastrophe: at a critical value of a parameter containing the incident power flux, the local density, and the convection velocity, the equilibrium temperature jumps discontinuously. Associating this temperature increase with the abrupt onset of the substorm expansion phase, the catastrophe model indicates at least three ways in which the onset may be triggered. Several other consequences related to substorm dynamics are suggested by the simple catastrophe model.

  19. Uncertainty and sensitivity assessments of GPS and GIS integrated applications for transportation.

    Science.gov (United States)

    Hong, Sungchul; Vonderohe, Alan P

    2014-02-10

    Uncertainty and sensitivity analysis methods are introduced, concerning the quality of spatial data as well as that of output information from Global Positioning System (GPS) and Geographic Information System (GIS) integrated applications for transportation. In the methods, an error model and an error propagation method form a basis for formulating the characterization and propagation of uncertainties. They are developed in two distinct approaches: analytical and simulation. Thus, an initial evaluation is performed to compare and examine uncertainty estimations from the analytical and simulation approaches. The evaluation results show that estimated ranges of output information from the analytical and simulation approaches are compatible, but the simulation approach is preferred over the analytical approach for uncertainty and sensitivity analyses, due to its flexibility and its capability to realize positional errors in the input data. Therefore, in a case study, uncertainty and sensitivity analyses based upon the simulation approach are conducted on a winter maintenance application. The sensitivity analysis is used to determine optimum input data qualities, and the uncertainty analysis is then applied to estimate the overall quality of output information from the application. The analysis results show that output information from the non-distance-based computation model is not sensitive to positional uncertainties in the input data. For the distance-based computational model, however, output information carries a different magnitude of uncertainty, depending on the positional uncertainties in the input data.
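
    The contrast between the two approaches can be sketched for a single distance computation: the analytical route linearizes the distance function, while the simulation route perturbs the coordinates directly. Coordinates and error levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

p1, p2, sigma = np.array([0.0, 0.0]), np.array([100.0, 50.0]), 2.0

# Analytical (linearized): the gradient of d = |p2 - p1| with respect to each
# point is a unit vector, so Var(d) ~= 2 * sigma**2 for independent errors.
sigma_analytic = np.sqrt(2.0) * sigma

# Simulation: perturb both points and recompute the distance many times.
n = 100_000
d = np.linalg.norm((p2 + sigma * rng.standard_normal((n, 2)))
                   - (p1 + sigma * rng.standard_normal((n, 2))), axis=1)
print(sigma_analytic, d.std())     # the two estimates should nearly agree
```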

  20. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  1. Precipitation intensity-duration-frequency curves for central Belgium with an ensemble of EURO-CORDEX simulations, and associated uncertainties

    Science.gov (United States)

    Hosseinzadehtalaei, Parisa; Tabari, Hossein; Willems, Patrick

    2018-02-01

    An ensemble of 88 regional climate model (RCM) simulations at 0.11° and 0.44° spatial resolution from the EURO-CORDEX project is analyzed for central Belgium to investigate the projected impact of climate change on precipitation intensity-duration-frequency (IDF) relationships and on the extreme precipitation quantiles typically used in water engineering designs. The uncertainty arising from the choice of RCM, driving GCM, and representative concentration pathway (RCP4.5 and RCP8.5) is quantified using a variance decomposition technique after reconstruction of missing data in the GCM × RCM combinations. A comparative analysis between the historical simulations of the EURO-CORDEX 0.11° and 0.44° RCMs shows higher precipitation intensities in the finer-resolution runs, leading to a larger overestimation of the observation-based IDFs by the 0.11° runs. The results reveal that making a temporal stationarity assumption for the climate system may lead to underestimation of precipitation quantiles by up to 70% by the end of this century. This projected increase is generally larger for the 0.11° RCMs than for the 0.44° RCMs. The relative changes in extreme precipitation depend on return period and duration, indicating an amplification for larger return periods and for smaller durations. The variance decomposition approach generally identifies the RCM as the dominant component of uncertainty in changes of more extreme precipitation (return period of 10 years) for both the 0.11° and 0.44° resolutions, followed by the GCM and the RCP scenario. The uncertainties associated with cross-contributions of RCMs, GCMs, and RCPs play a non-negligible role in the associated uncertainties of the changes.
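
    The main-effect part of such a variance decomposition can be sketched on a synthetic, fully filled GCM x RCM x RCP cube (the paper additionally reconstructs the missing GCM x RCM combinations, which is omitted here).

```python
import numpy as np

rng = np.random.default_rng(5)
changes = rng.normal(20.0, 5.0, size=(5, 6, 2))   # [GCM, RCM, RCP] signals, %

v_gcm = np.var(changes.mean(axis=(1, 2)))   # variance of GCM main effects
v_rcm = np.var(changes.mean(axis=(0, 2)))
v_rcp = np.var(changes.mean(axis=(0, 1)))
v_tot = np.var(changes)
v_int = v_tot - (v_gcm + v_rcm + v_rcp)     # residual: interaction terms

for name, v in [("GCM", v_gcm), ("RCM", v_rcm),
                ("RCP", v_rcp), ("interactions", v_int)]:
    print(f"{name:13s} {100.0 * v / v_tot:5.1f}% of total variance")
```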

  2. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    Science.gov (United States)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in
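
    The overall pattern (a quadratic response surface with single and two-factor interaction terms, dispersed by Monte Carlo over assumed composition uncertainty and scanned for the best mean response) can be sketched as below; all coefficients, ranges and uncertainty levels are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

beta = np.array([0.8, 0.30, 0.15, -0.10, -0.05, 0.08])   # fitted elsewhere

def regression_rate(x1, x2):
    """Quadratic response surface in two mixture fractions."""
    return (beta[0] + beta[1] * x1 + beta[2] * x2
            + beta[3] * x1 ** 2 + beta[4] * x2 ** 2 + beta[5] * x1 * x2)

def dispersed_rate(x1, x2, n=20_000):
    """MCS: disperse the nominal mixture by assumed composition uncertainty."""
    r = regression_rate(x1 + rng.normal(0.0, 0.02, n),
                        x2 + rng.normal(0.0, 0.02, n))
    return r.mean(), r.std()

# Coarse optimization under uncertainty: maximize the mean rate over a grid.
grid = np.linspace(0.2, 0.8, 13)
best = max(((x1, x2, *dispersed_rate(x1, x2))
            for x1 in grid for x2 in grid if x1 + x2 <= 1.0),
           key=lambda t: t[2])
print("best (x1, x2), mean rate, std:", best)
```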

  3. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    Science.gov (United States)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in the high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  4. Forecasting giant, catastrophic slope collapse: lessons from Vajont, Northern Italy

    Science.gov (United States)

    Kilburn, Christopher R. J.; Petley, David N.

    2003-08-01

    Rapid, giant landslides, or sturzstroms, are among the most powerful natural hazards on Earth. They have minimum volumes of ~10^6-10^7 m^3 and, normally preceded by prolonged intervals of accelerating creep, are produced by catastrophic and deep-seated slope collapse (loads ~1-10 MPa). Conventional analyses attribute rapid collapse to unusual mechanisms, such as the vaporization of ground water during sliding. Here, catastrophic collapse is related to self-accelerating rock fracture, common in crustal rocks at loads ~1-10 MPa and readily catalysed by circulating fluids. Fracturing produces an abrupt drop in resisting stress. Measured stress drops in crustal rock account for minimum sturzstrom volumes and rapid collapse accelerations. Fracturing also provides a physical basis for quantitatively forecasting catastrophic slope failure.
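
    The forecasting idea built on accelerating creep can be illustrated with the classical inverse-rate method (a generic technique, not necessarily the authors' exact procedure): if the creep rate grows hyperbolically toward failure, its inverse falls linearly with time and the zero-crossing of a line fit estimates the failure time. Data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

t = np.linspace(0.0, 9.0, 50)                  # observation times, days
t_f_true = 10.0
rate = 1.0 / (t_f_true - t)                    # accelerating creep rate
rate *= 1.0 + 0.05 * rng.standard_normal(t.size)   # measurement noise

slope, intercept = np.polyfit(t, 1.0 / rate, 1)    # line fit to inverse rate
t_f_est = -intercept / slope                       # extrapolate to 1/rate = 0
print(f"forecast failure time = {t_f_est:.2f} days (true: {t_f_true})")
```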

  5. Propagation of void fraction uncertainty measures in the RETRAN-3D simulation of the Peach Bottom turbine trip

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2011-01-01

    The paper describes the propagation of void fraction uncertainty, as quantified by employing a novel methodology developed at the Paul Scherrer Institut, in the RETRAN-3D simulation of the Peach Bottom turbine trip test. Since the transient considered is characterized by a strong coupling between thermal-hydraulics and neutronics, the accuracy of the void fraction model has a very important influence on the prediction of the power history and, in particular, of the maximum power reached. It has been shown that the objective measures used for the void fraction uncertainty, based on the direct comparison between experimental and predicted values extracted from a database of appropriate separate-effect tests, provide power uncertainty bands that are narrower and more realistic than those based, for example, on expert opinion. The applicability of such an approach to best-estimate nuclear power plant transient analysis has thus been demonstrated.

  6. Catastrophizing and perceived injustice: risk factors for the transition to chronicity after whiplash injury.

    Science.gov (United States)

    Sullivan, Michael J L; Adams, Heather; Martel, Marc-Olivier; Scott, Whitney; Wideman, Timothy

    2011-12-01

    Research has shown that psychological variables play a role in determining the trajectory of recovery after whiplash injury. This article focuses on two such variables, pain catastrophizing and perceived injustice, and summarizes research that has supported their role as risk factors for problematic recovery after whiplash injury. Several investigations have shown that measures of catastrophizing and perceived injustice prospectively predict problematic trajectories of recovery after whiplash injury. Basic research points to the potential roles of expectancies, attention, coping and endogenous opioid dysregulation as possible avenues through which catastrophizing might heighten the probability of the persistence of pain after whiplash injury. Although research has yet to systematically address the mechanisms by which perceived injustice might contribute to prolonged disability in individuals with whiplash injuries, there are grounds for suggesting the potential contributions of catastrophizing, pain behavior and anger. A challenge for future research will be the development and evaluation of risk-factor-targeted interventions aimed at reducing catastrophizing and perceived injustice to improve recovery trajectories after whiplash injury.

  7. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)]

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity, using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant converges to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
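
    The one-dimensional ingredients named above (linear hat basis functions and surplus-driven refinement) are sketched below; a full sparse-grid code would combine such 1D rules through the Smolyak construction and refine the children of high-surplus nodes rather than whole levels, so this is only a schematic.

```python
import numpy as np

def hat(x, center, width):
    """Linear hat basis function supported on [center - width, center + width]."""
    return np.maximum(0.0, 1.0 - np.abs(x - center) / width)

def build_interpolant(f, max_level, tol=1e-3):
    """Hierarchical interpolation of f on [0, 1] with adaptive surpluses."""
    basis = [(0.5, 0.5, f(0.5))]                    # (center, width, surplus)
    for level in range(1, max_level + 1):
        width = 0.5 ** (level + 1)
        for center in np.arange(width, 1.0, 2.0 * width):   # new odd nodes
            approx = sum(s * hat(center, c, w) for c, w, s in basis)
            surplus = f(center) - approx            # hierarchical surplus
            if abs(surplus) > tol:                  # keep only useful nodes
                basis.append((center, width, surplus))
    return lambda x: sum(s * hat(x, c, w) for c, w, s in basis)

g = build_interpolant(lambda x: np.sin(2 * np.pi * x) ** 2, max_level=8)
xs = np.linspace(0.0, 1.0, 101)
print(np.max(np.abs(g(xs) - np.sin(2 * np.pi * xs) ** 2)))
```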

  8. Catastrophizing and Causal Beliefs in Whiplash

    NARCIS (Netherlands)

    Buitenhuis, J.; de Jong, P. J.; Jaspers, J. P. C.; Groothoff, J. W.

    2008-01-01

    Study Design. Prospective cohort study. Objective. This study investigates the role of pain catastrophizing and causal beliefs with regard to severity and persistence of neck complaints after motor vehicle accidents. Summary of Background Data. In previous research on low back pain, somatoform

  9. Uncertainty of Water-hammer Loads for Safety Related Systems

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Chan; Yoon, Duk Joo [Korea Hydro and Nuclear Power Co., Ltd., Daejeon (Korea, Republic of)]

    2013-10-15

    In this study, the basic methodology is based on the ISO GUM (Guide to the Expression of Uncertainty in Measurement). For a given gas void volume in the discharge piping, the maximum water-hammer pressure is defined by an equation in which the superficial velocity U_s (for the specific pipe size and corresponding area) is selected as the uncertainty parameter. The main uncertainty parameter (U_s) is estimated both by measurement and by Monte Carlo simulation, and the two methods agree well in terms of the expanded uncertainty: the expanded uncertainty of the measurement and of the Monte Carlo simulation is 1.30 and 1.34, respectively, at the 95% confidence level, and 1.95 and 1.97, respectively, at the 99% level. NRC Generic Letter 2008-01 requires nuclear power plant operators to evaluate the possibility of noncondensable gas accumulation in the Emergency Core Cooling System. Specifically, gas accumulation can result in a system pressure transient in the pump discharge piping at pump start, which can evolve into a water-hammer event and force imbalances on the piping segments. In this paper, the Monte Carlo simulation (MCS) method is introduced for estimating the uncertainty of water hammer. The aim is to evaluate the uncertainty of the water-hammer estimation results produced by KHNP CRI in 2013.
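
    A GUM-supplement-style Monte Carlo evaluation of an expanded uncertainty (coverage interval) follows this pattern; the peak-pressure model and the input distributions below are illustrative assumptions, not the paper's equation.

```python
import numpy as np

rng = np.random.default_rng(7)

n = 200_000
u_s = rng.normal(1.0, 0.05, n)       # superficial velocity, m/s (assumed)
void = rng.uniform(0.8, 1.2, n)      # gas-void-volume factor (assumed)

p_max = 12.0 * u_s * np.sqrt(void)   # hypothetical peak-pressure model

lo, hi = np.percentile(p_max, [2.5, 97.5])    # 95% coverage interval
print(f"mean = {p_max.mean():.2f}, 95% interval = [{lo:.2f}, {hi:.2f}]")
print(f"expanded uncertainty (half-width) = {(hi - lo) / 2.0:.2f}")
```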

  10. Nonlinear physics: Catastrophe, chaos and complexity

    International Nuclear Information System (INIS)

    Arecchi, F.T.

    1992-01-01

    Currently in the world of physics there is open debate on the role of the three C's: catastrophe, chaos and complexity. Seen as new ideas or paradigms, incapable of being harmonized within the realm of traditional physics, these terms seem to be creating turmoil in the classical physics establishment, whose foundations date back to the early seventeenth century. This paper first defines catastrophe, chaos and complexity and shows how these terms are all connected to nonlinear dynamics and how they have long been present within scientific treatises. It also examines the relationship of the three C's to the concept of organization (inappropriately called self-organization) and to the recognition and decision strategies of cognitive systems. Relevant to natural science, the development of these considerations is prompting a re-examination of the role and capabilities of human knowledge and a return to interdisciplinary scientific-philosophical debate.

  11. Disaster communications using ICTs for vulnerable Caribbean communities

    International Development Research Centre (IDRC) Digital Library (Canada)

    Disaster communications using ICTs for vulnerable Caribbean communities. Recent events in the Caribbean have highlighted the inadequacy of regional and national disaster-preparedness measures. There is a particular lack of warning systems ...

  12. Stochastic simulation experiment to assess radar rainfall retrieval uncertainties associated with attenuation and its correction

    Directory of Open Access Journals (Sweden)

    R. Uijlenhoet

    2008-03-01

    As rainfall constitutes the main source of water for terrestrial hydrological processes, accurate and reliable measurement and prediction of its spatial and temporal distribution over a wide range of scales is an important goal for hydrology. We investigate the potential of ground-based weather radar to provide such measurements through a theoretical analysis of some of the associated observation uncertainties. A stochastic model of range profiles of raindrop size distributions is employed in a Monte Carlo simulation experiment to investigate the rainfall retrieval uncertainties associated with weather radars operating at X-, C-, and S-band. We focus in particular on the errors and uncertainties associated with rain-induced signal attenuation and its correction for incoherent, non-polarimetric, single-frequency, operational weather radars. The performance of two attenuation correction schemes, the (forward) Hitschfeld-Bordan algorithm and the (backward) Marzoug-Amayenc algorithm, is analyzed for both moderate (assuming a 50 km path length) and intense Mediterranean rainfall (for a 30 km path). A comparison shows that the backward correction algorithm is more stable and accurate than the forward algorithm (with a bias in the order of a few percent for the former, compared to tens of percent for the latter), provided reliable estimates of the total path-integrated attenuation are available. Moreover, the bias and root mean square error associated with each algorithm are quantified as a function of path-averaged rain rate and distance from the radar, in order to provide a plausible order of magnitude for the uncertainty in radar-retrieved rain rates for hydrological applications.
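
    The forward correction can be sketched compactly. The k = a * Z**b power-law coefficients below are illustrative (roughly X-band-like) and must be tuned per radar and climate, and the floor placed on the denominator reflects exactly the instability of the forward scheme that the abstract reports.

```python
import numpy as np

def hitschfeld_bordan(z_measured, dr, a=3.0e-5, b=0.78):
    """Forward attenuation correction for a single-frequency radar.

    z_measured : attenuated reflectivity profile, linear units (mm^6 m^-3)
    dr         : gate spacing in km
    a, b       : specific attenuation power law k = a * Z**b, in dB/km
    """
    c = 0.2 * np.log(10.0) * b                    # two-way dB -> natural log
    path = np.cumsum(a * z_measured ** b) * dr    # path-integrated term
    denom = np.maximum(1.0 - c * path, 1e-3)      # forward scheme diverges here
    return z_measured / denom ** (1.0 / b)

z = np.full(60, 10.0 ** (35.0 / 10.0))   # 60 gates of 35 dBZ, 0.5 km apart
z_corr = hitschfeld_bordan(z, dr=0.5)
print("corrected far-gate reflectivity:", 10.0 * np.log10(z_corr[-1]), "dBZ")
```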

  13. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    Science.gov (United States)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  14. Catastrophic phase transitions and early warnings in a spatial ecological model

    International Nuclear Information System (INIS)

    Fernández, A; Fort, H

    2009-01-01

    Gradual changes in exploitation, nutrient loading, etc. produce shifts between alternative stable states (ASS) in ecosystems which, quite often, are not smooth but abrupt or catastrophic. Early warnings of such catastrophic regime shifts are fundamental for designing management protocols for ecosystems. Here we study the spatial version of a popular ecological model, involving a logistically growing single species subject to exploitation, which is known to exhibit ASS. Spatial heterogeneity is introduced by a carrying-capacity parameter varying from cell to cell in a regular lattice. Transport of biomass among cells is included in the form of diffusion. We investigate whether different quantities from statistical mechanics (the variance, the two-point correlation function and the patchiness) may serve as early warnings of catastrophic phase transitions between the ASS. In particular, we find that the patch-size distribution follows a power law when the system is close to the catastrophic transition. We also provide links between spatial and temporal indicators and analyse how the interplay between diffusion and spatial heterogeneity may affect the earliness of each of the observables. We find that possible remedial procedures, which can be followed after these early signals, become more effective as the diffusion becomes lower. Finally, we comment on similarities of and differences between these catastrophic shifts and paradigmatic thermodynamic phase transitions like the liquid–vapour change of state for a fluid like water.
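
    A non-spatial sketch of the fold transition and hysteresis this model family exhibits, using a logistically growing stock with Holling type II exploitation; parameter values are illustrative, and the paper's lattice heterogeneity and diffusion are omitted.

```python
import numpy as np

r, K, h, dt = 1.0, 10.0, 1.0, 0.01     # dx/dt = r x (1 - x/K) - c x / (x + h)

def equilibrate(c, x0, steps=20_000):
    x = x0
    for _ in range(steps):
        x += dt * (r * x * (1.0 - x / K) - c * x / (x + h))
    return max(x, 0.0)

x = K
for c_ramp in (np.linspace(0.5, 3.5, 16), np.linspace(3.5, 0.5, 16)):
    for c in c_ramp:
        x = equilibrate(c, max(x, 0.01))   # track the current branch
        print(f"c = {c:4.2f} -> x* = {x:5.2f}")
# Ramping c up collapses the stock near c ~ 3.0; ramping back down, it only
# recovers once c < 1.0: the hallmark hysteresis of a catastrophic shift.
```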

  15. Uses of ICT and relationships to uncertainty in natural disaster situations

    Directory of Open Access Journals (Sweden)

    Claire Brossaud

    2008-11-01

    This article shows how professionals and disaster victims who were confronted with natural catastrophes in three different areas - the storm in Limoges in 1999 and the floods in Abbeville (2001) and Bourg-en-Bresse (2005) - built a common history of risk by means of information and communication technologies (ICT): the Internet, mobile phones, databases, shared working tools, etc. ICT uses are first situated concretely before, during and after the events, in a historical context in which science and technology are increasingly called upon to reduce the uncertainties linked to health and ecological threats. A risk culture is then seen to develop on the basis of specific socio-cognitive and relational competences in the face of the events and their technological handling. Finally, we examine the role of ICT in the learning of collective argumentation and deliberation about catastrophes, notably through tools dedicated to the study on the site http://www.technorisque.net.

  16. Effects of input data information content on the uncertainty of simulating water resources

    Science.gov (United States)

    Camargos, Carla; Julich, Stefan; Bach, Martin; Breuer, Lutz

    2017-04-01

    Hydrological models like the Soil and Water Assessment Tool (SWAT) demand a large variety of spatial input data. These are commonly available in different resolutions and result from different preprocessing methodologies. Effort is made to apply data that are as specific as possible for the study area, which features heterogeneous landscape elements. Most often, modelers prefer to use regional data, especially with fine resolution, which is not always available. Instead, global datasets are considered that are more general. This study investigates how the use of global and regional input datasets may affect the simulation performance and uncertainty of the model. We analyzed eight different setups for the SWAT model, combining two each of Digital Elevation Models (DEMs), soil maps and land use maps of diverse spatial resolution and information content. The models were calibrated to discharge at two stations across the mesoscale Haute-Sûre catchment, which is partly located in the north of Luxembourg and partly in the southeast of Belgium. The region is a rural area of about 743 km2, mainly covered by forests, complex agricultural systems and arable land. As part of the catchment, the Upper-Sûre Lake is an important source of drinking water for the Luxembourgish population, satisfying 30% of the country's demand. The Metropolis Markov Chain Monte Carlo algorithm implemented in the SPOTPY Python package was used to infer posterior parameter distributions and assess parameter uncertainty. We optimize the mean of the Nash-Sutcliffe Efficiency (NSE) and the logarithm of the NSE. We focused on soil physical, groundwater, main channel, land cover management and basin physical process parameters. Preliminary results indicate that the model has the best performance when using the regional DEM and land use map and the global soil map, indicating that SWAT cannot necessarily make use of additional soil information if it does not substantially affect soil hydrological fluxes.
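
The calibration objective above averages NSE and a logarithmic variant; the sketch below implements one common reading of that, the NSE of log-transformed flows, with an equal weighting and an epsilon guard for zero flows that are our own choices rather than details from the abstract.

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 matches the obs mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def objective(obs, sim, eps=1e-6):
    """Mean of NSE and NSE of log-transformed flows (low-flow emphasis)."""
    return 0.5 * (nse(obs, sim) + nse(np.log(obs + eps), np.log(sim + eps)))

q_obs = np.array([1.2, 3.4, 9.8, 5.1, 2.2, 1.0])   # placeholder discharges
q_sim = np.array([1.0, 3.9, 8.7, 5.6, 2.5, 1.1])
print(f"combined NSE / log-NSE objective: {objective(q_obs, q_sim):.3f}")
```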

  17. Madame Bovary and Catastrophism: Revolving narratives

    Directory of Open Access Journals (Sweden)

    Ruth Morris

    2011-07-01

    This article connects Madame Bovary to the French scientific context of the 1850s, reading Flaubert's novel in the light of Cuvier's theories. The French scientist Georges Cuvier, along with many of his contemporaries, explained the origins of the world through the theory of catastrophes. According to this theory, the world is divided into very short periods punctuated by great catastrophes or, in Cuvierian terms, "revolutions" that eradicated all life and allowed the world to be entirely repopulated. Such a conception affects the very idea of "time". Cuvier believed the formation of the Earth to be relatively recent, the present epoch being only five thousand years old. This temporal compression can be related to Madame Bovary, whose tempo accelerates as the denouement approaches. In the theory of catastrophes as in the novel, time does not follow a chronological line. The "revolutions" break the continuous thread of time, and Emma is often unable to distinguish between past, present and future. The "revolutions" also punctuate and disturb the course of life on Earth by producing major events in the history of the globe. The same is true of Emma's life. Her existence is marked by major events, such as the ball, which create a shattering and fragmentation of temporality, as in Cuvier's theory. I also argue for a link between the suddenness and violence of the "revolutions" and Emma's nervous crises, which arise abruptly and verge on hysteria. Finally, the Cuvierian conception of temporality must be considered in relation to theories of evolution, which entails re-evaluating the notions of adaptation, heredity and death in Flaubert's novel.

  18. Analysis of Uncertainties in Protection Heater Delay Time Measurements and Simulations in Nb$_{3}$Sn High-Field Accelerator Magnets

    CERN Document Server

    Salmi, Tiina; Marchevsky, Maxim; Bajas, Hugo; Felice, Helene; Stenvall, Antti

    2015-01-01

    The quench protection of superconducting high-field accelerator magnets is presently based on protection heaters, which are activated upon quench detection to accelerate the quench propagation within the winding. Estimations of the heater delay to initiate a normal zone in the coil are essential for the protection design. During the development of Nb3Sn magnets for the LHC luminosity upgrade, protection heater delays have been measured in several experiments, and a new computational tool CoHDA (Code for Heater Delay Analysis) has been developed for heater design. Several computational quench analyses suggest that the efficiency of the present heater technology is on the borderline of protecting the magnets. Quantifying the inevitable uncertainties related to the measured and simulated delays is therefore of pivotal importance. In this paper, we analyze the uncertainties in the heater delay measurements and simulations using data from five impregnated high-field Nb3Sn magnets with different heater geometries. ...

  20. Repeated checking induces uncertainty about future threat

    NARCIS (Netherlands)

    Giele, C.L.; Engelhard, I.M.; van den Hout, M.A.; Dek, E.C.P.; Damstra, Marianne; Douma, Ellen

    2015-01-01

    Studies have shown that obsessive-compulsive (OC)-like repeated checking paradoxically increases memory uncertainty. This study tested if checking also induces uncertainty about future threat by impairing the distinction between danger and safety cues. Participants (n = 54) engaged in a simulated

  1. Catastrophic event modeling. [lithium thionyl chloride batteries

    Science.gov (United States)

    Frank, H. A.

    1981-01-01

    A mathematical model for the catastrophic failures (venting or explosion of the cell) in lithium thionyl chloride batteries is presented. The phenomenology of the various processes leading to cell failure is reviewed.

  2. Time-Varying Uncertainty in Shock and Vibration Applications Using the Impulse Response

    Directory of Open Access Journals (Sweden)

    J.B. Weathers

    2012-01-01

    Design of mechanical systems often necessitates the use of dynamic simulations to calculate the displacements (and their derivatives) of the bodies in a system as a function of time in response to dynamic inputs. These types of simulations are especially prevalent in the shock and vibration community, where simulations associated with models having complex inputs are routine. If the forcing functions as well as the parameters used in these simulations are subject to uncertainties, then these uncertainties will propagate through the models, resulting in uncertainties in the outputs of interest. The uncertainty analysis procedure for these kinds of time-varying problems can be challenging, and in many instances explicit data reduction equations (DREs), i.e., analytical formulas, are not available because the outputs of interest are obtained from complex simulation software, e.g. FEA programs. Moreover, uncertainty propagation in systems modeled using nonlinear differential equations can prove to be difficult to analyze. However, if (1) the uncertainties propagate through the models in a linear manner, obeying the principle of superposition, then the complexity of the problem can be significantly simplified. If, in addition, (2) the uncertainty in the model parameters does not change during the simulation and the manner in which the outputs of interest respond to small perturbations in the external input forces is not dependent on when the perturbations are applied, then the number of calculations required can be greatly reduced. Conditions (1) and (2) characterize a Linear Time Invariant (LTI) uncertainty model. This paper seeks to explain one possible approach to obtain the uncertainty results based on these assumptions.
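
To make the LTI idea concrete, here is a small sketch under assumptions of our own: a single-degree-of-freedom oscillator whose impulse response is known in closed form, a rectangular force pulse, and independent 5% force uncertainties. Because the system is LTI, both the response and its uncertainty come from convolutions with the same impulse response.

```python
import numpy as np

# Hypothetical underdamped oscillator m*x'' + c*x' + k*x = f(t); its
# impulse response is h(t) = exp(-zeta*wn*t) * sin(wd*t) / (m*wd).
m, k, c = 1.0, 400.0, 4.0
wn = np.sqrt(k / m)
zeta = c / (2.0 * np.sqrt(k * m))
wd = wn * np.sqrt(1.0 - zeta ** 2)

dt = 0.001
t = np.arange(0.0, 2.0, dt)
h = np.exp(-zeta * wn * t) * np.sin(wd * t) / (m * wd)

f = np.where(t < 0.05, 100.0, 0.0)           # rectangular force pulse [N]
y = np.convolve(h, f)[: t.size] * dt         # displacement by superposition

# LTI uncertainty propagation: each force sample's 1-sigma error u_f
# reaches the output through the same impulse response, so (for
# independent errors) the output variances add sample by sample.
u_f = 0.05 * np.abs(f)
u_y = np.sqrt(np.convolve(h ** 2, u_f ** 2)[: t.size]) * dt

print(f"peak response {y.max():.4e} m, peak 1-sigma uncertainty {u_y.max():.4e} m")
```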

  3. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs

  4. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origin have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.

  5. Catastrophic antiphospholipid syndrome and pregnancy. Clinical report.

    Science.gov (United States)

    Khizroeva, J; Bitsadze, V; Makatsariya, A

    2018-01-08

    We have observed the development of a catastrophic antiphospholipid syndrome (CAPS) in a pregnant woman hospitalized at 28 weeks of gestation with severe preeclampsia. On the same day, an eclampsia attack developed, and an emergency surgical delivery was performed. On the third day, multiorgan failure developed. Examination showed a persistent circulation of lupus anticoagulant and high levels of antibodies to cardiolipin, β2-glycoprotein I, and prothrombin. The usual diagnosis of severe preeclampsia masked a catastrophic antiphospholipid syndrome, exacerbated by the coincident presence of several types of antiphospholipid antibodies. The first pregnancy resulted in a premature birth at 25 weeks, possibly also due to the circulation of antiphospholipid antibodies. The triggers of the development of the catastrophic form were the pregnancy itself, surgical intervention, and hyperhomocysteinemia. CAPS is the most severe form of antiphospholipid syndrome, manifested in multiple microthrombosis of the microcirculation of vital organs and in the development of multiorgan failure against the background of a high level of antiphospholipid antibodies. CAPS is characterized by renal, cerebral, gastrointestinal, adrenal, ovarian, skin, and other forms of microthrombosis. Thrombosis recurrence is typical. Thrombotic microvasculopathy lies at the heart of multiorgan failure and manifests clinically in central nervous system lesions, adrenal insufficiency, and ARDS development. CAPS is a life-threatening condition and therefore requires urgent treatment. An optimal treatment for CAPS has not yet been developed. CAPS represents a general medical multidisciplinary problem.

  6. Uncertainty Assessments of 2D and Axisymmetric Hypersonic Shock Wave - Turbulent Boundary Layer Interaction Simulations at Compression Corners

    Science.gov (United States)

    Gnoffo, Peter A.; Berry, Scott A.; VanNorman, John W.

    2011-01-01

    This paper is one of a series of five papers in a special session organized by the NASA Fundamental Aeronautics Program that addresses uncertainty assessments for CFD simulations in hypersonic flow. Simulations of a shock emanating from a compression corner and interacting with a fully developed turbulent boundary layer are evaluated herein. Mission-relevant conditions at Mach 7 and Mach 14 are defined for a pre-compression ramp of a scramjet-powered vehicle. Three compression angles are defined, the smallest to avoid separation losses and the largest to force a separated flow engaging more complicated flow physics. The Baldwin-Lomax and Cebeci-Smith algebraic models, the one-equation Spalart-Allmaras model with the Catris-Aupoix compressibility modification, and two-equation models including the Menter SST, Wilcox k-omega 98, and Wilcox k-omega 06 turbulence models are evaluated. Each model is fully defined herein to preclude any ambiguity regarding model implementation. Comparisons are made to existing experimental data and Van Driest theory to provide a preliminary assessment of model form uncertainty. A set of coarse-grained uncertainty metrics is defined to capture essential differences among turbulence models. Except for the inability of algebraic models to converge for some separated flows, there is no clearly superior model as judged by these metrics. A preliminary metric for the numerical component of uncertainty in shock-turbulent-boundary-layer interactions at compression corners sufficiently steep to cause separation is defined as 55%. This value is a median of differences with experimental data averaged for peak pressure and heating and for the extent of separation captured in new, grid-converged solutions presented here. This value is consistent with existing results in a literature review of hypersonic shock-turbulent-boundary-layer interactions by Roy and Blottner and with more recent computations of MacLean.

  7. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here ... in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter ...

  8. Catastrophic subsidence: An environmental hazard, shelby county, Alabama

    Science.gov (United States)

    Lamoreaux, Philip E.; Newton, J. G.

    1986-03-01

    Induced sinkholes (catastrophic subsidence) are those caused or accelerated by human activities. These sinkholes commonly result from a water-level decline due to pumpage. Construction activities in a cone of depression greatly increase the likelihood of sinkhole occurrence. Almost all occur where cavities develop in unconsolidated deposits overlying solution openings in carbonate rocks. Triggering mechanisms resulting from water-level declines are (1) loss of buoyant support of the water, (2) increased gradient and water velocity, (3) water-level fluctuations, and (4) induced recharge. Construction activities triggering sinkhole development include ditching, removing overburden, drilling, movement of heavy equipment, blasting, and the diversion and impoundment of drainage. Triggering mechanisms include piping, saturation, and loading. Induced sinkholes resulting from human water development/management activities are most predictable in a youthful karst area impacted by groundwater withdrawals. Shape, depth, and timing of catastrophic subsidence can be predicted in general terms. Remote sensing techniques are used in prediction of locations of catastrophic subsidence. This provides a basis for design and relocation of structures such as a gas pipeline, dam, or building. Utilization of these techniques and a case history of the relocation of a pipeline are described.

  9. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data, as well as being able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but yet limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte-Carlo simulations, and the region of space containing the parameter sets that were considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
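
The Monte Carlo delimitation of behavioral parameter sets described above can be sketched as follows. The toy three-parameter model, the uniform sampling ranges and the acceptance threshold are all our own placeholders (the paper uses WASMOD and two behavioral criteria), and the alpha-shape step is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_discharge(p, rain):
    """Toy 3-parameter storage model standing in for WASMOD."""
    a, b, c = p
    q = np.zeros_like(rain)
    s = 0.0
    for i, rain_i in enumerate(rain):
        s += a * rain_i
        q[i] = b * s + c
        s *= 1.0 - b
    return q

rain = rng.gamma(2.0, 2.0, size=365)
q_obs = simulate_discharge((0.8, 0.3, 0.1), rain) + rng.normal(0.0, 0.2, 365)

def nse(obs, sim):
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo exploration: sample the parameter cube and keep the
# "behavioral" sets whose NSE exceeds an (arbitrary) threshold.
n = 2_000
samples = rng.uniform(0.0, 1.0, size=(n, 3))
scores = np.array([nse(q_obs, simulate_discharge(p, rain)) for p in samples])
behavioral = samples[scores > 0.5]
print(f"{len(behavioral)} behavioral parameter sets out of {n}")
```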

  10. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. precipitation, streamflow, sediment, nutrient and pesticide losses measured, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  11. Catastrophic antiphospholipid syndrome in leprosy

    African Journals Online (AJOL)

    Catastrophic antiphospholipid syndrome is an acute and life threatening variant of antiphospholipid syndrome with a high mortality rate. Many infections are known to be accompanied by the thrombotic manifestations of this syndrome. We came across a patient of leprosy who developed bowel ischaemia secondary to ...

  12. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  13. Global sensitivity and uncertainty analysis of the nitrate leaching and crop yield simulation under different water and nitrogen management practices

    Science.gov (United States)

    Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...

  14. Towards quantifying uncertainty in predictions of Amazon 'dieback'.

    Science.gov (United States)

    Huntingford, Chris; Fisher, Rosie A; Mercado, Lina; Booth, Ben B B; Sitch, Stephen; Harris, Phil P; Cox, Peter M; Jones, Chris D; Betts, Richard A; Malhi, Yadvinder; Harris, Glen R; Collins, Mat; Moorcroft, Paul

    2008-05-27

    Simulations with the Hadley Centre general circulation model (HadCM3), including carbon cycle model and forced by a 'business-as-usual' emissions scenario, predict a rapid loss of Amazonian rainforest from the middle of this century onwards. The robustness of this projection to both uncertainty in physical climate drivers and the formulation of the land surface scheme is investigated. We analyse how the modelled vegetation cover in Amazonia responds to (i) uncertainty in the parameters specified in the atmosphere component of HadCM3 and their associated influence on predicted surface climate. We then enhance the land surface description and (ii) implement a multilayer canopy light interception model and compare with the simple 'big-leaf' approach used in the original simulations. Finally, (iii) we investigate the effect of changing the method of simulating vegetation dynamics from an area-based model (TRIFFID) to a more complex size- and age-structured approximation of an individual-based model (ecosystem demography). We find that the loss of Amazonian rainforest is robust across the climate uncertainty explored by perturbed physics simulations covering a wide range of global climate sensitivity. The introduction of the refined light interception model leads to an increase in simulated gross plant carbon uptake for the present day, but, with altered respiration, the net effect is a decrease in net primary productivity. However, this does not significantly affect the carbon loss from vegetation and soil as a consequence of future simulated depletion in soil moisture; the Amazon forest is still lost. The introduction of the more sophisticated dynamic vegetation model reduces but does not halt the rate of forest dieback. The potential for human-induced climate change to trigger the loss of Amazon rainforest appears robust within the context of the uncertainties explored in this paper. Some further uncertainties should be explored, particularly with respect to the

  15. Assessing uncertainty and sensitivity of model parameterizations and parameters in WRF affecting simulated surface fluxes and land-atmosphere coupling over the Amazon region

    Science.gov (United States)

    Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.

    2016-12-01

    This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign, together with satellite and reanalysis data, are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multiple-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and first-order effects dominate compared to the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for
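
The multi-way analysis of variance mentioned above reduces, in the one-factor case, to splitting the total sum of squares into between-scheme and within-scheme parts. The sketch below shows that computation on synthetic scores; the group counts and variances are invented for illustration and are not GoAmazon results.

```python
import numpy as np

# Hypothetical ensemble: a skill score for 120 members, grouped by which
# of 6 PBL schemes (20 members each) a simulation used.
rng = np.random.default_rng(0)
n_groups, n_per = 6, 20
group_effects = rng.normal(0.0, 1.0, n_groups)     # between-scheme signal
scores = np.concatenate([mu + rng.normal(0.0, 0.5, n_per) for mu in group_effects])
labels = np.repeat(np.arange(n_groups), n_per)

# One-way ANOVA decomposition: total sum of squares = between-group part
# (variance explained by scheme choice) + within-group residual.
grand_mean = scores.mean()
ss_total = np.sum((scores - grand_mean) ** 2)
ss_between = sum(
    n_per * (scores[labels == g].mean() - grand_mean) ** 2 for g in range(n_groups)
)
print(f"fraction of variance explained by scheme choice: {ss_between / ss_total:.2f}")
```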

  16. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
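
The combination step described above (parametric sensitivities times parameter uncertainties, summed in quadrature) can be written in a few lines. The sensitivities and standard uncertainties below are placeholders, not values from the paper.

```python
import math

# Hypothetical sensitivities of the measured linewidth L to each input
# parameter, dL/dp, paired with an assumed standard uncertainty u_p.
params = {
    # name: (dL/dp, u_p)  -- units chosen so dL/dp * u_p is in nm
    "wavelength": (0.8, 0.5),
    "objective_NA": (120.0, 0.002),
    "focus": (0.05, 2.0),
    "index_n": (30.0, 0.01),
}

# First-order, independent-parameter combination: parametric linewidth
# uncertainties add in quadrature (root sum of squares), giving a lower
# limit to the linewidth measurement uncertainty.
u_L = math.sqrt(sum((s * u) ** 2 for s, u in params.values()))
print(f"combined parametric linewidth uncertainty: {u_L:.2f} nm")
```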

  17. Polarization catastrophe in nanostructures doped in photonic band gap materials

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Mahi R. [Department of Physics and Astronomy, University of Western Ontario, London N6A 3K7 (Canada)], E-mail: msingh@uwo.ca

    2008-11-30

    In the presence of the dipole-dipole interaction, we have studied a possible dielectric catastrophe in photonic band gap materials doped with an ensemble of four-level nanoparticles. It is found that the dielectric constant of the system has a singularity when the resonance energy lies within the bands. This phenomenon is known as the dielectric catastrophe. It is also found that this phenomenon depends on the strength of the dipole-dipole interaction.

  18. Catastrophic Health Expenditure and Household Impoverishment: a case of NCDs prevalence in Kenya

    Directory of Open Access Journals (Sweden)

    Daniel Mwai

    2016-03-01

    Introduction and problem: Non-Communicable Diseases (NCDs) have become one of the leading causes of morbidity and mortality in Kenya. Their claim on financial and time resources adversely affects household welfare. Health care costs for NCDs in Kenya are predominantly paid by households as out-of-pocket (OOP) payments. Health expenditure on NCDs stands at 6.2% of Total Health Expenditure, which is 0.4% of the total gross domestic product of the country. This expenditure scenario could have implications for household welfare through catastrophic expenditure in Kenya. Most studies done on catastrophic expenditure in Kenya have not looked at the effect of NCDs on poverty. Methodology: The paper investigates the determinants of catastrophic health spending in a household, with special focus on NCDs. It also investigates the effect of catastrophic expenditure on household welfare. National household-level survey data on expenditure and utilization are used. Controlling for endogeneity, the results revealed that NCDs and communicable diseases contribute significantly to the likelihood of a household incurring catastrophic expenditure. Results: Although all types of sickness have negative effects on household welfare, NCDs have more severe impacts on impoverishment. Policy-wise, government and development partners should put in place a health financing plan entailing health insurance and resource pooling as a means towards social protection. Key words: Non-Communicable Diseases (NCD), catastrophic health expenditure, endogeneity, impoverishment

  19. Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Witkowski, Walter R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-04-13

    The importance of credible, trustworthy numerical simulations is obvious, especially when using the results for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation and uncertainty quantification, which are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification and validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific application's intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner to perform such an assessment. Ideally, all stakeholders should be represented and contribute to performing an accurate credibility assessment. PIRTs and PCMMs are both described in brief detail below, and the resulting assessments for an example project are given.

  20. Model uncertainties in top-quark physics

    CERN Document Server

    Seidel, Markus

    2014-01-01

    The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.

  1. Optimal design of earth-moving machine elements with cusp catastrophe theory application

    Science.gov (United States)

    Pitukhin, A. V.; Skobtsov, I. G.

    2017-10-01

    This paper deals with the solution of the optimal design problem for the operator of an earth-moving machine with a roll-over protective structure (ROPS) in terms of catastrophe theory. A brief description of catastrophe theory is presented, the cusp catastrophe is considered, and control parameters are viewed as Gaussian stochastic quantities in the first part of the paper. The statement of the optimal design problem is given in the second part of the paper. It includes the choice of the objective function and independent design variables, and the establishment of system limits. The objective function is defined as the mean total cost, which includes the initial cost and the cost of failure according to the cusp catastrophe probability. An algorithm of the random search method with interval reduction, subject to side and functional constraints, is given in the last part of the paper. This approach to the optimal design problem can be applied to choose rational ROPS parameters, which will increase safety and reduce production and operating expenses.
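
A minimal sketch of a random search with interval reduction under box constraints is given below. The two design variables, the cost model and the exponential stand-in for the failure probability (the paper uses the cusp catastrophe probability) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def total_cost(x):
    """Mean total cost: initial cost plus expected cost of failure.
    The exponential failure probability is a placeholder for the cusp
    catastrophe probability used in the paper."""
    initial = 50.0 + 20.0 * x.sum()
    p_fail = np.exp(-3.0 * x.sum())
    return initial + 1.0e4 * p_fail

lo = np.array([0.1, 0.1])        # side constraints (box bounds)
hi = np.array([2.0, 2.0])
best = None
for stage in range(20):          # interval-reduction stages
    pts = rng.uniform(lo, hi, size=(200, 2))
    costs = np.array([total_cost(p) for p in pts])
    x_star = pts[costs.argmin()]
    if best is None or total_cost(x_star) < total_cost(best):
        best = x_star
    span = (hi - lo) * 0.4       # shrink the search box around the best point
    lo = np.clip(best - span / 2.0, 0.1, 2.0)
    hi = np.clip(best + span / 2.0, 0.1, 2.0)

print(f"design variables {best}, mean total cost {total_cost(best):.2f}")
```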

  2. Pricing index-based catastrophe bonds: Part 2: Object-oriented design issues and sensitivity analysis

    Science.gov (United States)

    Unger, André J. A.

    2010-02-01

    This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low-frequency high-magnitude events in a 'high' reinsurance layer relative to high-frequency low-magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
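
To convey the frequency sensitivity in miniature, here is a Monte Carlo sketch that prices a one-year, single-payment CAT bond whose principal erodes linearly across a reinsurance layer. The Poisson frequencies, lognormal severities, layer bounds and discount rate are our illustrative assumptions, not the paper's calibrated inputs (which also include quarterly aggregation and the Bermudan call).

```python
import numpy as np

rng = np.random.default_rng(11)

def cat_bond_price(lam, n_paths=50_000, attach=5.0, exhaust=15.0,
                   face=1.0, rf=0.04):
    """Discounted expected principal for hurricane frequency lam (per year)."""
    n_events = rng.poisson(lam, n_paths)
    # lognormal per-event damages (arbitrary units), aggregated per path
    agg = np.array([rng.lognormal(0.5, 1.0, k).sum() for k in n_events])
    # principal lost linearly across the layer [attach, exhaust]
    loss_frac = np.clip((agg - attach) / (exhaust - attach), 0.0, 1.0)
    return np.exp(-rf) * (face * (1.0 - loss_frac)).mean()

for lam, label in [(4.0, "El Nino year"), (6.0, "non-El Nino year")]:
    print(f"{label} (lambda={lam}): price {cat_bond_price(lam):.4f}")
```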

  3. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
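
A stripped-down version of the single-loop idea above, with numbers invented for illustration: the epistemic uncertainty (a poorly known mean load) is sampled in the same loop as the aleatory variability, yielding an unconditional failure probability for one limit state.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

# Epistemic: the mean load is known only from sparse data, so it is
# itself sampled (a stand-in for a posterior distribution).
mu_load = rng.normal(10.0, 0.5, n)               # kN
# Aleatory: the actual load given that mean, and a lognormal capacity.
load = rng.normal(mu_load, 2.0)                  # kN
capacity = rng.lognormal(np.log(18.0), 0.08, n)  # kN

g = capacity - load                 # limit state: failure if g < 0
pf = np.mean(g < 0.0)
se = np.sqrt(pf * (1.0 - pf) / n)   # Monte Carlo standard error
print(f"failure probability {pf:.2e} (MC standard error {se:.1e})")
```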

  4. Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics

    Science.gov (United States)

    Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.

  5. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  6. Precursory landforms and geologic structures of catastrophic landslides induced by typhoon Talas 2011 Japan (Invited)

    Science.gov (United States)

    Chigira, M.; Matsushi, Y.; Tsou, C.

    2013-12-01

    Our experience of catastrophic landslides induced by rainstorms and earthquakes in recent years suggests that many of them are preceded by deep-seated gravitational slope deformation. Deep-seated gravitational slope deformation proceeds slowly and continually, and some of these slopes transform into catastrophic failures, which cause devastating damage over wide areas. Other slopes, however, do not change into catastrophic failure. Deep-seated gravitational slope deformation that preceded the catastrophic failures induced by typhoon Talas 2011 Japan had been surveyed beforehand with an airborne laser scanner, whose high-resolution DEMs gave us an important clue to identify which type of topographic features of gravitational slope deformation is susceptible to catastrophic failure. We found that 26 of 39 deep-seated catastrophic landslides had small scarps along the heads of future landslides. These scarps were caused by gravitational slope deformation that preceded the catastrophic failure. Although the scarps may have been enlarged by degradation, their sizes relative to the whole slopes suggest that minimal slope deformation had occurred in the period immediately before the catastrophic failure. The scarp ratio, defined as the ratio of the length of a scarp to that of the whole slope, both measured along the slope line, ranged from 1% to 23%. 38% of the landslides with small scarps had scarp ratios less than 4%, and half less than 8%. This fact suggests that the gravitational slope deformation that preceded catastrophic failure was relatively small and may suggest that those slopes were under critical conditions just before catastrophic failure. The above scarp ratios may be characteristic of accretionary complexes with undulating, anastomosing thrust faults, which were the major sliding surfaces of the typhoon-induced landslides. Eleven of the remaining 13 landslides occurred in landslide scars of previous landslides or occurred as an extension of landslide scars at the lower parts of

  7. From Catastrophizing to Recovery: a pilot study of a single-session treatment for pain catastrophizing

    Directory of Open Access Journals (Sweden)

    Darnall BD

    2014-04-01

    Beth D Darnall, John A Sturgeon, Ming-Chih Kao, Jennifer M Hah, Sean C Mackey. Division of Pain Medicine, Stanford Systems Neuroscience and Pain Laboratory, Stanford University School of Medicine, Palo Alto, CA, USA. Background: Pain catastrophizing (PC) – a pattern of negative cognitive-emotional responses to real or anticipated pain – maintains chronic pain and undermines medical treatments. Standard PC treatment involves multiple sessions of cognitive behavioral therapy. To provide efficient treatment, we developed a single-session, 2-hour class that solely treats PC, entitled "From Catastrophizing to Recovery" (FCR). Objectives: To determine (1) feasibility of FCR; (2) participant ratings for acceptability, understandability, satisfaction, and likelihood to use the information learned; and (3) preliminary efficacy of FCR for reducing PC. Design and methods: Uncontrolled prospective pilot trial with a retrospective chart and database review component. Seventy-six patients receiving care at an outpatient pain clinic (the Stanford Pain Management Center) attended the class as free treatment, and 70 attendees completed and returned an anonymous survey immediately post-class. The Pain Catastrophizing Scale (PCS) was administered at class check-in (baseline) and at 2 and 4 weeks post-treatment. Within-subjects repeated measures analysis of variance (ANOVA) with Student's t-test contrasts was used to compare scores across time points. Results: All attendees who completed a baseline PCS were included as study participants (N=57; F=82%; mean age=50.2 years); the PCS was completed by 46 participants at week 2 and 35 participants at week 4. Participants had significantly reduced PC at both time points (P<0.0001) and large effect sizes were found (Cohen's d=0.85 and d=1.15). Conclusion: Preliminary data suggest that FCR is an acceptable and effective treatment for PC. Larger, controlled studies of longer duration are needed to determine durability of response, factors

  8. Uncertainty analysis for the BEACON-COLSS core monitoring system application

    International Nuclear Information System (INIS)

    Morita, T.; Boyd, W.A.; Seong, K.B.

    2005-01-01

    This paper covers the measurement uncertainty analysis of the BEACON-COLSS core monitoring system. The uncertainty evaluation is made by using a BEACON-COLSS simulation program. By simulating BEACON on-line operation for analytically generated reactor conditions, the accuracy of the 'measured' results can be evaluated by comparing them to the analytically generated 'truth'. The DNB power margin is evaluated based on Combustion Engineering's Modified Statistical Combination of Uncertainties (MSCU), using the CETOPD code for the DNBR calculation. A BEACON-COLSS simulation program for the uncertainty evaluation function has been established for plant applications. Qualification work has been completed for two Combustion Engineering plants. Results for the BEACON-COLSS measured peaking factors and DNBR power margin are plant-type dependent and are applicable to reload cores as long as the core geometry and detector layout are unchanged. (authors)

  9. Effects of microtubule mechanics on hydrolysis and catastrophes

    International Nuclear Information System (INIS)

    Müller, N; Kierfeld, J

    2014-01-01

    We introduce a model for microtubule (MT) mechanics containing lateral bonds between dimers in neighboring protofilaments, bending rigidity of dimers, and repulsive interactions between protofilaments modeling steric constraints, to investigate the influence of mechanical forces on hydrolysis and catastrophes. We use the allosteric dimer model, where tubulin dimers are characterized by an equilibrium bending angle, which changes from 0° to 22° upon hydrolysis of a dimer. This also affects the lateral interaction and bending energies and, thus, the mechanical equilibrium state of the MT. As hydrolysis gives rise to conformational changes in dimers, mechanical forces also influence hydrolysis, with mechanical energy changes modulating the hydrolysis rate. The interaction via the MT mechanics then gives rise to correlation effects in the hydrolysis dynamics, which have not been taken into account before. Assuming a dominant influence of mechanical energies on hydrolysis rates, we investigate the most probable hydrolysis pathways both for vectorial and random hydrolysis. Investigating the stability with respect to lateral bond rupture, we identify initiation configurations for catastrophes along the hydrolysis pathways and values for a lateral bond rupture force. If we allow for rupturing of lateral bonds between dimers in neighboring protofilaments above this threshold force, our model exhibits avalanche-like catastrophe events. (papers)
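
The assumed coupling between mechanics and chemistry can be summarized in one line: the hydrolysis rate of a dimer is its bare rate scaled by a Boltzmann factor in the mechanical energy change its conformational switch would cause. The sketch below is our paraphrase of that idea with illustrative numbers, not the paper's parameterization.

```python
import numpy as np

kB_T = 4.1e-21   # J, thermal energy near 300 K
k0 = 0.5         # 1/s, bare hydrolysis rate (illustrative)

def hydrolysis_rate(dE_mech):
    """Rate modulated by the mechanical energy change dE_mech (J) that
    hydrolyzing this dimer would cause: relaxation (dE < 0) accelerates,
    frustration (dE > 0) suppresses."""
    return k0 * np.exp(-dE_mech / kB_T)

for dE in (-2.0 * kB_T, 0.0, 2.0 * kB_T):
    print(f"dE = {dE / kB_T:+.0f} kT -> rate {hydrolysis_rate(dE):.3f} 1/s")
```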

  10. Uncertainty estimation and ensemble forecasting with a chemistry-transport model - application to the numerical simulation of air quality; Estimation de l'incertitude et prevision d'ensemble avec un modele de chimie transport - Application a la simulation numerique de la qualite de l'air

    Energy Technology Data Exchange (ETDEWEB)

    Mallet, V

    2005-12-15

    The aim of this work is the evaluation of the quality of a chemistry-transport model, not by a classical comparison with observations, but by the estimation of its uncertainties due to the input data, the model formulation and the numerical approximations. The study of these three sources of uncertainty is carried out with Monte Carlo simulations, multi-model simulations and comparisons between numerical schemes, respectively. A high uncertainty is found for ozone concentrations. To overcome the uncertainty-related limitations, one strategy is ensemble forecasting. By combining several models (up to 48) on the basis of past observations, forecasts can be significantly improved. This work was also the occasion to develop an innovative modeling system, named Polyphemus. (J.S.)
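
The model-combination step mentioned above can be sketched as a least-squares weighting fitted on past observations and applied to new forecasts. The synthetic "models" below (truth plus bias plus noise) and the unconstrained fit are our simplifications of the thesis's ensemble approach.

```python
import numpy as np

rng = np.random.default_rng(9)
n_models, n_past, n_future = 8, 500, 100

# Synthetic ozone "truth" and biased, noisy model predictions standing
# in for the chemistry-transport models of the ensemble.
truth = 40.0 + 10.0 * rng.standard_normal(n_past + n_future)
bias = rng.uniform(-8.0, 8.0, n_models)
preds = (truth[None, :] + bias[:, None]
         + 5.0 * rng.standard_normal((n_models, n_past + n_future)))

# Fit combination weights on the past window by least squares
# (unconstrained here; convexity-constrained variants are common).
w, *_ = np.linalg.lstsq(preds[:, :n_past].T, truth[:n_past], rcond=None)
combined = w @ preds[:, n_past:]

rmse = lambda e: np.sqrt(np.mean(e ** 2))
print("best single model RMSE:",
      min(rmse(preds[m, n_past:] - truth[n_past:]) for m in range(n_models)))
print("weighted ensemble RMSE:", rmse(combined - truth[n_past:]))
```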

  11. Uncertainty analysis of 99mTc-HEPIDA liver clearance determination

    International Nuclear Information System (INIS)

    Surma, M. J.

    2005-01-01

    The aim of the study was to obtain information on the accuracy and precision of 99mTc-HEPIDA hepatic (ClHp) and plasma (ClPl) clearances and to select an appropriate estimator of the measurement uncertainty of a single determination of these quantities. In a simulation (Monte Carlo) experiment, it was assumed that the recorded results of plasma and hepatic clearances, as obtained from 185 patients, provided authentic information about 99mTc-HEPIDA behaviour in the body over a wide range of the clearances studied. The time course of 99mTc-HEPIDA concentration in blood plasma has been described by means of a biexponential function with parameter values derived for each patient. For each patient, using these data and urinary excretion data, 5000 simulations were performed; in each of them, the directly measured numbers were substituted by simulated ones, obtained by varying the real ones using randomly generated values. These reflected errors of plasma and radioactive standard pipetting (from 1 to 5%) and the stochasticity of counting radioactive decay (1%). The times of blood sampling and urine voiding were also varied, assuming realistic uncertainty. The varied values were then used for computation of the simulated clearances. From the 5000 calculated clearances for each patient, mean values were calculated, as well as mean standard errors, standard deviations and the mean uncertainty of measurements using a widely accepted rule of partial error propagation and, in addition, a modified rule of the latter. Accuracy of clearance (ClPl, ClHp, ClUr) determination was assessed on the basis of comparison of mean values from simulations with those from directly recorded values. Precision was identified with the standard deviation of each of the 5000 simulations. The uncertainty thus obtained was compared with results of calculated traditional and modified uncertainty. There was good agreement between the standard deviation of the simulations with

  12. Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies

    Science.gov (United States)

    Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud

    Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents, with a fixed set of rules based on local information and possible central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.

  13. Creating catastrophes in the classroom

    Science.gov (United States)

    Andersson, Thommy

    2013-04-01

    Buildings, infrastructure and human life are being destroyed by wind and landslides. To interest and motivate pupils, and to help them understand abstract knowledge, practical experiments can be useful. These experiments show why strong winds circulate around tropical cyclones and how fluvial geological processes affect nature and communities. The experiments are easy to set up and the equipment is not expensive. Experiment 1: Exogenic water processes are often slow. This experiment simulates, in less than 40 minutes, water processes that can take thousands of years, and can be presented to and understood by pupils at all levels. Letting the pupils build up the scenery makes them more curious about the course of events, and during that time they see the geomorphological genesis of landforms such as landslides, sandurs, deltas, canyons, sedimentation and selective erosion. Placing small houses, bridges, etc. in the scenery can lead to discussions about natural catastrophes and community planning. Material needed for the experiment: a water bucket, an erosion gutter, clay (simulating rock), sand and smaller pebbles (simulating the soil), houses of "Monopoly" size, and tubes. Using a table with wheels makes it easy to reuse the result in other lessons. Installing a pump can turn the experiment into a closed-loop system, which can be used for presentations outside the classroom. Experiment 2: The Coriolis effect explains why the wind (a moving object) deflects as it moves; in the northern hemisphere the deflection is clockwise and in the southern hemisphere anti-clockwise. This abstract effect is often hard for upper secondary pupils to understand. This experiment makes the effect real and visible. Material needed for this experiment: a bucket, pipes and a string. At my school we cooperated with pupils from the Industrial Technology programme, who made a copper pipe construction. During the

  14. Study on China’s Earthquake Prediction by Mathematical Analysis and its Application in Catastrophe Insurance

    Science.gov (United States)

    Jianjun, X.; Bingjie, Y.; Rongji, W.

    2018-03-01

    The purpose of this paper was to improve the level of catastrophe insurance. Firstly, earthquake predictions were carried out using a mathematical analysis method. Secondly, foreign catastrophe insurance policies and models were compared. Thirdly, suggestions on catastrophe insurance for China were discussed. Further study should pay more attention to earthquake prediction by introducing big data.

  15. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which yields results closer to optimal with much faster solving times than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility of modelling and solving more realistic problems, which incorporate dynamism and uncertainty.
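    A minimal sketch of such an iterative analytical-simulation loop follows, under the assumption that the analytical stage is a deterministic optimisation and the simulation stage injects stochastic demand and availability; both model functions are invented stand-ins for the paper's mixed-integer programming and discrete-event components.

        import random

        random.seed(0)

        def analytical_optimum(capacity_factor):
            # stand-in for the analytical stage: pick an order quantity
            # given an assumed effective capacity (deterministic)
            demand_mean = 100.0
            return min(demand_mean, 120.0 * capacity_factor)

        def simulate(order_qty, n=2000):
            # stand-in for the discrete-event stage: realised throughput
            # under random demand and machine availability
            total = 0.0
            for _ in range(n):
                demand = random.gauss(100.0, 15.0)
                avail = random.uniform(0.8, 1.0)
                total += min(order_qty * avail, max(demand, 0.0))
            return total / n

        capacity_factor, prev = 1.0, float("inf")
        for it in range(20):
            q = analytical_optimum(capacity_factor)
            realised = simulate(q)
            # feed simulated (uncertain) performance back into the analytical model
            capacity_factor = realised / q
            if abs(realised - prev) < 0.1:   # pre-determined termination criterion
                break
            prev = realised
        print(it, q, realised)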

  16. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
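    The core idea, propagating uncertainty at the lowest level of the simulation loop rather than wrapping the whole code in an outer sampling loop, can be caricatured in a few lines of NumPy. This sketch uses simple sample blocking on a scalar decay equation rather than stochastic Galerkin coefficients, so it only illustrates the memory-access argument, not the method itself.

        import numpy as np

        rng = np.random.default_rng(0)

        n_samples = 256
        K = rng.uniform(0.5, 1.5, n_samples)   # uncertain decay-rate samples
        U = np.ones(n_samples)                 # one solver state per sample

        dt = 0.01
        for _ in range(1000):
            # the whole ensemble is advanced in a single fused array operation,
            # so uncertainty is carried inside the time-step loop (contiguous
            # memory access, no outer loop over samples)
            U = U + dt * (-K * U)

        print(U.mean(), U.std())               # propagated uncertainty at t = 10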

  17. Special software for computing the special functions of wave catastrophes

    Directory of Open Access Journals (Sweden)

    Andrey S. Kryukovsky

    2015-01-01

    Full Text Available The method of ordinary differential equations in the context of calculating the special functions of wave catastrophes is considered. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development process of special software for calculating special functions, and questions of portability, extensibility and interoperability.
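    For the simplest wave catastrophe (the fold), the special function is the Airy function, which satisfies y'' = x*y, so the method of ordinary differential equations amounts to integrating this equation from known initial values. A sketch using SciPy, with the library's reference value shown for comparison; this is a generic illustration of the ODE method, not the authors' software.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.special import gamma, airy

        # Airy equation y'' = x*y, the special function of the fold catastrophe
        def rhs(x, y):
            return [y[1], x * y[0]]

        y0 = [3 ** (-2 / 3) / gamma(2 / 3),        # Ai(0)
              -(3 ** (-1 / 3)) / gamma(1 / 3)]     # Ai'(0)

        sol = solve_ivp(rhs, (0.0, -8.0), y0, dense_output=True,
                        rtol=1e-10, atol=1e-12)

        x = -5.0
        print(sol.sol(x)[0], airy(x)[0])   # ODE result vs reference Ai(x)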

  18. Statistical emulation of a tsunami model for sensitivity analysis and uncertainty quantification

    Directory of Open Access Journals (Sweden)

    A. Sarri

    2012-06-01

    Full Text Available Due to the catastrophic consequences of tsunamis, early warnings need to be issued quickly in order to mitigate the hazard. Additionally, there is a need to represent the uncertainty in the predictions of tsunami characteristics corresponding to the uncertain trigger features (e.g. the position, shape and speed of a landslide, or the sea floor deformation associated with an earthquake). Unfortunately, computer models are expensive to run. This leads to significant delays in predictions and makes the uncertainty quantification impractical. Statistical emulators run almost instantaneously and may represent well the outputs of the computer model. In this paper, we use the outer product emulator to build a fast statistical surrogate of a landslide-generated tsunami computer model. This Bayesian framework enables us to build the emulator by combining prior knowledge of the computer model properties with a few carefully chosen model evaluations. The good performance of the emulator is validated using the leave-one-out method.
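    The workflow (train a fast statistical surrogate on a few expensive model runs, then validate it by leave-one-out) can be sketched generically. Here a Gaussian-process regressor stands in for the paper's outer product emulator, and the "tsunami model" is a cheap invented function used only to generate training data.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel
        from sklearn.model_selection import LeaveOneOut

        rng = np.random.default_rng(3)

        # hypothetical design: (landslide volume, speed) -> max wave height
        X = rng.uniform([0.1, 1.0], [1.0, 10.0], size=(30, 2))
        def expensive_tsunami_model(x):      # invented stand-in for the simulator
            return 2.0 * x[0] ** 0.7 * np.log1p(x[1])
        y = np.array([expensive_tsunami_model(x) for x in X])

        gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.3, 3.0]),
                                      normalize_y=True)

        # leave-one-out validation, as in the record
        errs = []
        for tr, te in LeaveOneOut().split(X):
            gp.fit(X[tr], y[tr])
            mu = gp.predict(X[te])
            errs.append(float(mu[0] - y[te][0]))
        print("LOO RMSE:", np.sqrt(np.mean(np.square(errs))))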

  19. A unified approach of catastrophic events

    Directory of Open Access Journals (Sweden)

    S. Nikolopoulos

    2004-01-01

    Full Text Available Although there is an accumulated body of theoretical, computational, and numerical work (catastrophe theory, bifurcation theory, stochastic and deterministic chaos theory), there is a widespread feeling that these matters do not completely cover the physics of real catastrophic events. Recent studies have suggested that a large variety of complex processes, including earthquakes, heartbeats, and neuronal dynamics, exhibit statistical similarities. Here we study, in terms of complexity and nonlinear techniques, whether isomorphic signatures emerge that indicate the transition from the normal state to both geological and biological shocks. In the last 15 years, the study of Complex Systems has emerged as a recognized field in its own right, although a good definition of what a complex system actually is has remained elusive. A basic reason for our interest in complexity is the striking similarity in behaviour close to irreversible phase transitions among systems that are otherwise quite different in nature. It is by now recognized that pre-seismic electromagnetic time series contain valuable information about the earthquake preparation process, which cannot be extracted without the use of considerable computational power, probably in connection with computer algebra techniques. This paper presents an analysis whose aim is to indicate the approach of global instability in the pre-focal area. Nonlinear characteristics are studied by applying two techniques, namely Correlation Dimension Estimation and Approximate Entropy. These two nonlinear techniques lead to coherent conclusions, and could cooperate with an independent fractal spectral analysis to provide a detection of the emergence of the nucleation phase of the impending catastrophic event. In the context of a similar mathematical background, it would be interesting to augment this description of pre-seismic electromagnetic anomalies in order to cover biological
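    Of the two techniques named, Approximate Entropy is compact enough to show in full. The sketch below follows the standard Pincus definition (including self-matches); the test signals simply demonstrate that a regular signal scores lower than a noisy one, which is the kind of regularity shift sought before a catastrophic event.

        import numpy as np

        def approximate_entropy(x, m=2, r_frac=0.2):
            """Approximate Entropy: lower values indicate more regularity,
            a candidate signature of the pre-catastrophic (nucleation) phase."""
            x = np.asarray(x, dtype=float)
            r = r_frac * x.std()
            def phi(m):
                emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
                # Chebyshev distances between all pairs of embedded vectors
                d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
                c = (d <= r).mean(axis=1)       # self-matches included
                return np.log(c).mean()
            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(0)
        print(approximate_entropy(np.sin(0.1 * np.arange(500))))  # regular -> low
        print(approximate_entropy(rng.normal(size=500)))          # noisy  -> high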

  20. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and the subsequent bias correction method. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta change method applied at the grid scale. Flux and state hydrological outputs, which integrate responses over time and space, showed more sensitivity to mean spatial biases in precipitation and less to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  1. Gravothermal catastrophe of finite amplitude

    Energy Technology Data Exchange (ETDEWEB)

    Hachisu, I; Sugimoto, D [Tokyo Univ. (Japan). Coll. of General Education; Nakada, Y; Nomoto, K

    1978-08-01

    Development of the gravothermal catastrophe is followed numerically for a self-gravitating gas system enclosed by an adiabatic wall, which is isothermal in the initial state. It is found that the catastrophe proceeds to one of two final fates, depending on the initial perturbation. When the initial perturbation produces a temperature distribution decreasing outward, contraction proceeds in the central region and the central density increases without limit as heat flows outward. When the initial temperature distribution is increasing outward, on the other hand, the central region expands as heat flows into it. The density contrast is then reduced, and the system finally reaches another isothermal configuration with the same energy but a lower density contrast and a higher entropy. This final configuration is gravothermally stable and may be called a thermal system. In the former case of unlimited contraction, the final density profile is determined essentially by the density and temperature dependence of the heat conductivity. In the case of a system under an inverse-square-law force, the final density distribution is well approximated by a power law, so that the mass contained in the condensed core is relatively small. The possibility of black hole formation in stellar systems is also discussed.

  2. Gravothermal catastrophe of finite amplitude

    International Nuclear Information System (INIS)

    Hachisu, Izumi; Sugimoto, Daiichiro; Nakada, Yoshikazu; Nomoto, Ken-ichi.

    1978-01-01

    Development of the gravothermal catastrophe is followed numerically for a self-gravitating gas system enclosed by an adiabatic wall, which is isothermal in the initial state. It is found that the catastrophe proceeds to one of two final fates, depending on the initial perturbation. When the initial perturbation produces a temperature distribution decreasing outward, contraction proceeds in the central region and the central density increases without limit as heat flows outward. When the initial temperature distribution is increasing outward, on the other hand, the central region expands as heat flows into it. The density contrast is then reduced, and the system finally reaches another isothermal configuration with the same energy but a lower density contrast and a higher entropy. This final configuration is gravothermally stable and may be called a thermal system. In the former case of unlimited contraction, the final density profile is determined essentially by the density and temperature dependence of the heat conductivity. In the case of a system under an inverse-square-law force, the final density distribution is well approximated by a power law, so that the mass contained in the condensed core is relatively small. The possibility of black hole formation in stellar systems is also discussed. (author)

  3. Modeling, design, and simulation of systems with uncertainties

    CERN Document Server

    Rauh, Andreas

    2011-01-01

    This three-fold contribution to the field covers both theory and current research in algorithmic approaches to uncertainty handling, real-life applications such as robotics and biomedical engineering, and fresh approaches to reliably implementing software.

  4. Catastrophic risk : Social influences on insurance decisions

    NARCIS (Netherlands)

    Krawczyk, Michal; Trautmann, Stefan; van de Kuilen, Gijs

    We study behavioral patterns of insurance demand for low-probability large-loss events (catastrophic losses). Individual patterns of belief formation and risk attitude that were suggested in the behavioral decisions literature emerge robustly in the current set of insurance choices. However, social

  5. Diagnosis and management of catastrophic antiphospholipid syndrome.

    Science.gov (United States)

    Carmi, Or; Berla, Maya; Shoenfeld, Yehuda; Levy, Yair

    2017-04-01

    Catastrophic antiphospholipid syndrome (CAPS) is a rare, life-threatening disease. In 1992, Asherson defined it as a widespread coagulopathy related to the antiphospholipid antibodies (aPL). CAPS requires rapid diagnosis and prompt initiation of treatment. Areas covered: This paper discusses all aspects of CAPS, including its pathophysiology, clinical manifestations, diagnostic approaches, differential diagnoses, management and treatment of relapsing CAPS, and its prognosis. To obtain the information used in this review, scientific databases were searched using the key words antiphospholipid antibodies, catastrophic antiphospholipid syndrome, hemolytic anemia, lupus anticoagulant, and thrombotic microangiopathic hemolytic anemia. Expert commentary: CAPS is a rare variant of the antiphospholipid syndrome (APS). It is characterized by thrombosis in multiple organs and a cytokine storm developing over a short period, with histopathologic evidence of multiple microthromboses, and laboratory confirmation of high aPL titers. This review discusses the diagnostic challenges and current approaches to the treatment of CAPS.

  6. Intestinal malrotation and catastrophic volvulus in infancy.

    Science.gov (United States)

    Lee, Henry Chong; Pickard, Sarah S; Sridhar, Sunita; Dutta, Sanjeev

    2012-07-01

    Intestinal malrotation in the newborn is usually diagnosed after signs of intestinal obstruction, such as bilious emesis, and corrected with the Ladd procedure. The objective of this report is to describe the presentation of severe cases of midgut volvulus presenting in infancy, and to discuss the characteristics of these cases. We performed a 7-year review at our institution and present two cases of catastrophic midgut volvulus presenting in the post-neonatal period, ending in death soon after the onset of symptoms. These two patients also had significant laboratory abnormalities compared to patients with more typical presentations resulting in favorable outcomes. Although most cases of intestinal malrotation in infancy can be treated successfully, in some circumstances, patients' symptoms may not be detected early enough for effective treatment, and therefore may result in catastrophic midgut volvulus and death. Copyright © 2012 Elsevier Inc. All rights reserved.

  7. Dyadic analysis of child and parent trait and state pain catastrophizing in the process of children's pain communication.

    Science.gov (United States)

    Birnie, Kathryn A; Chambers, Christine T; Chorney, Jill; Fernandez, Conrad V; McGrath, Patrick J

    2016-04-01

    When explored separately, child and parent catastrophic thoughts about child pain show robust negative relations with child pain. The objective of this study was to conduct a dyadic analysis to elucidate intrapersonal and interpersonal influences of child and parent pain catastrophizing on aspects of pain communication, including observed behaviours and perceptions of child pain. A community sample of 171 dyads including children aged 8 to 12 years (89 girls) and parents (135 mothers) rated pain catastrophizing (trait and state versions) and child pain intensity and unpleasantness following a cold pressor task. Child pain tolerance was also assessed. Parent-child interactions during the cold pressor task were coded for parent attending, nonattending, and other talk, and child symptom complaints and other talk. Data were analyzed using the actor-partner interdependence model and hierarchical multiple regressions. Children reporting higher state pain catastrophizing had greater symptom complaints regardless of level of parent state pain catastrophizing. Children reporting low state pain catastrophizing had similar high levels of symptom complaints, but only when parents reported high state pain catastrophizing. Higher child and parent state and/or trait pain catastrophizing predicted their own ratings of higher child pain intensity and unpleasantness, with child state pain catastrophizing additionally predicting parent ratings. Higher pain tolerance was predicted by older child age and lower child state pain catastrophizing. These newly identified interpersonal effects highlight the relevance of the social context to children's pain expressions and parent perceptions of child pain. Both child and parent pain catastrophizing warrant consideration when managing child pain.

  8. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes; Analisis de incertidumbre y sensibilidad en la simulacion de escenarios con los codigos RELAP/SCDAP y MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Garcia J, T.; Cardenas V, J., E-mail: tonatiuh.garcia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)

    2015-09-15

    A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The chosen methodology is a probabilistic method that propagates the uncertainty of the input parameters to the output parameters. It therefore begins with the selection of input parameters that are considered uncertain and of high importance in the scenario, because of their direct effect on the output variable of interest. These parameters were randomly sampled according to variation intervals or probability distribution functions assigned by expert judgment, generating a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, using ordered statistics and the Wilks formula, it was determined that the minimum number of runs required to obtain uncertainty bands covering 95% of the population at a 95% confidence level is 93; it is important to mention that in this method the number of runs does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty analysis of transients for the RELAP/SCDAP code. For the MELCOR severe accident code, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated the closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, a station blackout (SBO, total loss of power) accident was simulated with the MELCOR code, obtaining the uncertainty band for the
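    The figure of 93 runs quoted above follows from the Wilks formula for two-sided tolerance limits at the 95%/95% level; the corresponding one-sided count is 59. A short sketch that recovers both numbers:

        def wilks_one_sided(beta=0.95, gamma=0.95):
            """Smallest N with one-sided coverage: 1 - g**N >= beta."""
            n = 1
            while 1 - gamma ** n < beta:
                n += 1
            return n

        def wilks_two_sided(beta=0.95, gamma=0.95):
            """Smallest N with two-sided coverage:
            1 - g**N - N*(1-g)*g**(N-1) >= beta."""
            n = 2
            while 1 - gamma ** n - n * (1 - gamma) * gamma ** (n - 1) < beta:
                n += 1
            return n

        print(wilks_one_sided())   # 59
        print(wilks_two_sided())   # 93, as quoted in the record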

  9. Financial catastrophe and poverty impacts of out-of-pocket health payments in Turkey.

    Science.gov (United States)

    Özgen Narcı, Hacer; Şahin, İsmet; Yıldırım, Hasan Hüseyin

    2015-04-01

    To determine the prevalence of catastrophic health payments, examine the determinants of catastrophic expenditures, and assess the poverty impact of out-of-pocket (OOP) payments. Data came from the 2004 to 2010 Household Budget Survey. Catastrophic health spending was defined by health payments as percentage of household consumption expenditures and capacity to pay at a set of thresholds. The poverty impact was evaluated by poverty head counts and poverty gaps before and after OOP health payments. The percentage of households that catastrophically spent their consumption expenditure and capacity to pay increased from 2004 to 2010, regardless of the threshold used. Households with a share of more than 40% health spending in both consumption expenditure and capacity to pay accounted for less than 1% across years. However, when a series of potential confounders were taken into account, the study found statistically significantly increased risk for the lowest threshold and decreased risk for the highest threshold in 2010 relative to the base year. Household income, size, education, senior and under 5-year-old members, health insurance, disabled members, payment for inpatient care and settlement were also statistically significant predictors of catastrophic health spending. Overall, poverty head counts were below 1%. Poverty gaps reached a maximum of 0.098%, with an overall increase in 2010 compared to 2004. Catastrophe and poverty increased from 2004 to 2010. However, given that the realization of some recent policies will affect the financial burden of OOP payments on households, the findings of this study need to be replicated.
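    The threshold-based definitions used here (out-of-pocket share of consumption, and poverty head counts before and after payments) reduce to a few lines of arithmetic. The sketch below uses synthetic data with invented distributions, purely to show the calculation, not the Turkish survey figures.

        import numpy as np

        rng = np.random.default_rng(2)

        # hypothetical survey: household consumption and OOP health spending
        consumption = rng.lognormal(mean=9.0, sigma=0.6, size=10_000)
        oop = consumption * rng.beta(1.0, 20.0, size=10_000)

        share = oop / consumption
        for threshold in (0.10, 0.25, 0.40):
            print(f"households with OOP > {threshold:.0%} of consumption: "
                  f"{(share > threshold).mean():.1%}")

        # impoverishment: head counts before and after netting out OOP payments
        poverty_line = 5_000.0
        before = (consumption < poverty_line).mean()
        after = ((consumption - oop) < poverty_line).mean()
        print(f"poverty head count: {before:.1%} -> {after:.1%}")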

  10. A comparison of catastrophic injury incidence rates by Provincial Rugby Union in South Africa.

    Science.gov (United States)

    Badenhorst, Marelise; Verhagen, Evert A L M; van Mechelen, Willem; Lambert, Michael I; Viljoen, Wayne; Readhead, Clint; Baerecke, Gail; Brown, James C

    2017-07-01

    To compare catastrophic injury rates between the 14 South African Provincial Rugby Unions. A prospective, population-based study conducted among all South African Unions between 2008 and 2014. Player numbers in each Union were obtained from South African Rugby's 2013 Census. Catastrophic injuries were analysed from BokSmart's serious injury database. Incidence rates with 95% Confidence Intervals were calculated. Catastrophic injuries (Acute Spinal Cord Injuries and catastrophic Traumatic Brain Injuries) within Unions were compared statistically, using a Poisson regression with Incidence Rate Ratios (IRR) and a 95% confidence level (p < 0.05). Catastrophic injury incidence rates per Union ranged from 1.8 per 100000 players (95% CI: 0.0-6.5) to 7.9 (95% CI: 0.0-28.5) per 100000 players per year. The highest incidence rate of permanent outcome Acute Spinal Cord Injuries was reported at 7.1 per 100000 players (95% CI: 0.0-17.6). Compared to this Union, five (n=5/14, 36%) of the Unions had significantly lower incidence rates of Acute Spinal Cord Injuries. Proportionately, three Unions had more Acute Spinal Cord Injuries and three other Unions had more catastrophic Traumatic Brain Injuries. There were significant differences in the catastrophic injury incidence rates amongst the Provincial Unions in South Africa. Future studies should investigate the underlying reasons contributing to these provincial differences. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.

  11. Implementation of unscented transform to estimate the uncertainty of a liquid flow standard system

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Sejong; Choi, Hae-Man; Yoon, Byung-Ro; Kang, Woong [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2017-03-15

    First-order partial derivatives of a mathematical model are an essential part of evaluating the measurement uncertainty of a liquid flow standard system according to the Guide to the expression of uncertainty in measurement (GUM). Although the GUM provides a straightforward method to evaluate the measurement uncertainty of volume flow rate, the first-order partial derivatives can be complicated. The mathematical model of volume flow rate in a liquid flow standard system has a cross-correlation between liquid density and the buoyancy correction factor, and this cross-correlation can make derivation of the first-order partial derivatives difficult. Monte Carlo simulation can be used as an alternative method to circumvent this difficulty in partial differentiation. However, Monte Carlo simulation requires large computational resources for a correct simulation because it must consider the completeness issue of whether an ideal or a real operator conducts the experiment to evaluate the measurement uncertainty. Thus, the Monte Carlo simulation needs a large number of samples to ensure that the uncertainty evaluation is as close to the GUM as possible. Unscented transform can alleviate this problem because it can be regarded as a Monte Carlo simulation with an infinite number of samples; that is, unscented transform considers the uncertainty evaluation with respect to the ideal operator. Thus, unscented transform can evaluate the measurement uncertainty the same as the uncertainty that the GUM provides.
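    A generic scaled unscented transform is easy to state: build 2n+1 sigma points from the input mean and covariance (including the cross-correlation term), push them through the nonlinear model, and reassemble the output mean and covariance. The flow-rate function and all numbers below are illustrative, not the actual liquid flow standard model.

        import numpy as np

        def unscented_transform(f, mean, cov, alpha=1.0, beta=2.0, kappa=0.0):
            """Propagate (mean, cov) through a nonlinear f using 2n+1 sigma points."""
            n = len(mean)
            lam = alpha ** 2 * (n + kappa) - n
            S = np.linalg.cholesky((n + lam) * cov)       # matrix square root
            sigma = np.vstack([mean, mean + S.T, mean - S.T])
            wm = np.full(2 * n + 1, 0.5 / (n + lam))      # mean weights
            wc = wm.copy()                                # covariance weights
            wm[0] = lam / (n + lam)
            wc[0] = wm[0] + 1.0 - alpha ** 2 + beta
            Y = np.array([f(s) for s in sigma])
            m = wm @ Y
            c = ((Y - m).T * wc) @ (Y - m)
            return m, c

        # toy flow-rate model with a cross-correlated density/buoyancy-like pair
        f = lambda x: np.atleast_1d(x[0] / (x[1] * (1.0 - x[2])))
        mean = np.array([10.0, 998.0, 0.0012])   # rate, density, buoyancy term
        cov = np.diag([1e-2, 2.5e-1, 1e-8])
        cov[1, 2] = cov[2, 1] = 2e-5             # the cross-correlation
        m, c = unscented_transform(f, mean, cov)
        print(m, np.sqrt(np.diag(c)))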

  12. Uncertainty in prediction and simulation of flow in sewer systems

    DEFF Research Database (Denmark)

    Breinholt, Anders

    ... the uncertainty in the state variables. Additionally, the observation noise is accounted for by a separate observation noise term. This approach is also referred to as stochastic grey-box modelling. A state-dependent diffusion term was developed using a Lamperti transformation of the states and implemented ... performance beyond the one-step prediction. Reliability was satisfactory for the one-step prediction, but predictions became increasingly biased as the prediction horizon was extended, particularly in rainy periods. GLUE was applied for estimating uncertainty in such a way that the selection of behavioural parameter sets continued ... Conversely, the parameter estimates of the stochastic approach are physically meaningful. This thesis has contributed to developing simplified rainfall-runoff models that are suitable for model predictive control of urban drainage systems and that take uncertainty into account.

  13. Public policy and risk financing strategies for global catastrophe risk management - the role of global risk initiatives

    Science.gov (United States)

    McSharry, Patrick; Mitchell, Andrew; Anderson, Rebecca

    2010-05-01

    Decision-makers in both public and private organisations depend on accurate data and scientific understanding to adequately address climate change and the impact of extreme events. The financial impacts of catastrophes on populations and infrastructure can be offset through effective risk transfer mechanisms, structured to reflect the specific perils and levels of exposure to be covered. Optimal strategies depend on the likely socio-economic impact, the institutional framework, the overall objectives of the covers placed, and the expected frequency and severity of loss potential. The diversity of approaches across different countries has been documented by the Spanish "Consorcio de Compensación de Seguros". We discuss why international public/private partnerships are necessary for addressing the risk of natural catastrophes. International initiatives such as the Global Earthquake Model (GEM) and the World Forum of Catastrophe Programmes (WFCP) can provide effective guidelines for constructing natural catastrophe schemes. The World Bank has been instrumental in the creation of many of the existing schemes, such as the Turkish Catastrophe Insurance Pool, the Caribbean Catastrophe Risk Insurance Facility and the Mongolian Index-Based Livestock Insurance Program. We review existing schemes and report on best practice in relation to providing protection against natural catastrophe perils. The suitability of catastrophe modelling approaches to support schemes across the world is discussed, and we identify opportunities to improve risk assessment for such schemes through transparent frameworks for quantifying, pricing, sharing and financing catastrophe risk on a local and global basis.

  14. Catastrophe risk data scoping for disaster risk finance in Asia

    Science.gov (United States)

    Millinship, Ian; Revilla-Romero, Beatriz

    2017-04-01

    Developing countries across Latin America, Africa, and Asia are some of the most exposed to natural catastrophes in the world. Over the last 20 years, Asia has borne almost half the estimated global economic cost of natural disasters, around $53 billion annually. Losses from natural disasters can damage growth and hamper economic development, and unlike in developed countries, where risk is reallocated through re/insurance, these countries typically rely on budget reallocations and donor assistance in order to attempt to meet financing needs. There is currently an active international dialogue on the need to increase access to disaster risk financing solutions in Asia. The World Bank-GFDRR Disaster Risk Financing and Insurance Program, with financial support from the Rockefeller Foundation, is currently working to develop regional options for disaster risk financing for developing countries in Asia. The first stage of this process has been to evaluate available catastrophe data suitable to support the design and implementation of disaster risk financing mechanisms in selected Asian countries. This project was carried out by a consortium of JBA Risk Management, JBA Consulting, ImageCat and Cat Risk Intelligence. The project focuses on investigating potential data sources for fourteen selected countries in Asia, for the flood, tropical cyclone, earthquake and drought perils, and was carried out in four stages. The first phase focused on identifying and cataloguing live/dynamic hazard data sources, such as hazard gauging networks or earth observation datasets, which could be used to inform a parametric trigger. Live data sources were identified that provide credibility, transparency, independence, frequent reporting, consistency and stability. Data were catalogued at the regional level, and prioritised at the local level for five countries: Bangladesh, Indonesia, Pakistan, Sri Lanka and Viet Nam. The second phase was to identify, catalogue and evaluate catastrophe risk models

  15. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low or data is scarce leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, are the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time-series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input

  16. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here ... in each measured/observed datapoint, an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter ...

  17. Safeguards as catastrophic risk management: insights and projections

    International Nuclear Information System (INIS)

    Leffer, T.N.

    2013-01-01

    The system of international agreements designed to prevent the use of nuclear weapons and to control the spread of nuclear weapons, materials and technologies (collectively referred to as the nuclear arms control and nonproliferation regimes) is posited as humanity's first attempt to mitigate a man-made global catastrophic risk. By extrapolating general principles of government response to risk from the arms control and nonproliferation regimes, a model of international regime building for catastrophic risk mitigation is constructed. This model provides the context for an examination of the system of safeguards implemented by the International Atomic Energy Agency (IAEA), which serves as the nuclear nonproliferation regime's verification and enforcement mechanism and thereby constitutes the regime's most completely developed discrete mechanism for risk mitigation (a 'system within a system'). An assessment of the history, evolution and effectiveness of the IAEA safeguards system in the context of the regimes-as-risk-mitigation model reveals some general principles for risk-mitigation regimes, which are then applied to the safeguards system to identify ways in which it may be strengthened. Finally, the IAEA safeguards system is posited as the prototype verification/enforcement mechanism for future risk mitigation regimes that governments will be compelled to create in the face of new global catastrophic risks that technological advance will inevitably create. (author)

  18. Optimism Moderates the Influence of Pain Catastrophizing on Shoulder Pain Outcome: A Longitudinal Analysis.

    Science.gov (United States)

    Coronado, Rogelio A; Simon, Corey B; Lentz, Trevor A; Gay, Charles W; Mackie, Lauren N; George, Steven Z

    2017-01-01

    Study Design: Secondary analysis of prospectively collected data. Background: An abundance of evidence has highlighted the influence of pain catastrophizing and fear avoidance on clinical outcomes. Less is known about the interaction of positive psychological resources with these pain-associated distress factors. Objective: To assess whether optimism moderates the influence of pain catastrophizing and fear avoidance on 3-month clinical outcomes in patients with shoulder pain. Methods: Data from 63 individuals with shoulder pain (mean ± SD age, 38.8 ± 14.9 years; 30 female) were examined. Demographic, psychological, and clinical characteristics were obtained at baseline. Validated measures were used to assess optimism (Life Orientation Test-Revised), pain catastrophizing (Pain Catastrophizing Scale), fear avoidance (Fear-Avoidance Beliefs Questionnaire physical activity subscale), shoulder pain intensity (Brief Pain Inventory), and shoulder function (Pennsylvania Shoulder Score function subscale). Shoulder pain and function were reassessed at 3 months. Regression models assessed the influence of (1) pain catastrophizing and optimism and (2) fear avoidance and optimism. The final multivariable models controlled for age, sex, education, and baseline scores, and included 3-month pain intensity and function as separate dependent variables. Results: Shoulder pain (mean difference, -1.6; 95% confidence interval [CI]: -2.1, -1.2) and function (mean difference, 2.4; 95% CI: 0.3, 4.4) improved over 3 months. In multivariable analyses, there was an interaction between pain catastrophizing and optimism (β = 0.19; 95% CI: 0.02, 0.35) for predicting 3-month shoulder function (F = 16.8, R² = 0.69); higher optimism lessened the influence of pain catastrophizing on function. There was no evidence of significant moderation of fear-avoidance beliefs for 3-month shoulder pain (P = .090) or function (P = .092). Conclusion: Optimism decreased the negative influence of pain

  19. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.

  20. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly in the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences
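    A minimal sketch of the kind of sensitivity question the EE-USIM asks: sample consequences from a two-sided power (TSP) distribution and watch how an extreme-event measure (here a 99th-percentile consequence) responds when an elicited distribution parameter, the upper bound, is varied. The inverse-CDF sampler follows the standard TSP form; all parameter values are invented and the example is not the EE-USIM itself.

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_tsp(a, m, b, p, size):
            """Two-sided power distribution sampled via its inverse CDF."""
            u = rng.uniform(size=size)
            F_m = (m - a) / (b - a)      # probability mass below the mode
            return np.where(
                u < F_m,
                a + (m - a) * (u / F_m) ** (1 / p),
                b - (b - m) * ((1 - u) / (1 - F_m)) ** (1 / p),
            )

        def tail_consequence(b_upper, q=0.99):
            x = sample_tsp(0.0, 2.0, b_upper, 2.5, 200_000)
            return np.quantile(x, q)

        # sensitivity of the extreme (99th-percentile) consequence to
        # uncertainty in the expert-elicited upper bound
        for b in (8.0, 10.0, 12.0):
            print(b, round(tail_consequence(b), 3))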

  1. RELAP5 simulation of surge line break accident using combined and best estimate plus uncertainty approaches

    International Nuclear Information System (INIS)

    Kristof, Marian; Kliment, Tomas; Petruzzi, Alessandro; Lipka, Jozef

    2009-01-01

    Licensing calculations in a majority of countries worldwide still rely on the application of a combined approach: a best-estimate computer code without evaluation of the code models' uncertainty, together with conservative assumptions on initial and boundary conditions, on the availability of systems and components, and additional conservative assumptions. However, the best estimate plus uncertainty (BEPU) approach, representing the state of the art in safety analysis, has a clear potential to replace the currently used combined approach. There are several applications of the BEPU approach in licensing calculations, but some questions remain under discussion, notably from the regulatory point of view. In order to find a proper solution to these questions and to support the BEPU approach in becoming a standard approach for licensing calculations, a broad comparison of both approaches for various transients is necessary. Results of one such comparison, for the VVER-440/213 NPP pressurizer surge line break event, are described in this paper. A Kv-scaled simulation based on the PH4-SLB experiment from the PMK-2 integral test facility, applying its volume and power scaling factors, is performed for qualitative assessment of the RELAP5 computer code calculation using the VVER-440/213 plant model. Existing hardware differences are identified and explained. The CIAU method is adopted for the uncertainty evaluation. Results using the combined and BEPU approaches are in agreement with the experimental values from the PMK-2 facility. Only minimal differences between the combined and BEPU approaches have been observed in the evaluation of the safety margins for the peak cladding temperature. Benefits of the CIAU uncertainty method are highlighted.

  2. Effects of a Pain Catastrophizing Induction on Sensory Testing in Women with Chronic Low Back Pain: A Pilot Study

    Directory of Open Access Journals (Sweden)

    Chloe J. Taub

    2017-01-01

    Full Text Available Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic low back pain were assigned in blocks to an experimental condition, either a psychologist-led 10-minute pain catastrophizing induction or a control (10-minute rest period). All participants underwent a baseline round of several quantitative sensory testing (QST) tasks, followed by the pain catastrophizing induction or the rest period, and then a second round of the same QST tasks. The catastrophizing induction appeared to increase state pain catastrophizing levels. Changes in QST pain were detected for two of the QST tasks administered, weighted pin pain and mechanical allodynia. Although there is a need to replicate our preliminary results with a larger sample, study findings suggest a potential relationship between induced pain catastrophizing and central sensitization of pain. Clarification of the mechanisms through which catastrophizing affects pain modulatory systems may yield useful clinical insights into the treatment of chronic pain.

  3. Effects of a Pain Catastrophizing Induction on Sensory Testing in Women with Chronic Low Back Pain: A Pilot Study.

    Science.gov (United States)

    Taub, Chloe J; Sturgeon, John A; Johnson, Kevin A; Mackey, Sean C; Darnall, Beth D

    2017-01-01

    Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic low back pain were assigned in blocks to an experimental condition, either a psychologist-led 10-minute pain catastrophizing induction or a control (10-minute rest period). All participants underwent a baseline round of several quantitative sensory testing (QST) tasks, followed by the pain catastrophizing induction or the rest period, and then a second round of the same QST tasks. The catastrophizing induction appeared to increase state pain catastrophizing levels. Changes in QST pain were detected for two of the QST tasks administered, weighted pin pain and mechanical allodynia. Although there is a need to replicate our preliminary results with a larger sample, study findings suggest a potential relationship between induced pain catastrophizing and central sensitization of pain. Clarification of the mechanisms through which catastrophizing affects pain modulatory systems may yield useful clinical insights into the treatment of chronic pain.

  4. Effects of a Pain Catastrophizing Induction on Sensory Testing in Women with Chronic Low Back Pain: A Pilot Study

    Science.gov (United States)

    Sturgeon, John A.; Johnson, Kevin A.

    2017-01-01

    Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic low back pain were assigned in blocks to an experimental condition, either a psychologist-led 10-minute pain catastrophizing induction or a control (10-minute rest period). All participants underwent a baseline round of several quantitative sensory testing (QST) tasks, followed by the pain catastrophizing induction or the rest period, and then a second round of the same QST tasks. The catastrophizing induction appeared to increase state pain catastrophizing levels. Changes in QST pain were detected for two of the QST tasks administered, weighted pin pain and mechanical allodynia. Although there is a need to replicate our preliminary results with a larger sample, study findings suggest a potential relationship between induced pain catastrophizing and central sensitization of pain. Clarification of the mechanisms through which catastrophizing affects pain modulatory systems may yield useful clinical insights into the treatment of chronic pain. PMID:28348505

  5. Uncertainty Quantification Analysis of Both Experimental and CFD Simulation Data of a Bench-scale Fluidized Bed Gasifier

    Energy Technology Data Exchange (ETDEWEB)

    Shahnam, Mehrdad [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate; Gel, Aytekin [ALPEMI Consulting, LLC, Phoeniz, AZ (United States); Subramaniyan, Arun K. [GE Global Research Center, Niskayuna, NY (United States); Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate; Dietiker, Jean-Francois [West Virginia Univ. Research Corporation, Morgantown, WV (United States)

    2017-10-02

    Adequate assessment of the uncertainties in modeling and simulation is becoming an integral part of simulation-based engineering design. The goal of this study is to demonstrate the application of a non-intrusive Bayesian uncertainty quantification (UQ) methodology in multiphase (gas-solid) flows with experimental and simulation data, as part of our research efforts to determine the approach best suited for UQ of a bench-scale fluidized bed gasifier. UQ analysis was first performed on the available experimental data. Global sensitivity analysis performed as part of the UQ analysis shows that among the three operating factors, the steam-to-oxygen ratio has the most influence on syngas composition in the bench-scale gasifier experiments. An analysis for forward propagation of uncertainties was performed, and the results show that an increase in the steam-to-oxygen ratio leads to an increase in H2 mole fraction and a decrease in CO mole fraction. These findings are in agreement with the ANOVA analysis performed in the reference experimental study. Another contribution, in addition to the UQ analysis, is an optimization-based approach to identify the next best set of additional experimental samples, should the possibility for additional experiments arise. Hence, the surrogate models constructed as part of the UQ analysis are employed to improve the information gain and make incremental recommendations. In the second step, a series of simulations was carried out with the open-source computational fluid dynamics software MFiX to reproduce the experimental conditions, where three operating factors, i.e., coal flow rate, coal particle diameter, and steam-to-oxygen ratio, were systematically varied to understand their effect on the syngas composition. Bayesian UQ analysis was performed on the numerical results. As part of the Bayesian UQ analysis, a global sensitivity analysis was performed based on the simulation results, which shows
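    The forward-propagation step described above can be miniaturised: fit a cheap surrogate of the response (here an ordinary quadratic polynomial rather than the Bayesian emulator used in the study), then push an uncertain steam-to-oxygen ratio through it. All data below are synthetic, constructed only so that H2 rises with the ratio, consistent with the reported trend.

        import numpy as np

        rng = np.random.default_rng(11)

        # hypothetical training data: steam-to-oxygen ratio -> H2 mole fraction
        ratio = rng.uniform(2.0, 8.0, 40)
        h2 = 0.12 + 0.03 * np.log(ratio) + rng.normal(0, 0.004, 40)

        # quadratic surrogate in place of the study's Bayesian emulator
        coef = np.polyfit(ratio, h2, 2)

        # forward propagation: uncertain operating ratio -> H2 distribution
        ratio_uncertain = rng.normal(5.0, 0.5, 100_000)
        h2_pred = np.polyval(coef, ratio_uncertain)
        print(f"H2 mean = {h2_pred.mean():.4f}, 95% band = "
              f"({np.quantile(h2_pred, 0.025):.4f}, "
              f"{np.quantile(h2_pred, 0.975):.4f})")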

  6. Informal uncertainty analysis (GLUE) of continuous flow simulation in a hybrid sewer system with infiltration inflow – consistency of containment ratios in calibration and validation?

    Directory of Open Access Journals (Sweden)

    A. Breinholt

    2013-10-01

    Full Text Available Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, most often models are calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction limits without statistical coherence and consistency, and for the subjectivity in the choice of a threshold value to distinguish "behavioural" from "non-behavioural" parameter sets. In this paper we examine how well the GLUE methodology performs when the behavioural parameter sets deduced from a calibration period are applied to generate prediction bounds in validation periods. By retaining an increasing number of parameter sets we aim at obtaining consistency between the GLUE-generated 90% prediction limits and the actual containment ratio (CR) in calibration. Due to the large uncertainties related to spatio-temporal rain variability during heavy convective rain events, flow measurement errors, possible model deficiencies, as well as epistemic uncertainties, it was not possible to obtain an overall CR of more than 80%. However, the GLUE-generated prediction limits still proved rather consistent, since the overall CRs obtained in calibration corresponded well with the overall CRs obtained in validation periods for all proportions of retained parameter sets evaluated. When focusing on wet and dry weather periods separately, some inconsistencies were however found between calibration and validation, and we address here some of the reasons why we should not expect the coverage of the prediction limits to be identical in calibration and validation periods in real
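    The GLUE recipe itself (Monte Carlo sampling, an informal likelihood with a behavioural threshold, percentile prediction limits, and a containment ratio check) fits in a short script. The toy recession model, the Nash-Sutcliffe likelihood and the 0.9 threshold below are illustrative choices, not the paper's sewer model.

        import numpy as np

        rng = np.random.default_rng(5)

        def model(theta, t):
            # toy lumped runoff model: exponential recession with gain/decay
            gain, decay = theta
            return gain * np.exp(-decay * t)

        t = np.linspace(0, 10, 50)
        obs = model((2.0, 0.4), t) + rng.normal(0, 0.1, t.size)  # synthetic data

        # 1. Monte Carlo sampling of the parameter space
        thetas = rng.uniform([0.5, 0.05], [4.0, 1.0], size=(20_000, 2))
        sims = np.array([model(th, t) for th in thetas])

        # 2. informal likelihood (Nash-Sutcliffe) and behavioural threshold
        nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()
        behavioural = sims[nse > 0.9]

        # 3. GLUE 90% prediction limits and containment ratio (CR)
        lo, hi = np.percentile(behavioural, [5, 95], axis=0)
        cr = np.mean((obs >= lo) & (obs <= hi))
        print(f"{len(behavioural)} behavioural sets, CR = {cr:.2f}")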

  7. Living with uncertainty: from the precautionary principle to the methodology of ongoing normative assessment

    International Nuclear Information System (INIS)

    Dupuy, J.P.; Grinbaum, A.

    2005-01-01

    The analysis of our epistemic situation regarding singular events, such as abrupt climate change, shows essential limitations in the traditional modes of dealing with uncertainty. Typical cognitive barriers lead to the paralysis of action. What is needed is taking seriously the reality of the future. We argue for the application of the methodology of ongoing normative assessment. We show that it is, paradoxically, a matter of forming a project on the basis of a fixed future which one does not want, and this in a coordinated way at the level of social institutions. Ongoing assessment may be viewed as a prescription to live with uncertainty, in a particular sense of the term, in order for a future catastrophe not to occur. The assessment is necessarily normative in that it must include the anticipation of a retrospective ethical judgment on present choices (notion of moral luck). (authors)

  8. A formal statistical approach to representing uncertainty in rainfall-runoff modelling with focus on residual analysis and probabilistic output evaluation - Distinguishing simulation and prediction

    DEFF Research Database (Denmark)

    Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik

    2012-01-01

    While there seems to be consensus that hydrological model outputs should be accompanied by an uncertainty estimate, the appropriate method for uncertainty estimation is not agreed upon, and a debate is ongoing between advocates of formal statistical methods, who consider errors as stochastic, and GLUE advocates, who consider errors as epistemic, arguing that the basis of formal statistical approaches, which requires the residuals to be stationary and to conform to a statistical distribution, is unrealistic. In this paper we take a formal frequentist approach to parameter estimation and uncertainty ... necessary, but the statistical assumptions were nevertheless not 100% justified. The residual analysis showed that significant autocorrelation was present for all simulation models. We believe users of formal approaches to uncertainty evaluation within hydrology, and within environmental modelling in general

  9. ARIANNE. Analytical uncertainties. Influential simulation factors in the final isotopic inventory; ARIANNE. Incertidumbres analiticas. Factores de simulacion influyentes en el inventario de la isotopia final

    Energy Technology Data Exchange (ETDEWEB)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-07-01

    Analysis of the analytical uncertainties of the process-simulation methodology used to obtain the final isotopic inventory of spent fuel; the ARIANE experiment explores the burnup-simulation part.

  10. Assessing catastrophic and impoverishing effects of health care payments in Uganda

    OpenAIRE

    Kwesiga, Brendan; Zikusooka, Charlotte M; Ataguba, John E

    2015-01-01

    Background: Direct out-of-pocket payments for health care are recognised as limiting access to health care services and also endangering the welfare of households. In Uganda, such payments comprise a large portion of total health financing. This study assesses the catastrophic and impoverishing impact of paying for health care out-of-pocket in Uganda. Methods: Using data from the Uganda National Household Surveys 2009/10, the catastrophic impact of out-of-pocket health care payments is defined ...

  11. Modeling workplace bullying using catastrophe theory.

    Science.gov (United States)

    Escartin, J; Ceja, L; Navarro, J; Zapf, D

    2013-10-01

Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
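
For reference, the generic cusp catastrophe model against which the linear regression was compared takes the standard form below (the general equilibrium surface, not the paper's fitted parameters), with the asymmetry control alpha and bifurcation control beta modeled as functions of predictors such as victimization and psychosocial safety climate.

```latex
% Standard cusp catastrophe: the behavioral variable z sits at an
% equilibrium of the potential V; sudden jumps occur where the
% equilibrium surface folds.
\[
  V(z;\alpha,\beta) = \tfrac{1}{4} z^{4} - \tfrac{1}{2}\beta z^{2} - \alpha z ,
  \qquad
  \frac{\partial V}{\partial z} = z^{3} - \beta z - \alpha = 0 .
\]
```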

  12. Merging Methods to Manage Uncertainty: Combining Simulation Modeling and Scenario Planning to Inform Resource Management Under Climate Change

    Science.gov (United States)

    Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.

    2017-12-01

    Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is `actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the

  13. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    Science.gov (United States)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments to discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The experience of the authors has been that, by making an analogy to crime scene investigation when looking at validation experiments, valuable insights may be gained. One examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator to help remove possible implicit biases and increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  14. SIMULATION OF CARS ACCUMULATION PROCESSES FOR SOLVING TASKS OF OPERATIONAL PLANNING IN CONDITIONS OF INITIAL INFORMATION UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    О. A. Tereshchenko

    2017-06-01

Purpose. The article develops the methodological basis for simulating car-accumulation processes when solving operational planning problems under conditions of initial-information uncertainty, in order to assess the sustainability of an adopted planning scenario and to calculate the associated technological risks. Methodology. The solution of the problem under investigation is based on general scientific approaches, the apparatus of probability theory and the theory of fuzzy sets. To achieve this purpose, the factors influencing the entropy of operational plans are systematized. It is established that, when planning the operational work of railway stations, sections and nodes, the most significant factors causing uncertainty in the initial information are: (a) conditions external to the railway ground in question, expressed as uncertainty in the timing of car arrivals; (b) external, hard-to-identify goals of other participants in the logistics chain (primarily customers), expressed as uncertainty in the completion time of operations with the freight cars. It is suggested that these factors be taken into account in automated planning through statistical analysis: the collection and study of the remaining-time residuals (prediction errors). As a result, analytical dependencies are proposed for rational representation of the probability density functions of the time-residual distribution in the form of point, piecewise-defined and continuous analytic models. The developed models of car accumulation, whose application depends on the identified states of the predicted incoming car flow to the accumulation system, are then presented. The last proposed model is a general case of models of accumulation processes with an arbitrary level of reliability of the initial information for any structure of the incoming flow of cars. In conclusion, a technique for estimating the results of
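
As one concrete reading of the "continuous analytic model" representation of prediction errors, the sketch below (hypothetical data and distribution choice, not the paper's) fits a parametric density to arrival-time residuals with SciPy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical prediction errors (minutes) for car arrival times.
errors = rng.gumbel(loc=5.0, scale=12.0, size=1000)

# Continuous analytic model: fit a Gumbel density to the residuals.
loc, scale = stats.gumbel_r.fit(errors)
p_late = stats.gumbel_r.sf(30.0, loc, scale)   # P(error > 30 min)
print(f"fitted loc={loc:.1f}, scale={scale:.1f}, P(error>30min)={p_late:.3f}")
```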

  15. Catastrophic health expenditure and its determinants in Kenya slum communities.

    Science.gov (United States)

    Buigut, Steven; Ettarh, Remare; Amendah, Djesika D

    2015-05-14

In Kenya, where 60 to 80% of the urban residents live in informal settlements (frequently referred to as slums), out-of-pocket (OOP) payments account for more than a third of national health expenditures. However, little is known about the extent to which these OOP payments are associated with personal or household financial catastrophe in the slums. This paper seeks to examine the incidence and determinants of catastrophic health expenditure among urban slum communities in Kenya. We use a unique dataset on informal settlement residents in Kenya and various approaches that relate households' OOP payments for healthcare to total expenditures adjusted for subsistence, or to income. We classified households whose OOP was in excess of a predefined threshold as facing catastrophic health expenditures (CHE), and identified the determinants of CHE using multivariate logistic regression analysis. The results indicate that the proportion of households facing CHE varies widely between 1.52% and 28.38% depending on the method and the threshold used. A core set of variables were found to be key determinants of CHE. The number of working adults in a household and membership in a social safety net appear to reduce the risk of catastrophic expenditure. Conversely, seeking care in a public or private hospital increases the risk of CHE. This study suggests that a substantial proportion of residents of informal settlements in Kenya face CHE and would likely forgo health care they need but cannot afford. Mechanisms that pool risk and cost (insurance) are needed to protect slum residents from CHE and improve equity in health care access and payment.
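
A minimal sketch of the CHE classification step, using the 10%-of-total / 40%-of-non-food thresholds common in this literature (the study itself varies the method and threshold); the household figures and threshold defaults here are illustrative only.

```python
def faces_che(oop: float, total_exp: float, food_exp: float,
              total_threshold: float = 0.10,
              nonfood_threshold: float = 0.40) -> bool:
    """True if OOP exceeds 10% of total or 40% of non-food expenditure."""
    return (oop / total_exp > total_threshold or
            oop / (total_exp - food_exp) > nonfood_threshold)

# Example household: OOP 90, total expenditure 1000, food 800.
print(faces_che(90.0, 1000.0, 800.0))   # True: 90/200 = 45% of non-food
```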

  16. Errors and uncertainties introduced by a regional climate model in climate impact assessments: example of crop yield simulations in West Africa

    International Nuclear Information System (INIS)

    Ramarohetra, Johanna; Pohl, Benjamin; Sultan, Benjamin

    2015-01-01

The challenge of estimating the potential impacts of climate change has led to an increasing use of dynamical downscaling to produce fine spatial-scale climate projections for impact assessments. In this work, we analyze if, and to what extent, the bias in the simulated crop yield can be reduced by using the Weather Research and Forecasting (WRF) regional climate model to downscale ERA-Interim (European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis) rainfall and radiation data. We then evaluate the uncertainties resulting from both the choice of the physical parameterizations of the WRF model and its internal variability. Impact assessments were performed at two sites in Sub-Saharan Africa, using two crop models to simulate Niger pearl millet and Benin maize yields. We find that using the WRF model to downscale ERA-Interim climate data generally reduces the bias in the simulated crop yield, yet this reduction in bias strongly depends on the choices made in the model setup. Among the physical parameterizations considered, we show that the choice of the land surface model (LSM) is of primary importance. When there is no coupling with an LSM the simulated precipitation, and consequently the simulated yield, is null; when the LSM is too simplistic they are very low. Coupling with an LSM is therefore necessary. The convective scheme is the second most influential scheme for yield simulation, followed by the shortwave radiation scheme. The uncertainties related to the internal variability of the WRF model are also significant and reach up to 30% of the simulated yields. These results suggest that regional models need to be used more carefully in order to improve the reliability of impact assessments. (letter)

  17. Uncertainty propagation in probabilistic risk assessment: A comparative study

    International Nuclear Information System (INIS)

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

Three uncertainty propagation techniques generally used in probabilistic risk assessment, namely the method of moments, discrete probability distribution (DPD), and Monte Carlo simulation, are compared, and conclusions are drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate method is to propagate uncertainty in discrete form, either by the DPD method without sampling or by Monte Carlo. (orig.)
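
A minimal sketch of the Monte Carlo option for a toy two-event fault tree (hypothetical numbers, not the paper's system), illustrating the large-uncertainty regime where percentiles, rather than the first moments alone, are the safe summary.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
# Lognormal basic-event unavailabilities (median 1e-3, wide spread).
q1 = rng.lognormal(mean=np.log(1e-3), sigma=1.1, size=n)
q2 = rng.lognormal(mean=np.log(1e-3), sigma=1.1, size=n)
top = q1 * q2   # AND gate: top event requires both trains to fail

print(f"mean top-event unavailability = {top.mean():.2e}")
print(f"5th/95th percentiles = {np.percentile(top, [5, 95])}")
```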

  18. Catastrophic Economic Consequences of Healthcare Payments: Effects on Poverty Estimates in Egypt, Jordan, and Palestine

    Directory of Open Access Journals (Sweden)

    Ahmed Shoukry Rashad

    2015-11-01

Healthcare payments can drive households with no health insurance coverage into financial catastrophe, which might lead them to cut spending on necessities, sell assets, or use credit. In extreme cases, healthcare payments can have devastating consequences for household economic status that push households into extreme poverty. Using nationally representative surveys from three Arab countries, namely Egypt, Jordan, and Palestine, this paper examines the incidence, intensity and distribution of catastrophic health payments, and assesses the poverty impact of out-of-pocket health payments (OOP). The OOP for healthcare were considered catastrophic if they exceeded 10% of a household's total expenditure or 40% of non-food expenditure. The poverty impact was evaluated using poverty head counts and poverty gaps before and after OOP. Results show that OOP payments severely worsen household living standards in Egypt, pushing more than one-fifth of the population into financial catastrophe and 3% into extreme poverty in 2011. In Jordan and Palestine, however, the disruptive impact of OOP remains modest over time. In all three countries, catastrophic health payments are concentrated among better-off households. Poverty alleviation policies should help reduce the reliance on OOP to finance healthcare. Moving toward universal health coverage could also be a promising option to protect households from the catastrophic economic consequences of health care payments.

  19. Uncertainty Quantification in Alchemical Free Energy Methods.

    Science.gov (United States)

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-05-02

Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
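
A minimal sketch of the ensemble idea, with hypothetical replica values: the uncertainty of the prediction is taken from the spread across independent simulations rather than from a single trajectory.

```python
import numpy as np

# Hypothetical binding free energies from 5 independent MD replicas (kcal/mol).
replica_dG = np.array([-7.9, -8.4, -8.1, -7.6, -8.3])

mean = replica_dG.mean()
sem = replica_dG.std(ddof=1) / np.sqrt(len(replica_dG))  # standard error
print(f"binding free energy = {mean:.2f} +/- {sem:.2f} kcal/mol")
```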

  20. Chernobyl catastrophe: Information for people living in the contaminated areas

    International Nuclear Information System (INIS)

    Borisevich, Nikolaj

    2001-01-01

The radioactive blow-outs after the Chernobyl Nuclear Power Plant catastrophe reached many states. The largest share (according to experts' estimations, 70%) fell on the territory of Belarus. The estimation of the radioecological, medico-biological, economic and social consequences of the Chernobyl catastrophe has shown that unimaginable damage was inflicted on Belarus, whose territory became a zone of ecological calamity. More than 14 years have passed since the Chernobyl NPP accident, but some of the problems caused by the catastrophe have not been solved. This is bound up, first of all, with the high collective dose absorbed by the population, with difficulties in forecasting and prophylaxis of late radiological effects, and with the ecological and economic crisis. The consequences of the disaster greatly affect all aspects of the vital activities of the affected regions and of the state as a whole. Destructive tendencies have been revealed in all spheres of the life activity of people who experienced radiation effects. The processes of social adaptation and socio-psychological support of the population inhabiting the contaminated territory, and of those resettled as well, require considerable optimisation. The negative factors of the Chernobyl catastrophe that are significant for human health can be divided into two groups: radiation-based, directly related to the influence of ionising radiation, and non-radiation-based, related to changes in habitat and prolonged psychological stress. The specific peculiarities of the psychogenic disorders caused by the catastrophe are determined by the following: insufficient knowledge of radiation effects; constant apprehension for the health and well-being of oneself and one's family, especially children; an unexpected change of life stereotype (forced resettlement, the break with former life, change of the place and character of work, etc.); the necessity of constantly keeping precautionary measures and prophylactic

  1. Catastrophic antiphospholipid syndrome: task force report summary.

    Science.gov (United States)

    Cervera, R; Rodríguez-Pintó, I

    2014-10-01

The Task Force on Catastrophic Antiphospholipid Syndrome (CAPS) aimed to assess the current knowledge on pathogenesis, clinical and laboratory features, diagnosis and classification, precipitating factors and treatment of CAPS. This article summarizes the main aspects of its final report.

  2. Uncertainty estimation and ensemble forecasting with a chemistry-transport model - application to the numerical simulation of air quality; Estimation de l'incertitude et prevision d'ensemble avec un modele de chimie transport - Application a la simulation numerique de la qualite de l'air

    Energy Technology Data Exchange (ETDEWEB)

    Mallet, V.

    2005-12-15

The aim of this work is the evaluation of the quality of a chemistry-transport model, not by a classical comparison with observations, but by the estimation of its uncertainties due to the input data, to the model formulation and to the numerical approximations. These three sources of uncertainty are studied with Monte Carlo simulations, with multi-model simulations and with comparisons between numerical schemes, respectively. A high uncertainty is shown for ozone concentrations. To overcome the uncertainty-related limitations, one strategy is ensemble forecasting. By combining several models (up to 48) on the basis of past observations, forecasts can be significantly improved. This work was also the occasion of developing an innovative modeling system, named Polyphemus. (J.S.)
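
A minimal sketch (synthetic data, not Polyphemus code) of combining member forecasts by least-squares weights fitted on past observations, the mechanism by which a multi-model ensemble can beat its best single member.

```python
import numpy as np

rng = np.random.default_rng(3)
obs = rng.normal(60.0, 15.0, size=200)   # past ozone observations (ppb)
# Three hypothetical models: unbiased here, with different error levels.
models = obs[None, :] + rng.normal(0, [[5.0], [12.0], [8.0]], size=(3, 200))

# Weights minimizing ||w @ models - obs||^2 (unconstrained least squares).
w, *_ = np.linalg.lstsq(models.T, obs, rcond=None)
combined = w @ models

rmse = lambda x: float(np.sqrt(((x - obs) ** 2).mean()))
print("weights:", np.round(w, 2))
print("RMSE best single model:", round(min(rmse(m) for m in models), 2))
print("RMSE combined forecast:", round(rmse(combined), 2))
```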

  3. A Study on the uncertainty and sensitivity in numerical simulation of parametric roll

    DEFF Research Database (Denmark)

    Choi, Ju-hyuck; Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2016-01-01

Uncertainties related to the numerical modelling of parametric roll have been investigated using a 6-DOF model with nonlinear damping and roll restoring forces. First, the uncertainty on the damping coefficients and its effect on the roll response is evaluated. Secondly, the uncertainty due to the “effect

  4. FREQUENCY CATASTROPHE AND CO-EXISTING ATTRACTORS IN A CELL Ca2+ NONLINEAR OSCILLATION MODEL WITH TIME DELAY*

    Institute of Scientific and Technical Information of China (English)

    应阳君; 黄祖洽

    2001-01-01

Frequency catastrophe is found in a cell Ca2+ nonlinear oscillation model with time delay. The relation of the frequency transition to the time delay is studied by numerical simulations and theoretical analysis. There is a range of parameters in which two kinds of attractors with great frequency differences co-exist in the system. As the parameters change, a critical phenomenon occurs and the oscillation frequency changes greatly. This mechanism helps to deepen the understanding of the complex dynamics of delay systems, and may be of some significance in cell signalling.

  5. Vaginismus : Heightened Harm Avoidance and Pain Catastrophizing Cognitions

    NARCIS (Netherlands)

    Borg, Charmaine; Peters, Madelon L.; Schultz, Willibrord Weijmar; de Jong, Peter J.

    Introduction. Catastrophic appraisal of experienced pain may promote hypervigilance and intense pain, while the personality trait of harm avoidance (HA) might prevent the occurrence of correcting such experiences. Women inflicted with vaginismus may enter a self-perpetuating downward spiral of

  6. The Climate Catastrophe as Blockbuster

    DEFF Research Database (Denmark)

    Eskjær, Mikkel Fugl

    2013-01-01

Modern disaster films constitute a specific cultural form that speaks to the anxieties of the “risk society.” This essay looks at how risks like climate change are presented and constructed in popular culture. It regards blockbuster representations as part of a wider discourse of “catastrophism” within the realm of public climate change communication. For that reason, the essay centers on the interplay between news media and entertainment. It argues that blockbuster disaster films represent an inversion of traditional risk and disaster news.

  7. Safety, danger and catastrophe inevitability in operation of safety-critical software algorithms: a possible new look at software safety analysis

    International Nuclear Information System (INIS)

    Povyakalo, A.A.

    2000-01-01

The paper provides basic definitions and describes the basic procedure of the Formal Qualitative Safety Analysis (FQSA) of critical software algorithms. The procedure is described by C-based pseudo-code. It uses the notion of the weakest precondition and a representation of a given critical algorithm by a Gurevich Abstract State Machine (GASM). For a given GASM and a given Catastrophe Condition, the procedure yields a Catastrophe Inevitability Condition (every sequence of algorithm steps leads to a catastrophe sooner or later), a Danger Condition (the next step may lead to a catastrophe or make a catastrophe inevitable, but a catastrophe may still be prevented), and a Safety Condition (the next step cannot lead to a catastrophe or make a catastrophe inevitable). The use of the proposed procedure is illustrated by a simple test example of an algorithm. The FQSA provides a logical basis for PSA of critical algorithms. (author)

  8. Uncertainty in Twenty-First-Century CMIP5 Sea Level Projections

    Science.gov (United States)

    Little, Christopher M.; Horton, Radley M.; Kopp, Robert E.; Oppenheimer, Michael; Yip, Stan

    2015-01-01

    The representative concentration pathway (RCP) simulations included in phase 5 of the Coupled Model Intercomparison Project (CMIP5) quantify the response of the climate system to different natural and anthropogenic forcing scenarios. These simulations differ because of 1) forcing, 2) the representation of the climate system in atmosphere-ocean general circulation models (AOGCMs), and 3) the presence of unforced (internal) variability. Global and local sea level rise projections derived from these simulations, and the emergence of distinct responses to the four RCPs depend on the relative magnitude of these sources of uncertainty at different lead times. Here, the uncertainty in CMIP5 projections of sea level is partitioned at global and local scales, using a 164-member ensemble of twenty-first-century simulations. Local projections at New York City (NYSL) are highlighted. The partition between model uncertainty, scenario uncertainty, and internal variability in global mean sea level (GMSL) is qualitatively consistent with that of surface air temperature, with model uncertainty dominant for most of the twenty-first century. Locally, model uncertainty is dominant through 2100, with maxima in the North Atlantic and the Arctic Ocean. The model spread is driven largely by 4 of the 16 AOGCMs in the ensemble; these models exhibit outlying behavior in all RCPs and in both GMSL and NYSL. The magnitude of internal variability varies widely by location and across models, leading to differences of several decades in the local emergence of RCPs. The AOGCM spread, and its sensitivity to model exclusion and/or weighting, has important implications for sea level assessments, especially if a local risk management approach is utilized.

  9. Catastrophic Outcomes in Free Tissue Transfer: A Six-Year Review of the NSQIP Database

    Directory of Open Access Journals (Sweden)

    David W. Grant

    2014-01-01

Background. No studies report robust data on the national incidence and risk factors associated with catastrophic medical outcomes following free tissue transfer. Methods. The American College of Surgeons (ACS) multicenter, prospective National Surgical Quality Improvement Program (NSQIP) database was used to identify patients who underwent free tissue transfer between 2006 and 2011. Multivariable logistic regression was used for statistical analysis. Results. Over the 6-year study period 2,349 patients in the NSQIP database underwent a free tissue transfer procedure. One hundred and twenty-two patients had at least one catastrophic medical outcome (5.2%). These 122 patients had 151 catastrophic medical outcomes, including 93 postoperative respiratory failure events (4.0%), 14 pulmonary emboli (0.6%), 13 septic shock events (0.5%), 12 myocardial infarctions (0.5%), 6 cardiac arrests (0.3%), 4 strokes (0.2%), 1 coma (0.0%), and 8 deaths (0.3%). Total length of hospital stay was on average 14.7 days longer for patients who suffered a catastrophic medical complication (P<0.001). Independent risk factors were identified. Conclusions. Free tissue transfer is a proven and safe technique. Catastrophic medical complications were infrequent but added significantly to length of hospital stay and patient morbidity.

  10. Flood Simulations and Uncertainty Analysis for the Pearl River Basin Using the Coupled Land Surface and Hydrological Model System

    Directory of Open Access Journals (Sweden)

    Yongnan Zhu

    2017-06-01

The performances of hydrological simulations for the Pearl River Basin in China were analysed using the Coupled Land Surface and Hydrological Model System (CLHMS). Three daily precipitation datasets, namely East Asia (EA), the high-resolution gauge-satellite-merged China Merged Precipitation Analysis (CMPA-Daily), and the Asian Precipitation Highly-Resolved Observational Data Integration Towards Evaluation (APHRODITE), were used to drive the CLHMS model to simulate daily hydrological processes from 1998 to 2006. The results indicate that the precipitation data was the most important source of uncertainty in the hydrological simulation. The simulated streamflow driven by the CMPA-Daily agreed well with observations, with a Pearson correlation coefficient (PMC) greater than 0.70 and an index of agreement (IOA) similarity coefficient greater than 0.82 at Liuzhou, Shijiao, and Wuzhou Stations. Comparison of the Nash-Sutcliffe efficiency coefficient (NSE) shows that the peak-flow simulation ability of CLHMS driven with the CMPA-Daily rainfall is superior to that with the EA and APHRODITE datasets. The simulation results for the high-flow periods in 1998 and 2005 indicate that the CLHMS is promising for future application in flood simulation and prediction.
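
For reference, a minimal sketch of the skill metrics named above (NSE and the Pearson correlation), applied to placeholder arrays rather than the study's streamflow records.

```python
import numpy as np

def nse(sim: np.ndarray, obs: np.ndarray) -> float:
    """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the mean."""
    return float(1.0 - np.sum((sim - obs) ** 2) /
                 np.sum((obs - obs.mean()) ** 2))

rng = np.random.default_rng(7)
obs = np.abs(rng.normal(1000.0, 300.0, size=365))   # m^3/s, one year
sim = obs + rng.normal(0.0, 150.0, size=365)        # simulated flow

print(f"NSE = {nse(sim, obs):.2f}")
print(f"Pearson r = {np.corrcoef(sim, obs)[0, 1]:.2f}")
```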

  11. Catastrophe Bonds. From Structure to Strategy – A Cluster Analysis at European Level

    Directory of Open Access Journals (Sweden)

    Laura-Gabriela CONSTANTIN

    2014-12-01

As a core activity and discipline of corporate management and corporate governance, risk management is, especially nowadays, a central part of pursuing sustainable development desiderata, both from the perspective of the firm and of society as a whole. Considering the negative impact natural catastrophes have on the competitiveness of companies and countries, the development of sustainable financial products that contribute to transferring risk and allocating capital in case of disasters is a continual preoccupation, especially for the (re)insurance industry, and the study of catastrophe bonds (insurance-linked securities) is of interest in the specialized literature. In this context, the scope of the present research is to expand the empirical studies within this field by examining the link between the structure of catastrophe bonds and the risk management approach employed when accessing the capital markets through these transactions. The methodology entailed clustering a selection of transactions developed by European cedents based on the size of each issue, and correlating the results with an innovative score developed to encompass several important structural components of catastrophe bonds. The findings show that the general structural elements of the financial transactions closely reflect the corporate approach to innovative risk intermediation instruments for the examined catastrophe bond deals. The outcomes also emphasize, as expected, that companies with a stronger presence on this market seem to have a more sophisticated risk management approach.

  12. Remembering pain after surgery: a longitudinal examination of the role of pain catastrophizing in children's and parents' recall.

    Science.gov (United States)

    Noel, Melanie; Rabbitts, Jennifer A; Tai, Gabrielle G; Palermo, Tonya M

    2015-05-01

    Children's memories for pain play a powerful role in their pain experiences. Parents' memories may also influence children's pain experiences, by influencing parent-child interactions about pain and children's cognitions and behaviors. Pain catastrophizing of children and parents has been implicated as a factor underlying memory biases; however, this has not been empirically examined. The current longitudinal study is the first to examine the role of pain catastrophizing of children and parents in the development of their pain memories after surgery. Participants were 49 youth (32 girls) aged 10 to 18 years undergoing major surgery and their parents. One week before surgery, children and parents completed measures of pain catastrophizing. Two weeks after surgery (the acute recovery period), children and parents completed measures of child pain intensity and affect. Two to 4 months after surgery, children's and parents' memories of child pain intensity and affect were elicited. Hierarchical linear regression models revealed that over and above covariates, parent catastrophizing about their child's pain (magnification, rumination) accounted for a significant portion of variance in children's affective and parents' sensory pain memories. Although parent catastrophizing had a direct effect on pain memories, mediation analyses revealed that child catastrophizing (helplessness) indirectly influenced children's and parents' pain memories through the child's postoperative pain experience. Findings highlight that aspects of catastrophic thinking about child pain before surgery are linked to distressing pain memories several months later. Although both child and parent catastrophizing influence pain memory development, parent catastrophizing is most influential to both children's and parents' evolving cognitions about child pain.

  13. Reduction methods and uncertainty analysis: application to a Chemistry-Transport Model for modeling and simulation of impacts

    International Nuclear Information System (INIS)

    Boutahar, Jaouad

    2004-01-01

In an integrated impact assessment, one has to test several scenarios of the model inputs and/or to identify the effects of model input uncertainties on the model outputs. In both cases, a large number of simulations of the model is necessary. That, of course, is not feasible with a comprehensive Chemistry-Transport Model, due to the need for huge CPU times. Two approaches may be used in order to circumvent these difficulties. The first approach consists in reducing the computational cost of the original model by building a reduced model. Two reduction techniques are used: the first method, POD, is related to the statistical behaviour of the system and is based on a proper orthogonal decomposition of the solutions; the second method is an efficient representation of the input/output behaviour through look-up tables, describing the model output as an expansion of finite hierarchical correlated functions of the input variables. The second approach is based on reducing the number of model runs required by standard Monte Carlo methods. It characterizes the probabilistic response of the uncertain model output as an expansion of orthogonal polynomials in terms of the model input uncertainties. The classical Monte Carlo simulation can then easily be used to compute the probability density of the uncertain output. Another key point in an integrated impact assessment is to develop strategies for the reduction of emissions by computing Source/Receptor matrices for several years of simulations. We propose here an efficient method to calculate these matrices by using the adjoint model, and in particular by defining the 'representative chemical day'. All of these methods are applied to POLAIR3D, a Chemistry-Transport model developed in this thesis. (author)
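
A minimal sketch, on a synthetic snapshot matrix, of the POD step mentioned first: the reduced basis is the set of leading left singular vectors of the stored model solutions.

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical snapshots: 2000 grid cells x 80 stored time steps,
# dominated by one spatial mode plus small-scale noise.
snapshots = (np.outer(rng.normal(size=2000), rng.normal(size=80)) +
             0.1 * rng.normal(size=(2000, 80)))

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s ** 2) / np.sum(s ** 2)
k = int(np.searchsorted(energy, 0.99)) + 1   # modes capturing 99% of energy
basis = U[:, :k]                             # reduced POD basis
print(f"{k} POD mode(s) retain 99% of the snapshot energy")
```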

  14. Timber price dynamics following a natural catastrophe

    Science.gov (United States)

    Jeffrey P. Prestemon; Thomas P. Holmes

    2000-01-01

    Catastrophic shocks to existing stocks of a renewable resource can cause long-run price shifts. With timber, these long-run price shifts may be accompanied by a short-run price drop due to salvage. Hurricane Hugo damaged 20 percent of southern pine timber in the South Carolina Coastal Plain in 1989. To estimate the...

  15. The 1985 Nevado del Ruiz volcano catastrophe: anatomy and retrospection

    Science.gov (United States)

    Voight, Barry

    1990-12-01

    This paper seeks to analyze in an objective way the circumstances and events that contributed to the 1985 Nevado del Ruiz catastrophe, in order to provide useful guidelines for future emergencies. The paper is organized into two principal parts. In the first part, an Anatomy of the catastrophe is developed as a step-by-step chronicle of events and actions taken by individuals and organizations during the period November 1984 through November 1985. This chronicle provides the essential background for the crucial events of November 13. This year-long period is broken down further to emphasize important chapters: the gradual awareness of the awakening of the volcano; a long period of institutional skepticism reflecting an absence of credibility; the closure of the credibility gap with the September 11 phreatic eruption, followed by an intensive effort to gird for the worst; and a detailed account of the day of reckoning. The second part of the paper, Retrospection, examines the numerous complicated factors that influenced the catastrophic outcome, and attempts to cull a few "lessons from Armero" in order to avoid similar occurrences in the future. In a nutshell, the government on the whole acted responsibly but was not willing to bear the economic or political costs of early evacuation or a false alarm. Science accurately foresaw the hazards but was insufficiently precise to render reliable warning of the crucial event at the last possible minute. Catastrophe was therefore a calculated risk, and this combination - the limitations of prediction/detection, the refusal to bear a false alarm and the lack of will to act on the uncertain information available - provided its immediate and most obvious causes. But because the crucial event occurred just two days before the Armero emergency management plan was to be critically examined and improved, the numerous circumstances which delayed progress of emergency management over the previous year also may be said to have

  16. Evaluation of simulated corn yields and associated uncertainty in different climate zones of China using Daycent Model

    Science.gov (United States)

    Fu, A.; Xue, Y.

    2017-12-01

Corn is one of the most important agricultural products in China. Research on the simulation of corn yields, and on the impacts of climate change and agricultural management practices on corn yields, is important for maintaining stable corn production. After climatic data (daily temperature, precipitation, solar radiation, relative humidity, and wind speed from 1948 to 2010), soil properties, observed corn yields, and farmland management information were collected, corn yields grown in a humid and hot environment (Sichuan province) and in a cold and dry environment (Hebei province) in China over the past 63 years were simulated by Daycent, and the results were evaluated against published yield records. The relationships between regional climate change, global warming and corn yield were analyzed, and the uncertainties of the simulation arising from agricultural management practices were reported by varying fertilization levels, land fertilizer maintenance and tillage methods. The results showed that: (1) the Daycent model is capable of simulating corn yields under the different climatic backgrounds in China; (2) when studying the relationship between regional climate change and corn yields, observed and simulated corn yields were found to increase along with total regional climate change; (3) when studying the relationship between global warming and corn yields, it was discovered that corn yields re-simulated after removing the global warming trend from the original temperature data were lower than before.

  17. Effects of a Pain Catastrophizing Induction on Sensory Testing in Women with Chronic Low Back Pain: A Pilot Study

    OpenAIRE

    Taub, Chloe J.; Sturgeon, John A.; Johnson, Kevin A.; Mackey, Sean C.; Darnall, Beth D.

    2017-01-01

    Pain catastrophizing, a pattern of negative cognitive-emotional responses to actual or anticipated pain, maintains chronic pain and undermines response to treatments. Currently, precisely how pain catastrophizing influences pain processing is not well understood. In experimental settings, pain catastrophizing has been associated with amplified pain processing. This study sought to clarify pain processing mechanisms via experimental induction of pain catastrophizing. Forty women with chronic l...

  18. Death, Catastrophe, and the Significance of Tragedy

    Directory of Open Access Journals (Sweden)

    Jennifer Ballengee

    2014-05-01

This NANO note will examine the tension between representation, memorial, and the catastrophe of death that emerges in the space of tragedy, as the problem arises in two quite different works: Oedipus at Colonus, a fairly typical fifth-century Greek tragedy, and Falling Man, Don DeLillo’s novel that, in its attempt to address the events of 9/11, reflects in form and subject matter many of Aristotle’s terms of tragic representation. It is not the intent of this note to engage with the recent proliferation of work in “performance theory.” Rather than being concerned with an imagined exchange between audience and actor, this study examines how the supplementary relationship of gesture and speech in tragedy disrupts the public/private distinction, and how this articulation effects and enables the public memorialization of death. Thus, this paper will consider the representation of death as an event whose catastrophic, and somewhat mysterious, collision of the public and the private lends it its tragic significance.

  19. Catastrophic failure in complex socio-technical systems

    International Nuclear Information System (INIS)

    Weir, D.

    2004-01-01

This paper reviews the sequences leading to catastrophic failures in complex socio-technical systems. It traces some elements of its analytic framework to that proposed by Beer in Decision and Control, first published in 1966, and argues that these ideas are centrally relevant to a topic on which research interest has developed subsequently: the study of crises, catastrophes and disasters in complex socio-technical systems in high-technology sectors. But while the system perspective is central, it is not by itself entirely adequate. The problems involved cannot be analysed simply in terms of system parameters like variety, redundancy and complexity. Much empirical research supports the view that these systems typically operate in degraded mode. The degradations may be initiated primarily within the social components of the socio-technical system. Such variables as hierarchical position and actors' motivations and intentions are relevant to explaining the ways in which communication systems typically operate to filter out messages from lower participants and to ignore the 'soft signals' issuing from small-scale and intermittent malfunctions. (author)

  20. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of the activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting uncertainty propagation in stand-alone neutronics calculations, while Phases II and III focus on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in Phase I calculations, the specifications for Phase II, and the upcoming challenges in defining the Phase III exercises. The challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  1. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

Reusable launch vehicles (RLVs) have the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of the uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of an RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainties during model building are then analyzed and summarized, and the model uncertainties are expressed according to an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how much influence the uncertainty factors have on the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like an RLV).
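
A minimal sketch of the additive-uncertainty bound described above, with placeholder matrices standing in for the nominal and real linearized dynamics.

```python
import numpy as np

rng = np.random.default_rng(11)
# Hypothetical linearized dynamics: nominal model and perturbed "real" model.
G_nominal = rng.normal(size=(4, 4))
G_real = G_nominal + 0.05 * rng.normal(size=(4, 4))

# Additive uncertainty Delta = G_real - G_nominal; its maximum singular
# value (spectral norm) bounds the departure from the nominal system.
delta = G_real - G_nominal
sigma_max = np.linalg.svd(delta, compute_uv=False)[0]
print(f"additive uncertainty bound: sigma_max(Delta) = {sigma_max:.3f}")
```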

  2. Reverse Catastrophe

    Directory of Open Access Journals (Sweden)

    Przemysław Czapliński

    2015-01-01

The principal notion of the article, a “backward catastrophe,” stands for a catastrophe which occurs unseen until it becomes recognized, and which broadens its destructive activity until it has been recognized. In the article this concept is applied to the Shoah. The main thesis is that the recognition of the actual influence of the Holocaust began in Polish culture in the mid-1980s (largely with the film by Claude Lanzmann, Shoah, and the essay by Jan Błoński, Biedni Polacy patrzą na getto [“The Poor Poles Look at the Ghetto”]), that is, when the question “What happened to the Jews?” assumed the form “Did the things that happened to the Jews also happen to the Poles?”. Cognitive and ethical reorientation leads to the revealing of the hidden consequences of the Holocaust, reaching as far as the present day and undermining the foundations of collective identity. In order to understand this situation (and adopt potentially preventive actions) Polish society should be recognized as a postcatastrophic one.

  3. Sources of uncertainty in flood inundation maps

    Science.gov (United States)

    Bales, J.D.; Wagner, C.R.

    2009-01-01

Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part because not all of the sources of uncertainty are recognized and because the data needed to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include the hydrologic data used for hydraulic model development and validation, the topographic data, and the hydraulic model itself. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than at higher flows, because more time typically is required to inundate areas at high flows than at low flows. Difficulties in establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainty in the hydraulic modelling. As a result, uncertainty in flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  4. Nuclear war and other catastrophes. Civil and catastrophe protection in the Federal republic of Germany and the United Kingdom after 1945; Atomkrieg und andere Katastrophen. Zivil- und Katastrophenschutz in der Bundesrepublik und Grossbritannien nach 1945

    Energy Technology Data Exchange (ETDEWEB)

    Diebel, Martin [Zentrum fuer Zeithistorische Forschung, Potsdam (Germany)

    2017-07-01

The book Civil and Catastrophe Protection in the Federal Republic of Germany and the United Kingdom after 1945 discusses the following issues: aerial defense and the atomic bomb (1945 - 1968); crises and catastrophes in the shadow of the bomb (1962 - 1978); civil defense and the comeback of (nuclear) war (1976 - 1979); civil defense and the second ''Cold War'' (1979 - 1986); Chernobyl and the end of the Cold War (1979 - 1990); and war, catastrophe and safety in the 20th century - a conclusion.

  5. Multi objective multi refinery optimization with environmental and catastrophic failure effects objectives

    Science.gov (United States)

    Khogeer, Ahmed Sirag

    2005-11-01

Petroleum refining is a capital-intensive business. With stringent environmental regulations on the processing industry, declining refining margins, political instability, and an increased risk of war and terrorist attacks in which refineries and fuel transportation grids may be targeted, higher pressures are exerted on refiners to optimize performance and find the best combination of feed and processes to produce salable products that meet stricter product specifications, while at the same time meeting refinery supply commitments and, of course, making a profit. This is done through multi objective optimization. For corporate refining companies and at the national level, intra-refinery and inter-refinery optimization is the second step in optimizing the operation of the whole refining chain as a single system. Most refinery-wide optimization methods do not cover multiple objectives such as minimizing environmental impact, avoiding catastrophic failures, or enhancing product spec upgrade effects. This work starts by carrying out a refinery-wide, single objective optimization, then moves to multi objective-single refinery optimization. The last step is multi objective-multi refinery optimization, which analyses the effects of economic, environmental, product spec, strategic, and catastrophic failure objectives. Simulation runs were carried out using both MATLAB and ASPEN PIMS, utilizing nonlinear techniques to solve the optimization problem. The results addressed the need to debottleneck some refineries or transportation media in order to meet the demand for essential products under partial or total failure scenarios. They also addressed how importing some high spec products can help recover some of the losses and what is needed in order to accomplish this. In addition, the results showed nonlinear relations among local and global objectives for some refineries. The results demonstrate that refineries can have a local multi objective optimum that does not

  6. A Bayesian method to mine spatial data sets to evaluate the vulnerability of human beings to catastrophic risk.

    Science.gov (United States)

    Li, Lianfa; Wang, Jinfeng; Leung, Hareton; Zhao, Sisi

    2012-06-01

Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can be only roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge- and data-based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider the model's uncertainty and use Bayesian model averaging and Occam's Window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets for evaluating vulnerability.
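
A minimal sketch of the model-averaging step, using one common approximation (BIC-based weights with an Occam's-window cut) that is not necessarily the authors' exact scheme; the BIC values are hypothetical.

```python
import numpy as np

bic = np.array([1012.4, 1013.1, 1019.8, 1035.0])   # candidate model scores
delta = bic - bic.min()
keep = delta < 2 * np.log(20)   # Occam's window: drop models worse than 1:20

# Posterior model weights proportional to exp(-BIC/2) within the window.
w = np.where(keep, np.exp(-0.5 * delta), 0.0)
w /= w.sum()
print("posterior model weights:", np.round(w, 3))
```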

  7. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations are quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR), experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology to a small break (SB) LOCA in a PWR of B and W design using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  8. The Chernobyl catastrophe consequences in the Republic of Belarus. National report

    International Nuclear Information System (INIS)

    Konoplya, E.F.

    1996-03-01

The estimation of the radioecological, medico-biological, economic and social consequences of the Chernobyl catastrophe has shown that unimaginable damage was inflicted on Belarus, whose territory became a zone of ecological calamity. The Chernobyl NPP catastrophe led to the contamination of almost a quarter of the territory of Belarus, where 2.2 million people lived. The damage caused to the republic by the catastrophe amounts to 32 annual budgets of the republic of the pre-accident period, accounted over the 30-year period for its overcoming. The radioecological situation in Belarus is characterized by complex and heterogeneous contamination of the territory by different radionuclides and by their presence in all components of the environment. This gives rise to multiple pathways of external and internal irradiation of the population and jeopardizes its health. A worsening of the health of the population, both evacuated and still inhabiting the contaminated areas, has been registered, with an increase in the number of somatic diseases, including oncological diseases; there are disorders in the metabolic processes and functions of the main systems of the organism. The demographic indices are decreasing. Of particular concern are the growth of children's morbidity and the genetic consequences of the accident. The contamination of agricultural lands has made land in the zone neighbouring the Chernobyl NPP unusable for food production. On other lands it has been necessary to re-profile the farms and create new technologies of agricultural production. Destructive tendencies have been revealed in all spheres of the life activity of people who experienced radiation effects. The processes of social adaptation and socio-psychological support of the population require considerable optimization. In spite of the ten years passed after the catastrophe, the discrepancy of its estimations has not been overcome completely. At the same time

  9. The role of catastrophic geomorphic events in central Appalachian landscape evolution

    Science.gov (United States)

    Jacobson, R.B.; Miller, A.J.; Smith, J.A.

    1989-01-01

    Catastrophic geomorphic events are taken as those that are large, sudden, and rare on human timescales. In the nonglaciated, low-seismicity central Appalachians, these are dominantly floods and landslides. Evaluation of the role of catastrophic events in landscape evolution includes assessment of their contributions to denudation and formation of prominent landscape features, and how they vary through space and time. Tropical storm paths and topographic barriers at the Blue Ridge and Allegheny Front create significant climatic variability across the Appalachians. For moderate floods, the influence of basin geology is apparent in modifying severity of flooding, but for the most extreme events, flood discharges relate mainly to rainfall characteristics such as intensity, duration, storm size, and location. Landslide susceptibility relates more directly to geologic controls that determine what intensity and duration of rainfall will trigger slope instability. Large floods and landslides are not necessarily effective in producing prominent geomorphic features. Large historic floods in the Piedmont have been minimally effective in producing prominent and persistent geomorphic features. In contrast, smaller floods in the Valley and Ridge produced erosional and depositional features that probably will require thousands of years to efface. Scars and deposits of debris slide-avalanches triggered on sandstone ridges recover slowly and persist much longer than scars and deposits of smaller landslides triggered on finer-grained regolith, even though the smaller landslides may have eroded greater aggregate volume. The surficial stratigraphic record can be used to extend the spatial and temporal limits of our knowledge of catastrophic events. Many prominent alluvial and colluvial landforms in the central Appalachians are composed of sediments that were deposited by processes similar to those observed in historic catastrophic events. Available stratigraphic evidence shows two

  10. Estimation of the uncertainty of a climate model using an ensemble simulation

    Science.gov (United States)

    Barth, A.; Mathiot, P.; Goosse, H.

    2012-04-01

    The atmospheric forcings play an important role in the study of the ocean and sea-ice dynamics of the Southern Ocean. Errors in the atmospheric forcings inevitably result in uncertain model results. The sensitivity of the model results to errors in the atmospheric forcings is studied with ensemble simulations using multivariate perturbations of the atmospheric forcing fields. The numerical ocean model used is NEMO-LIM in a global configuration with a horizontal resolution of 2°. NCEP reanalyses are used to provide air temperature and wind data to force the ocean model over the last 50 years. A climatological mean is used to prescribe relative humidity, cloud cover and precipitation. In a first step, the model results are compared with OSTIA SST and OSI SAF sea-ice concentration for the southern hemisphere. The seasonal behavior of the RMS difference and bias in SST and ice concentration is highlighted, as are the regions with relatively high RMS errors and biases, such as the Antarctic Circumpolar Current and near the ice edge. Ensemble simulations are performed to statistically characterize the model error due to uncertainties in the atmospheric forcings. Such information is a crucial element for future data assimilation experiments. Ensemble simulations are performed with perturbed air temperature and wind forcings. A Fourier decomposition of the NCEP wind vectors and air temperature for 2007 is used to generate ensemble perturbations. The perturbations are scaled such that the resulting ensemble spread matches approximately the RMS differences between the satellite SST and sea-ice concentration. The ensemble spread and covariance are analyzed for the minimum and maximum sea-ice extent. It is shown that errors in the atmospheric forcings can extend to several hundred meters in depth near the Antarctic Circumpolar Current.
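    As a minimal illustration of the perturbation strategy this record describes, the sketch below generates Fourier-based ensemble perturbations of a one-dimensional forcing series and rescales them to a target spread. The forcing series, mode count and target RMS are hypothetical stand-ins, not values from the NEMO-LIM configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

def fourier_perturbations(forcing, n_members, target_rms, n_modes=20):
    """Generate ensemble perturbations of a 1-D forcing time series.

    Perturbations are random re-phasings/re-weightings of the leading
    Fourier modes of the original series, rescaled so the ensemble spread
    matches a prescribed target RMS (an illustrative stand-in for matching
    satellite-minus-model RMS differences)."""
    coeffs = np.fft.rfft(forcing)
    perturbations = []
    k = slice(1, n_modes + 1)  # perturb leading modes only, skip the mean
    for _ in range(n_members):
        pert_coeffs = np.zeros_like(coeffs)
        phases = rng.uniform(0.0, 2.0 * np.pi, n_modes)
        weights = rng.normal(0.0, 1.0, n_modes)
        pert_coeffs[k] = np.abs(coeffs[k]) * weights * np.exp(1j * phases)
        perturbations.append(np.fft.irfft(pert_coeffs, n=len(forcing)))
    perturbations = np.array(perturbations)
    # Scale so the mean ensemble standard deviation equals target_rms.
    spread = perturbations.std(axis=0).mean()
    return perturbations * (target_rms / spread)

# Hypothetical daily air-temperature forcing for one year.
t = np.arange(365)
air_temp = 10.0 + 8.0 * np.sin(2.0 * np.pi * t / 365.0)
members = air_temp + fourier_perturbations(air_temp, n_members=20, target_rms=1.5)
print(members.shape, members.std(axis=0).mean())
```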

  11. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point-collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For the modeling we used the TAU code, developed at DLR, Germany.
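    A hedged sketch of the simplest of the compared approaches, quasi-Monte Carlo estimation of output statistics: a low-discrepancy sample of two uncertain inputs (angle of attack and Mach number) is pushed through a cheap surrogate standing in for the flow solver. The surrogate formula and input ranges are illustrative assumptions, not the TAU setup.

```python
import numpy as np
from scipy.stats import qmc

def lift_coefficient(alpha_deg, mach):
    """Hypothetical cheap surrogate (thin-airfoil lift with a
    Prandtl-Glauert-style correction); a real study would call the flow
    solver here instead."""
    return 2.0 * np.pi * np.radians(alpha_deg) / np.sqrt(np.abs(1.0 - mach**2))

# Quasi-Monte Carlo (Halton) sample of the two uncertain inputs.
sampler = qmc.Halton(d=2, seed=0)
u = sampler.random(1024)
samples = qmc.scale(u, l_bounds=[1.0, 0.60], u_bounds=[3.0, 0.75])

cl = lift_coefficient(samples[:, 0], samples[:, 1])
print(f"mean C_L = {cl.mean():.4f}, std C_L = {cl.std(ddof=1):.4f}")
```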

  12. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss-of-mission and loss-of-crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design based on well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses, used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
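    The separation the abstract insists on can be sketched by keeping the epistemic variable in its own sampling loop and reporting the spread of the resulting risk estimates rather than a single pooled average. Every distribution, event and probability below is hypothetical, not taken from NASA's PRA models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Epistemic uncertainty: the component failure probability itself is not
# well known (hypothetical lognormal state-of-knowledge distribution).
def sample_epistemic(n):
    return rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=n)

def mission_loss_probability(p_fail, n_components=4):
    # Hypothetical series system: mission is lost if any component fails.
    return 1.0 - (1.0 - p_fail) ** n_components

n_outer = 10_000
p = sample_epistemic(n_outer)        # one epistemic "world" per sample
risk = mission_loss_probability(p)   # aleatory model evaluated per world

# Reporting the epistemic distribution of risk (percentiles) retains the
# lack-of-knowledge spread that pooling into a single Monte Carlo average
# would wash out.
print("mean risk:", risk.mean())
print("5th/95th epistemic percentiles:", np.percentile(risk, [5, 95]))
```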

  13. Managing Model Data Introduced Uncertainties in Simulator Predictions for Generation IV Systems via Optimum Experimental Design

    Energy Technology Data Exchange (ETDEWEB)

    Turinsky, Paul J [North Carolina State Univ., Raleigh, NC (United States); Abdel-Khalik, Hany S [North Carolina State Univ., Raleigh, NC (United States); Stover, Tracy E [North Carolina State Univ., Raleigh, NC (United States)

    2011-03-01

    to the design concept is quantitatively determined. A technique is then established to assimilate this data and produce a posteriori uncertainties on key attributes and responses of the design concept. Several experiment perturbations based on engineering judgment are used to demonstrate these methods and also serve as an initial generation for the optimization problem. Finally, an optimization technique is developed which simultaneously arrives at an optimized experiment and an optimized reactor design. Solution of this problem is made possible by the simulated annealing algorithm. The optimization examined in this work is based on maximizing the reactor cost savings associated with the modified design made possible by using the design margin gained through reduced basic nuclear data uncertainties. Cost values for experiment design specifications and reactor design specifications are established and used to compute the total savings by comparing the a posteriori reactor cost to the a priori cost plus the cost of the experiment. The optimized solution arrives at a maximized cost savings.
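    A minimal sketch of the simulated annealing loop the record relies on, applied to a smooth stand-in objective for (a posteriori cost savings minus experiment cost); the objective function, step size and cooling schedule are illustrative assumptions, not the study's.

```python
import math
import random

random.seed(0)

def total_savings(x):
    """Hypothetical smooth stand-in for net cost savings as a function of
    two experiment design variables; peaks at (2, -1) with value 5."""
    a, b = x
    return -((a - 2.0) ** 2 + (b + 1.0) ** 2) + 5.0

def simulated_annealing(objective, x0, t0=1.0, cooling=0.995, steps=20_000):
    """Maximize `objective` with a basic Metropolis-style annealing loop."""
    x, best = list(x0), list(x0)
    f_x, f_best = objective(x0), objective(x0)
    t = t0
    for _ in range(steps):
        candidate = [xi + random.gauss(0.0, 0.1) for xi in x]
        f_cand = objective(candidate)
        # Always accept improvements; accept worse moves with Boltzmann prob.
        if f_cand > f_x or random.random() < math.exp((f_cand - f_x) / t):
            x, f_x = candidate, f_cand
            if f_x > f_best:
                best, f_best = list(x), f_x
        t *= cooling  # geometric cooling schedule
    return best, f_best

best_x, best_f = simulated_annealing(total_savings, x0=[0.0, 0.0])
print(best_x, best_f)  # should approach (2, -1) with savings near 5
```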

  14. Applying Catastrophe Theory to an Information-Processing Model of Problem Solving in Science Education

    Science.gov (United States)

    Stamovlasis, Dimitrios; Tsaparlis, Georgios

    2012-01-01

    In this study, we test an information-processing model (IPM) of problem solving in science education, namely the working memory overload model, by applying catastrophe theory. Changes in students' achievement were modeled as discontinuities within a cusp catastrophe model, where working memory capacity was implemented as asymmetry and the degree…
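    For reference, cusp models of this kind are typically built on the canonical cusp potential below; the exact parameterization used in the study (with working memory capacity entering through the asymmetry factor) may differ.

```latex
% Canonical cusp potential: x is the state variable (achievement),
% a the asymmetry (normal) factor, b the bifurcation (splitting) factor.
V(x;\,a,b) \;=\; \tfrac{1}{4}x^{4} \;-\; \tfrac{1}{2}\,b\,x^{2} \;-\; a\,x,
\qquad
\frac{\partial V}{\partial x} \;=\; x^{3} - b\,x - a \;=\; 0 .
% Discontinuous jumps in the equilibrium occur on the bifurcation set
% 27\,a^{2} \;=\; 4\,b^{3}.
```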

  15. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear power plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks' method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, the development of standard UA and SA imposes a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this context, different techniques exist to reconstruct the probability distribution from the information provided by a sample of values, for example, finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
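    A brief sketch of the mixture-reconstruction step under stated assumptions: scikit-learn's GaussianMixture (EM, with k-means initialization by default) is fit to a synthetic bimodal sample standing in for a set of code outputs, and the component count is chosen by BIC. The data are fabricated for illustration; the paper's RELAP-5 outputs are not used.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)

# Hypothetical stand-in for a sample of a code output variable (e.g. a peak
# temperature from a set of BE-code runs) showing two response modes.
samples = np.concatenate([
    rng.normal(900.0, 15.0, 70),
    rng.normal(1050.0, 25.0, 30),
]).reshape(-1, 1)

# Fit candidate mixtures (EM, k-means init) and pick the order by BIC.
fits = [GaussianMixture(n_components=k, random_state=0).fit(samples)
        for k in (1, 2, 3)]
best = min(fits, key=lambda g: g.bic(samples))

print("components:", best.n_components)
print("weights:   ", best.weights_.round(3))
print("means:     ", best.means_.ravel().round(1))
```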

  16. Geologic storage of carbon dioxide and enhanced oil recovery. I. Uncertainty quantification employing a streamline based proxy for reservoir flow simulation

    International Nuclear Information System (INIS)

    Kovscek, A.R.; Wang, Y.

    2005-01-01

    Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline-based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to the true reservoir geology. The streamline approach is demonstrated to be thorough and rapid.

  17. A model of pathways to artificial superintelligence catastrophe for risk and decision analysis

    Science.gov (United States)

    Barrett, Anthony M.; Baum, Seth D.

    2017-03-01

    An artificial superintelligence (ASI) is an artificial intelligence that is significantly more intelligent than humans in all respects. Whilst ASI does not currently exist, some scholars propose that it could be created sometime in the future, and furthermore that its creation could cause a severe global catastrophe, possibly even resulting in human extinction. Given the high stakes, it is important to analyze ASI risk and factor the risk into decisions related to ASI research and development. This paper presents a graphical model of major pathways to ASI catastrophe, focusing on ASI created via recursive self-improvement. The model uses the established risk and decision analysis modelling paradigms of fault trees and influence diagrams in order to depict combinations of events and conditions that could lead to AI catastrophe, as well as intervention options that could decrease risks. The events and conditions include select aspects of the ASI itself as well as the human process of ASI research, development and management. Model structure is derived from published literature on ASI risk. The model offers a foundation for rigorous quantitative evaluation and decision-making on the long-term risk of ASI catastrophe.
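    To make the modelling paradigm concrete, here is a minimal fault-tree style calculation: independent basic events combined through AND/OR gates to give a top-event probability. The event names and numbers are purely hypothetical and are not taken from the paper's model.

```python
# Minimal sketch of a fault-tree evaluation with independent basic events.

def p_or(*ps):
    """Probability that at least one independent event occurs."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*ps):
    """Probability that all independent events occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical basic events on a pathway to catastrophe.
p_seed_ai_built = 0.5
p_takeoff_uncontained = 0.3
p_goals_unsafe = 0.4
p_deterrence_fails = 0.8

p_catastrophe = p_and(p_seed_ai_built,
                      p_takeoff_uncontained,
                      p_or(p_goals_unsafe, p_deterrence_fails))
print(f"top-event probability: {p_catastrophe:.3f}")
```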

  18. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  19. Catastrophic costs potentially averted by tuberculosis control in India and South Africa: a modelling study

    NARCIS (Netherlands)

    Verguet, Stéphane; Riumallo-Herl, Carlos; Gomez, Gabriela B.; Menzies, Nicolas A.; Houben, Rein M. G. J.; Sumner, Tom; Lalli, Marek; White, Richard G.; Salomon, Joshua A.; Cohen, Ted; Foster, Nicola; Chatterjee, Susmita; Sweeney, Sedona; Baena, Inés Garcia; Lönnroth, Knut; Weil, Diana E.; Vassall, Anna

    2017-01-01

    The economic burden on households affected by tuberculosis through costs to patients can be catastrophic. WHO's End TB Strategy recognises and aims to eliminate these potentially devastating economic effects. We assessed whether aggressive expansion of tuberculosis services might reduce catastrophic

  20. Factors affecting catastrophic health expenditure and impoverishment from medical expenses in China: policy implications of universal health insurance.

    Science.gov (United States)

    Li, Ye; Wu, Qunhong; Xu, Ling; Legge, David; Hao, Yanhua; Gao, Lijun; Ning, Ning; Wan, Gang

    2012-09-01

    To assess the degree to which the Chinese people are protected from catastrophic household expenditure and impoverishment from medical expenses, and to explore the health system and structural factors influencing the first of these outcomes. Data were derived from the Fourth National Health Service Survey. An analysis of catastrophic health expenditure and impoverishment from medical expenses was undertaken with a sample of 55,556 households of different characteristics, located in rural and urban settings in different parts of the country. Logistic regression was used to identify the determinants of catastrophic health expenditure. The rate of catastrophic health expenditure was 13.0%; that of impoverishment was 7.5%. Rates of catastrophic health expenditure were higher among households having members who were hospitalized, elderly, or chronically ill, as well as among households in rural or poorer regions. A combination of adverse factors increased the risk of catastrophic health expenditure. Families enrolled in the urban employee or resident insurance schemes had lower rates of catastrophic health expenditure than those enrolled in the new rural cooperative scheme. The need for and use of health care, demographics, type of benefit package and type of provider payment method were the determinants of catastrophic health expenditure. Although China has greatly expanded health insurance coverage, financial protection remains insufficient. Policy-makers should focus on designing improved insurance plans by expanding the benefit package, redesigning cost-sharing arrangements and provider payment methods, and developing more effective expenditure control strategies.

  1. Catastrophic out-of-pocket payments for health and poverty nexus: evidence from Senegal.

    Science.gov (United States)

    Séne, Ligane Massamba; Cissé, Momath

    2015-09-01

    Out-of-pocket payments are the primary source through which health expenditure is met in Senegal. However, these payments are financial burdens that lead to impoverishment when they become catastrophic. The purpose of this study is to cast light on the determinants of catastrophic household out-of-pocket health expenditures and to assess their implications for poverty. The 2011 poverty monitoring survey is used in this study. This survey aims to draw poverty profiles and to highlight the socio-economic characteristics of different social groups. In line with the concerns raised by the new Supplemental Poverty Measure, poverty statistics are adjusted to take into account household health expenditures and to estimate their impoverishing effects. To identify the determinants of the magnitude of catastrophic health expenditure, we implement a seemingly unrelated system of Tobit regressions that takes censoring into account through a conditional mixed-process estimator procedure. We identify major causes of catastrophic expenditures, such as the level of overall health spending, the expensiveness of health goods and services, the characteristics of health facilities, health stock shocks, and the lack of insurance. Results show evidence that catastrophic health expenditures jeopardize household welfare, with some households falling into poverty as a result of negative effects on disposable income and disruption of their material living standards. Our findings warrant further policy improvements to minimize the financial risks of out-of-pocket health expenditures and to increase the efficiency of the health care system for more effective poverty reduction strategies.

  2. UNCERTAINTIES IN GALACTIC CHEMICAL EVOLUTION MODELS

    International Nuclear Information System (INIS)

    Côté, Benoit; Ritter, Christian; Herwig, Falk; O’Shea, Brian W.; Pignatari, Marco; Jones, Samuel; Fryer, Chris L.

    2016-01-01

    We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions, along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. The uncertainty in our chemical evolution model
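    The Monte Carlo procedure described here is easy to sketch: draw the uncertain input parameters from their distributions, run the (here, toy) one-zone model for each draw, and report percentile bands of the resulting tracks. The stand-in model and parameter distributions below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(3)

def toy_one_zone(imf_slope, n_ia, t):
    """Hypothetical stand-in for a one-zone model: returns an
    abundance-like track versus galactic age (Gyr); not the paper's code."""
    cc = (2.35 / imf_slope) * (1.0 - np.exp(-t / 2.0))  # prompt (massive star) part
    ia = 1.0e3 * n_ia * (1.0 - np.exp(-t / 5.0))        # delayed SN Ia part
    return np.log10(cc + ia + 1e-12)

t = np.linspace(0.1, 13.0, 100)
runs = np.array([
    toy_one_zone(rng.normal(2.35, 0.2),               # IMF high-mass slope
                 max(rng.normal(1.5e-3, 5e-4), 0.0),  # SNe Ia per Msun formed
                 t)
    for _ in range(500)
])

# Median track with 68% and 95% confidence bands, as the paper reports.
p2, p16, p50, p84, p98 = np.percentile(runs, [2.5, 16, 50, 84, 97.5], axis=0)
print("68% band thickness at 13 Gyr:", round((p84 - p16)[-1], 3))
```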

  3. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric Todd; Eldred, Michael S; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  4. Household catastrophic healthcare expenditure and impoverishment due to rotavirus gastroenteritis requiring hospitalization in Malaysia.

    Directory of Open Access Journals (Sweden)

    Tharani Loganathan

    Full Text Available While healthcare costs for rotavirus gastroenteritis requiring hospitalization may be burdensome on households in Malaysia, exploration on the distribution and catastrophic impact of these expenses on households are lacking.We assessed the economic burden, levels and distribution of catastrophic healthcare expenditure, the poverty impact on households and inequities related to healthcare payments for acute gastroenteritis requiring hospitalization in Malaysia.A two-year prospective, hospital-based study was conducted from 2008 to 2010 in an urban (Kuala Lumpur and rural (Kuala Terengganu setting in Malaysia. All children under the age of 5 years admitted for acute gastroenteritis were included. Patients were screened for rotavirus and information on healthcare expenditure was obtained.Of the 658 stool samples collected at both centers, 248 (38% were positive for rotavirus. Direct and indirect costs incurred were significantly higher in Kuala Lumpur compared with Kuala Terengganu (US$222 Vs. US$45; p<0.001. The mean direct and indirect costs for rotavirus gastroenteritis consisted 20% of monthly household income in Kuala Lumpur, as compared with only 5% in Kuala Terengganu. Direct medical costs paid out-of-pocket caused 141 (33% households in Kuala Lumpur to experience catastrophic expenditure and 11 (3% households to incur poverty. However in Kuala Terengganu, only one household (0.5% experienced catastrophic healthcare expenditure and none were impoverished. The lowest income quintile in Kuala Lumpur was more likely to experience catastrophic payments compared to the highest quintile (87% vs 8%. The concentration index for out-of-pocket healthcare payments was closer to zero at Kuala Lumpur (0.03 than at Kuala Terengganu (0.24.While urban households were wealthier, healthcare expenditure due to gastroenteritis had more catastrophic and poverty impact on the urban poor. Universal rotavirus vaccination would reduce both disease burden and health

  5. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    Science.gov (United States)

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
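    The computational point, that aggregate uncertainty requires joint (spatially correlated) simulation rather than per-pixel marginals, can be shown in a few lines. The toy posterior below (a transect of pixels with an exponential correlation model) is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy posterior for prevalence at 200 pixels along a transect: mean 0.3,
# pixel sd 0.05, exponentially decaying spatial correlation (range 25 px).
n = 200
mean = np.full(n, 0.30)
sd = np.full(n, 0.05)
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
cov = np.exp(-dist / 25.0) * np.outer(sd, sd)

# Joint simulation: draw whole maps at once, then aggregate each draw.
draws = rng.multivariate_normal(mean, cov, size=5000)
regional_mean = draws.mean(axis=1)

# Treating pixels as independent badly understates aggregate uncertainty.
indep_sd = np.sqrt((sd**2).sum()) / n
print("joint-simulation sd of regional mean:", regional_mean.std().round(4))
print("independence-assumption sd:          ", round(indep_sd, 4))
```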

  6. Uncertainty vs. learning in climate policy: Some classical results and new directions

    Energy Technology Data Exchange (ETDEWEB)

    Lange, A. [Univ. of Maryland (United States); Treich, N. [Univ. of Toulouse (France)

    2007-07-01

    Climate policy decisions today have to be made under substantial uncertainty: the impact of accumulating greenhouse gases in the atmosphere is not perfectly known, and the future economic and social consequences of climate change, in particular the valuation of possible damages, are uncertain. However, learning will change the basis for making future decisions on abatement policies. These important issues of uncertainty and learning are often presented in a colloquial sense, with two opposing effects typically put forward. First, uncertainty about future climate damage, which is often associated with the possibility of a catastrophic scenario, is said to give a premium to slowing down global warming and therefore to increasing abatement efforts today. Second, learning opportunities will reduce scientific uncertainty about climate damage over time; this is often used as an argument to postpone abatement efforts until new information is received. The effects of uncertainty and learning on the optimal design of current climate policy are still much debated in both the academic and the political arena. In this paper, the authors study and contrast the effects of uncertainty and learning in a two-decision model that encompasses most existing microeconomic models of climate change. They first consider the common expected utility framework: while uncertainty generally has no effect or a negative effect on welfare, learning always has a positive, and thus opposite, effect. The effects of both uncertainty and learning on decisions are less clear. Neither uncertainty nor learning can be used as an argument to increase or reduce emissions today, independently of the degree of risk aversion of the decision-maker and of the nature of irreversibility constraints. The authors then deviate from the expected utility framework and consider a model with ambiguity aversion. The model accounts well for situations of imprecise or multiple probability distributions, as present in the context of climate

  7. Development of the Catastrophe Bonds and their correlation with other financial instruments

    OpenAIRE

    Čavojec, Ján

    2009-01-01

    This master thesis discusses a niche of the reinsurance business -- catastrophe bonds. It provides a brief description of reinsurance in general, insurance-linked securities and catastrophe bonds. The goal of this thesis is to describe the development of the cat bond market and the influence of economic and natural shocks on it. In order to analyze these effects, quarterly issuance data are used together with Swiss Re Cat Bond return indexes. In addition, several other variables (i.e. Munich Re and Swi...

  8. Piezoelectric energy harvesting with parametric uncertainty

    International Nuclear Information System (INIS)

    Ali, S F; Friswell, M I; Adhikari, S

    2010-01-01

    The design and analysis of energy harvesting devices has become increasingly important in recent years. Most of the literature has focused on the deterministic analysis of these systems, and the problem of uncertain parameters has received less attention. Energy harvesting devices exhibit parametric uncertainty due to errors in measurement, errors in modelling and variability in the parameters during manufacture. This paper investigates the effect of parametric uncertainty in the mechanical system on the harvested power, and derives approximate explicit formulae for the optimal electrical parameters that maximize the mean harvested power. The maximum of the mean harvested power decreases with increasing uncertainty, and the optimal frequency at which the maximum mean power occurs shifts. The effects of the parameter variance on the optimal electrical time constant and optimal coupling coefficient are reported. Monte Carlo based simulation results are used to further analyse the system under parametric uncertainty.
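    A hedged sketch of the Monte Carlo part: with the natural frequency treated as uncertain, the mean power curve flattens and its peak drops relative to the deterministic curve, mirroring the paper's conclusion. The resonance-shaped power function below is a generic stand-in, not the paper's electromechanical model.

```python
import numpy as np

rng = np.random.default_rng(5)

def harvested_power(omega, omega_n, zeta=0.05):
    """Illustrative resonance-shaped power curve for a base-excited
    harvester; stands in for the full electromechanical model."""
    r = omega / omega_n
    return r**2 / ((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)

omega = np.linspace(0.5, 1.5, 600)

# Deterministic curve vs. mean curve under 3% uncertainty in natural frequency.
deterministic = harvested_power(omega, omega_n=1.0)
samples = rng.normal(1.0, 0.03, size=2000)
mean_power = np.mean([harvested_power(omega, wn) for wn in samples], axis=0)

for label, p in [("deterministic", deterministic),
                 ("mean under uncertainty", mean_power)]:
    i = np.argmax(p)
    print(f"{label}: peak {p[i]:.1f} at omega = {omega[i]:.3f}")
```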

  9. Networks of Zeeman catastrophe machines for the investigation of complex systems

    International Nuclear Information System (INIS)

    Nagy, Péter; Tasnádi, Péter

    2014-01-01

    The investigation of chaotic motion and cooperative systems presents a great opportunity to introduce modern physics into the basic mechanics course taught to BSc-level students. In our previous paper (2014 Eur. J. Phys. 35 015018), it was demonstrated that a Zeeman machine can be a versatile and motivating tool for students to gain introductory knowledge about chaotic motion via interactive simulations. Although the Zeeman machine is known mainly for its quasi-static and catastrophic behaviour, its dynamic properties are also very interesting and show typical chaotic features. In this paper, we present a novel construction consisting of Zeeman machines linked into a network. Although Zeeman networks can be built with almost arbitrary topology, our discussion is restricted to a system where Zeeman machines are arranged in a two-dimensional periodic lattice and the angular variables of the machines are limited to discrete values only. It will be shown that the Zeeman crystal is appropriate for studying the properties of a broad range of complex systems. Using NetLogo simulations, (second- and first-order) phase transitions and ferromagnetic- and antiferromagnetic-type behaviour are demonstrated. A limiting case of the theoretical model of the Zeeman crystal leads to a model that is analogous to the Potts clock model used frequently in statistical physics. The present paper is organically linked to our website (http://csodafizika.hu/zeeman), where the downloadable simulations discussed in the paper can be found. (paper)

  10. GLACIER DEGRADATION AND CATASTROPHIC MUDFLOWS ORIGIN FROM THE MODERN GLACIAL-MORAINE BODIES IN THE ELBRUS REGION

    Directory of Open Access Journals (Sweden)

    E. A. Zolotarev

    2012-01-01

    The mechanism of formation of catastrophic mudflows in different glacial valleys of the Elbrus region at the present stage of glacier degradation is described. Remote monitoring of changes in the glacial-moraine complex of the Kayarta river revealed the important role of buried ice in the formation of the catastrophic mudflows that affected Tyrnyauz in the XX century. The dynamics of the glacial lakes in the Adyl-Su valley in the Bashkara Glacier region is described, and the probability of their breakthrough is estimated. Remote monitoring also yielded quantitative indicators of the dynamics of the landslide in the Kubasanty valley and revealed its influence on the formation of catastrophic mudflows. Various possible methods of preventing catastrophic mudflows that do not require expensive protective constructions are discussed.

  11. The application of catastrophe theory to image analysis

    NARCIS (Netherlands)

    Kuijper, A.; Florack, L.M.J.

    2001-01-01

    In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of annihilations and the

  12. Three Solvable Matrix Models of a Quantum Catastrophe

    Czech Academy of Sciences Publication Activity Database

    Levai, G.; Růžička, František; Znojil, Miloslav

    2014-01-01

    Roč. 53, č. 9 (2014), s. 2875-2890 ISSN 0020-7748 Institutional support: RVO:61389005 Keywords : quantum theory * PT symmetry * Finite-dimensional non-Hermitian Hamiltonians * exceptional-point localization * quantum theory of catastrophes * methods of computer algebra Subject RIV: BE - Theoretical Physics Impact factor: 1.184, year: 2014

  13. Uncertainty analysis of flexible rotors considering fuzzy parameters and fuzzy-random parameters

    Directory of Open Access Journals (Sweden)

    Fabian Andres Lara-Molina

    The components of flexible rotors are subjected to uncertainties, the main sources of which include the variation of mechanical properties. This contribution aims at analyzing the dynamics of flexible rotors under uncertain parameters modeled as fuzzy and fuzzy random variables. The uncertainty analysis encompasses the modeling of the uncertain parameters and the numerical simulation of the corresponding flexible rotor model using an approach based on fuzzy dynamic analysis. The numerical simulation is accomplished by mapping the fuzzy parameters of the deterministic flexible rotor model. Thereby, the flexible rotor is modeled using both the Fuzzy Finite Element Method and the Fuzzy Stochastic Finite Element Method. Numerical simulations illustrate the methodology in terms of orbits and frequency response functions subject to uncertain parameters.

  14. HTGR reactor physics, thermal-hydraulics and depletion uncertainty analysis: a proposed IAEA coordinated research project

    International Nuclear Information System (INIS)

    Tyobeka, Bismark; Reitsma, Frederik; Ivanov, Kostadin

    2011-01-01

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high-fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis and uncertainty analysis methods. In order to benefit from recent advances in modeling and simulation and from the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed to quantify the impact of different sources of uncertainty on the design and safety parameters of HTGRs. Uncertainty and sensitivity studies are an essential component of any significant effort in data and simulation improvement. In February 2009, the Technical Working Group on Gas-Cooled Reactors recommended that the proposed IAEA Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modeling be implemented. This paper presents the current status and plan. The CRP will also benefit from interactions with the ongoing OECD/NEA Light Water Reactor (LWR) UAM benchmark activity by taking into consideration the peculiarities of HTGR designs and simulation requirements. (author)

  15. Chernobyl: Endless horror. Late effects of the reactor catastrophe

    International Nuclear Information System (INIS)

    Roethlein, B.

    1996-01-01

    Ten years after the accident, the people of Chernobyl are trying to live a normal life, but the problems resulting from the catastrophe have not been solved. Some of them are just starting to emerge. (orig.)

  16. The reciprocal relationship between daily fatigue and catastrophizing following cancer treatment: Affect and physical activity as potential mediators.

    Science.gov (United States)

    Müller, Fabiola; Stephenson, Ellen; DeLongis, Anita; Smink, Ans; Van Ginkel, Robert J; Tuinman, Marrit A; Hagedoorn, Mariët

    2018-03-01

    Fatigue is a distressing symptom many cancer patients experience even after completion of treatment. Although theory and empirical evidence indicate that negative cognitions perpetuate fatigue after completion of treatment, insight into how this process unfolds in daily life is limited. This study used an intensive longitudinal design to investigate the reciprocal relationship between catastrophizing and fatigue in daily life and whether affective and behavioral processes mediate these relationships. Post-treatment colorectal cancer patients (n = 101) completed daily diaries (14 days, 3 times daily) regarding their fatigue, catastrophizing, positive and negative affect, and physical activity. Multilevel modeling was applied to investigate within-person associations within days. Analyses revealed a positive reciprocal relationship between fatigue and catastrophizing throughout the day. That is, high levels of catastrophizing were associated with increases in fatigue within patients. In turn, but to a lesser extent, high levels of fatigue predicted increases in catastrophizing at the next assessment. Low positive affect and high negative affect mediated the effect of catastrophizing on increases in fatigue. Only negative affect mediated the reverse relationship. Physical activity did not mediate either relationship. This study provides evidence for a mutually reinforcing relationship between catastrophizing and fatigue in daily life, which might explain the perpetuation of fatigue after completion of cancer treatment. Fatigue-specific cognitive behavior therapy could be improved by educating patients about this daily reciprocal relationship, train them to quickly replace catastrophizing thoughts in daily life, and help them to cope with affective changes induced by fatigue. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Stochastic catastrophe theory and instabilities in plasma turbulence

    International Nuclear Information System (INIS)

    Rajkovic, Milan; Skoric, Milos

    2009-01-01

    Full text: A Langevin equation (LE) describing the evolution of turbulence amplitude in plasma is analyzed from the perspective of stochastic catastrophe theory (SCT), so that turbulent plasma is considered as a stochastic gradient system. According to SCT, the dynamics of the system is completely determined by the stochastic potential function, and the maximum likelihood estimates of stable and unstable equilibria are associated with the modes and anti-modes, respectively, of the system's stationary probability density function. First-order phase transitions occur at degenerate equilibrium points, and the potential function at these points may be represented in a generic way. Since the diffusion function of the plasma LE is not constant, the stationary probability density function (pdf) is not a reliable estimator of the number of stable states. We show that the generalized pdf, represented as the product of the stationary pdf and the diffusion function, is a reliable estimator of the stable states, and that it can be evaluated from a zero-mean-crossing analysis of the plasma turbulence signal. Stochastic bifurcations, and particularly sudden (catastrophic) ones, are recognized from the pdfs obtained by the zero-crossing analysis, and we illustrate the applications of SCT to plasma turbulence on data obtained from MAST (Mega Ampere Spherical Tokamak) for low (L), high (H) and unstable dithering (L/H) confinement regimes. The relationship between the transformation-invariant zero-crossing function and SCT is shown to provide important information about the nature of edge localized modes (ELMs) and the L-H transition. Finally, we show that ELMs occur as a result of catastrophic (hard) bifurcations, ruling out the self-organized criticality scenario for their origin. (author)

  18. Modeling Nonlinear Site Response Uncertainty in Broadband Ground Motion Simulations for the Los Angeles Basin

    Science.gov (United States)

    Assimaki, D.; Li, W.; Steidl, J. M.; Schmedes, J.

    2007-12-01

    The assessment of strong motion site response is of great significance, both for mitigating seismic hazard and for performing detailed analyses of earthquake source characteristics. There currently exists, however, a large degree of uncertainty concerning the mathematical model to be employed for the computationally efficient evaluation of local site effects, and the site investigation program necessary to evaluate the nonlinear input model parameters and ensure cost-effective predictions; and while site response observations may provide critical constraints on interpretation methods, the lack of a statistically significant number of in-situ strong motion records prohibits statistical analyses from being conducted and uncertainties from being quantified based entirely on field data. In this paper, we combine downhole observations and broadband ground motion synthetics for characteristic site conditions in the Los Angeles Basin, and investigate the variability in ground motion estimation introduced by the site response assessment methodology. In particular, site-specific regional velocity and attenuation structures are initially compiled using near-surface geotechnical data collected at downhole geotechnical arrays, inverse low-strain velocity and attenuation profiles at these sites obtained by inversion of weak motion records, and the crustal velocity structure at the corresponding locations obtained from the Southern California Earthquake Center Community Velocity Model. Successively, broadband ground motions are simulated by means of a hybrid low/high-frequency finite source model with correlated random parameters for rupture scenarios of weak, medium and large magnitude events (M = 3.5-7.5). Observed estimates of site response at the stations of interest are first compared to the ensemble of approximate and incremental nonlinear site response models. Parametric studies are next conducted for each fixed magnitude (fault geometry) scenario by varying the source-to-site distance and

  19. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well-characterized repository and site.
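    The comparison between the arithmetic mean and the 90th percentile as summary statistics can be reproduced on a synthetic heavy-tailed dose sample; the lognormal parameters and bootstrap settings below are illustrative assumptions, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical annual doses from 1000 Monte Carlo runs: heavy-tailed, as
# when input parameters range over several orders of magnitude.
doses = rng.lognormal(mean=np.log(1e-5), sigma=1.5, size=1000)

print("arithmetic mean :", doses.mean())
print("median          :", np.median(doses))
print("90th percentile :", np.percentile(doses, 90))

# Bootstrap 95% confidence interval for the 90th percentile, illustrating
# how robust that estimator is compared with the mean for skewed outputs.
boot = np.array([np.percentile(rng.choice(doses, doses.size), 90)
                 for _ in range(2000)])
print("90th pct 95% CI :", np.percentile(boot, [2.5, 97.5]))
```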

  20. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, which provides the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, through incorporating the concepts of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results not only facilitate identification of optimal effluent-trading schemes, but also provide insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk affects the decision alternatives on the trading scheme as well as the system benefit. Compared with conventional optimization methods, BESMA proves advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties existing in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision

  1. Catastrophe Theory: A Unified Model for Educational Change.

    Science.gov (United States)

    Cryer, Patricia; Elton, Lewis

    1990-01-01

    Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force-field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)

  2. Metallicity at interphase boundaries due to polar catastrophe induced by charge density discontinuity

    KAUST Repository

    Albar, Arwa

    2018-02-09

    The electronic properties of interphase boundaries are of basic importance for most materials, particularly when those properties deviate strongly from the bulk behavior. We introduce a mechanism that can result in metallicity at stoichiometric interphase boundaries between semiconductors based on the idea of polar catastrophe, which is usually considered only in the context of heterostructures. To this end, we perform ab initio calculations within density functional theory to investigate the electronic states at stoichiometric SnO/SnO2 (110) interphase boundaries. In this system, one would not expect polar catastrophe to have a role according to state-of-the-art theory because the interface lacks formal charge discontinuity. However, we observe the formation of a hole gas between the semiconductors SnO and SnO2. To explain these findings, we provide a generalized theory based on the idea that the charge density discontinuity between SnO and SnO2, a consequence of lattice mismatch, drives a polar catastrophe scenario. As a result, SnO/SnO2 (110) interphase boundaries can develop metallicity depending on the grain size. The concept of metallicity due to polar catastrophe induced by charge density discontinuity is of general validity and applies to many interphase boundaries with lattice mismatch.

  3. Simulating range-wide population and breeding habitat dynamics for an endangered woodland warbler in the face of uncertainty

    Science.gov (United States)

    Adam Duarte,; Hatfield, Jeffrey; Todd M. Swannack,; Michael R. J. Forstner,; M. Clay Green,; Floyd W. Weckerly,

    2015-01-01

    Population viability analyses provide a quantitative approach that seeks to predict the possible future status of a species of interest under different scenarios and, therefore, can be an important component of large-scale species conservation programs. We created a model and simulated range-wide population and breeding-habitat dynamics for an endangered woodland warbler, the golden-cheeked warbler (Setophaga chrysoparia). Habitat-transition probabilities were estimated across the warbler's breeding range by combining National Land Cover Database imagery with multistate modeling. Using these estimates, along with recently published demographic estimates, we examined whether the species can remain viable into the future under current conditions. Lastly, we evaluated whether protecting a greater amount of habitat would increase the number of warblers that can be supported in the future, by systematically increasing the amount of protected habitat and comparing the estimated terminal carrying capacity at the end of 50 years of simulated habitat change. The estimated habitat-transition probabilities supported the hypothesis that habitat transitions are unidirectional, whereby habitat is more likely to diminish than regenerate. The model results indicated that population viability could be achieved under current conditions, depending on dispersal. However, there is considerable uncertainty associated with the population projections due to parametric uncertainty. Model results suggested that increasing the amount of protected lands would have a substantial impact on terminal carrying capacities at the end of a 50-year simulation. Notably, this study identifies the need to collect the data required to estimate demographic parameters in relation to changes in habitat metrics and population density in multiple regions, and highlights the importance of establishing a common definition of what constitutes protected habitat, and what management goals are suitable within those protected
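    A toy version of the habitat side of such a simulation: a two-state annual transition matrix with near-unidirectional loss, projected 50 years under different amounts of protected (non-transitioning) habitat, with terminal carrying capacity taken proportional to remaining habitat. All numbers are hypothetical, not the published estimates.

```python
import numpy as np

# Hypothetical annual habitat-transition matrix (states: habitat, non-habitat).
# Loss is far more likely than regeneration, mimicking the unidirectional
# transitions the study reports.
P = np.array([[0.98, 0.02],     # habitat -> habitat, habitat -> non-habitat
              [0.002, 0.998]])  # non-habitat -> habitat, non-habitat -> non-habitat

def project_habitat(initial_ha, protected_ha, years=50):
    """Project habitat area, holding the protected fraction fixed."""
    unprotected = np.array([initial_ha - protected_ha, 0.0])
    for _ in range(years):
        unprotected = unprotected @ P
    return protected_ha + unprotected[0]

# Terminal carrying capacity assumed proportional to remaining habitat.
pairs_per_ha = 0.05  # hypothetical density of breeding pairs
for protected in (0.0, 50_000.0, 150_000.0):
    ha = project_habitat(initial_ha=400_000.0, protected_ha=protected)
    print(f"protected {protected:>9,.0f} ha -> K = {ha * pairs_per_ha:,.0f} pairs")
```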

  4. Approach to uncertainty evaluation for safety analysis

    International Nuclear Information System (INIS)

    Ogura, Katsunori

    2005-01-01

    Nuclear power plant safety used to be verified and confirmed through accident simulations using computer codes, generally because it is very difficult to perform integrated experiments or tests for the verification and validation of plant safety, due to radioactive consequences, cost, and scaling to the actual plant. Traditionally, plant safety was secured by a sufficient safety margin through the conservative assumptions and models applied to those simulations. Meanwhile, best-estimate analyses based on realistic assumptions and models, supported by accumulated insights, have recently become possible, reducing the safety margin in the analysis results and increasing the need to evaluate the reliability or uncertainty of the analysis results. This paper introduces an approach to evaluate the uncertainty of accident simulations and their results. (Note: This research was done not at the Japan Nuclear Energy Safety Organization but at the Tokyo Institute of Technology.) (author)

  5. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations of energy consumption occurring in nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both the input parameters used in energy consumption calculations and the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation, so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  6. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
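
    The global sensitivity analysis referred to in both versions of this record is variance-based. As a self-contained sketch of first-order Sobol indices, the pick-freeze (Saltelli) estimator below is applied to the standard Ishigami test function rather than to a scramjet simulation:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # Standard GSA test function; known indices S1 ~ 0.31, S2 ~ 0.44, S3 = 0
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2 + b * x[:, 2]**4 * np.sin(x[:, 0])

rng = np.random.default_rng(0)
n, d = 200_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))
B = rng.uniform(-np.pi, np.pi, (n, d))
fA, fB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                                  # vary only coordinate i
    S_i = np.mean(fB * (ishigami(ABi) - fA)) / var       # Saltelli (2010) estimator
    print(f"S_{i+1} ~ {S_i:.3f}")
```

    Inputs with negligible indices can then be frozen at nominal values, which is how a sensitivity study reduces the stochastic dimension before the more expensive UQ stages.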

  8. Characterization uncertainty and its effects on models and performance

    International Nuclear Information System (INIS)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization

  9. Nuclear catastrophe in Japan. Health consequences resulting from Fukushima

    International Nuclear Information System (INIS)

    Paulitz, Henrik; Eisenberg, Winfrid; Thiel, Reinhold

    2013-01-01

    external radiation exposure would amount to between 37,899 and 82,606 cases, while 37,266 cancer cases would result from the intake of contaminated food. With respect to the workers, who, according to the Fukushima operating company Tepco, were on duty in the damaged plant in 2011, IPPNW estimates on the basis of Chernobyl experiences that more than 17,000 of them will develop serious diseases. A few of the quantitative results of this study are subject to uncertainty, because some of the original data has only been published in an imprecise form and certain calculations involved making further assumptions necessary. Nevertheless, IPPNW has deemed it necessary to present this quantitative estimate in order to show clearly the true dimension of the Fukushima nuclear catastrophe. At present, there are numerous nuclear power plants operating at sites facing the potential risk of an earthquake all over the world. Many of them are much less able to withstand the force of an earthquake than nuclear reactors in Japan. Even a relatively weak earthquake could, at any time, trigger another nuclear catastrophe almost anywhere, in Asia, America, and also in Europe.

  10. A CATASTROPHIC-CUM-RESTORATIVE QUEUING SYSTEM WITH CORRELATED BATCH ARRIVALS AND VARIABLE CAPACITY

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar

    2008-07-01

    Full Text Available In this paper, we study a catastrophic-cum-restorative queuing system with correlated batch arrivals and service in batches of variable sizes. We perform the transient analysis of the queuing model. We obtain the Laplace Transform of the probability generating function of system size. Finally, some particular cases of the model have been derived and discussed. Keywords: Queue length, Catastrophes, Correlated batch arrivals, Broadband services, Variable service capacity, and Restoration.
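
    The paper works with transforms rather than simulation, and its model (correlated batch arrivals, variable service capacity) is richer than anything reproduced here. Still, a discrete-event simulation of a plain M/M/1 queue subject to Poisson catastrophes that empty the system conveys the basic mechanism; all rates below are arbitrary:

```python
import random

def simulate(lam=0.8, mu=1.0, kappa=0.05, t_end=100_000.0, seed=1):
    """M/M/1 queue with Poisson catastrophes (rate kappa) that instantly
    clear all customers; returns the time-averaged queue length."""
    random.seed(seed)
    t, n, area = 0.0, 0, 0.0
    while t < t_end:
        rate = lam + kappa + (mu if n > 0 else 0.0)   # total event rate
        dt = random.expovariate(rate)
        t += dt
        area += n * dt
        u = random.random() * rate
        if u < lam:
            n += 1            # arrival
        elif u < lam + kappa:
            n = 0             # catastrophe: queue is wiped out
        elif n > 0:
            n -= 1            # service completion
    return area / t

print(f"mean queue length ~ {simulate():.2f}")
```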

  11. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties present at high (>9-10) and low (<4) pH when the titration data were modelled with monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.

  12. A probabilistic strategy for parametric catastrophe insurance

    Science.gov (United States)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
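
    The abstract describes a logistic regression that maps a loss index to a probability of a loss event, so that payouts can be probabilistic rather than all-or-nothing. A toy version on synthetic data (the variable names and coefficients are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic history: a daily rainfall-based loss index vs. observed loss events
n = 2000
rain_index = rng.gamma(shape=2.0, scale=10.0, size=n)
p_true = 1.0 / (1.0 + np.exp(-(0.15 * rain_index - 6.0)))   # assumed relation
loss_event = rng.random(n) < p_true

model = LogisticRegression().fit(rain_index.reshape(-1, 1), loss_event)

# Instead of a hard deterministic trigger, the model issues a probability
# of a loss event for any given day's index value:
print(f"P(loss event | index=45) = {model.predict_proba([[45.0]])[0, 1]:.2f}")
```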

  13. Comparison between conservative perturbation and sampling based methods for propagation of Non-Neutronic uncertainties

    International Nuclear Information System (INIS)

    Campolina, Daniel de A.M.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2013-01-01

    For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for a best estimate calculation that has been replacing the conservative model calculations as the computational power increases. The propagation of uncertainty in a simulation using sampling based method is recent because of the huge computational effort required. In this work a sample space of MCNP calculations were used as a black box model to propagate the uncertainty of system parameters. The efficiency of the method was compared to a conservative method. Uncertainties in input parameters of the reactor considered non-neutronic uncertainties, including geometry dimensions and density. The effect of the uncertainties on the effective multiplication factor of the system was analyzed respect to the possibility of using many uncertainties in the same input. If the case includes more than 46 parameters with uncertainty in the same input, the sampling based method is proved to be more efficient than the conservative method. (author)

  14. Classical and quantum fold catastrophe in the presence of axial symmetry

    Science.gov (United States)

    Dhont, G.; Zhilinskií, B. I.

    2008-11-01

    We introduce a family of Hamiltonians with two degrees of freedom, axial symmetry and complete integrability. The potential function depends on coordinates and one control parameter. A fold catastrophe typically occurs in such a family of potentials and its consequences on the global dynamics are investigated through the energy-momentum map which defines the singular fibration of the four-dimensional phase space. The two inequivalent local canonical forms of the catastrophe are presented: the first case corresponds to the appearance of a second sheet in the image of the energy-momentum map while the second case is associated with the breaking of an already existing second sheet. A special effort is placed on the description of the singularities. In particular, the existence of cuspidal tori is related to a second-order contact point between the energy level set and the reduced phase space. The quantum mechanical aspects of the changes induced by the fold catastrophe are investigated with the quantum eigenstates computed for an octic potential and are interpreted through the quantum-classical correspondence. We note that the singularity exposed in this paper is not an obstruction to a global definition of action-angle variables.
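
    For reference, the fold is the simplest elementary catastrophe. Its normal form (a textbook result, not the specific octic potential studied in the paper) reads:

```latex
V_a(x) = \tfrac{1}{3}x^{3} + a\,x, \qquad
V_a'(x) = x^{2} + a = 0 \;\Longrightarrow\; x_{\pm} = \pm\sqrt{-a} \quad (a < 0).
```

    For a < 0 a stable minimum and an unstable maximum coexist; they merge at a = 0 and disappear for a > 0, which is the local mechanism behind the appearance or breaking of a sheet in the image of the energy-momentum map.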

  15. Uncertainty Quantification of Composite Laminate Damage with the Generalized Information Theory

    Energy Technology Data Exchange (ETDEWEB)

    J. Lucero; F. Hemez; T. Ross; K. Kline; J. Hundhausen; T. Tippetts

    2006-05-01

    This work presents a survey of five theories to assess the uncertainty of projectile impact induced damage on multi-layered carbon-epoxy composite plates. Because the types of uncertainty dealt with in this application are multiple (variability, ambiguity, and conflict) and because the data sets collected are sparse, characterizing the amount of delamination damage with probability theory alone is possible but incomplete. This motivates the exploration of methods contained within a broad Generalized Information Theory (GIT) that rely on less restrictive assumptions than probability theory. Probability, fuzzy sets, possibility, and imprecise probability (probability boxes (p-boxes) and Dempster-Shafer) are used to assess the uncertainty in composite plate damage. Furthermore, this work highlights the usefulness of each theory. The purpose of the study is not to compare directly the different GIT methods but to show that they can be deployed on a practical application and to compare the assumptions upon which these theories are based. The data sets consist of experimental measurements and finite element predictions of the amount of delamination and fiber splitting damage as multilayered composite plates are impacted by a projectile at various velocities. The physical experiments consist of using a gas gun to impact suspended plates with a projectile accelerated to prescribed velocities, then, taking ultrasound images of the resulting delamination. The nonlinear, multiple length-scale numerical simulations couple local crack propagation implemented through cohesive zone modeling to global stress-displacement finite element analysis. The assessment of damage uncertainty is performed in three steps by, first, considering the test data only; then, considering the simulation data only; finally, performing an assessment of total uncertainty where test and simulation data sets are combined. This study leads to practical recommendations for reducing the uncertainty and
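
    Of the theories surveyed, Dempster-Shafer evidence theory is perhaps the least familiar. A minimal sketch of Dempster's rule of combination for two sources of evidence about a damage level (the frame and the mass values are invented for illustration):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for mass functions with frozenset focal elements;
    mass on empty intersections (conflict) is renormalised away."""
    combined, conflict = {}, 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

LOW, HIGH = frozenset({"low"}), frozenset({"high"})
EITHER = LOW | HIGH                            # ignorance: mass on the whole frame

m_test = {LOW: 0.6, EITHER: 0.4}               # evidence from test data
m_sim = {LOW: 0.3, HIGH: 0.3, EITHER: 0.4}     # evidence from simulation

print(dempster_combine(m_test, m_sim))
# -> mass ~0.66 on "low", ~0.15 on "high", ~0.20 still uncommitted
```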

  16. Modeling multibody systems with uncertainties. Part II: Numerical applications

    Energy Technology Data Exchange (ETDEWEB)

    Sandu, Corina, E-mail: csandu@vt.edu; Sandu, Adrian; Ahmadian, Mehdi [Virginia Polytechnic Institute and State University, Mechanical Engineering Department (United States)

    2006-04-15

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.

  17. Modeling multibody systems with uncertainties. Part II: Numerical applications

    International Nuclear Information System (INIS)

    Sandu, Corina; Sandu, Adrian; Ahmadian, Mehdi

    2006-01-01

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
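
    As a concrete miniature of the projection idea behind polynomial chaos, the sketch below builds a one-dimensional Hermite expansion of a toy nonlinear response by Gauss quadrature and checks its mean and variance against Monte Carlo (the response function is invented; the multibody models in the records are far richer):

```python
import numpy as np
from numpy.polynomial import hermite_e as H
from math import factorial, sqrt, pi

def f(xi):
    # Toy nonlinear response driven by a standard normal input
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Project f onto probabilists' Hermite polynomials He_k using Gauss
# quadrature for the weight exp(-x^2/2); <He_k^2> = k! under N(0,1).
order, nquad = 6, 40
x, w = H.hermegauss(nquad)
w = w / sqrt(2.0 * pi)                   # weights now integrate the N(0,1) pdf

coeffs = [np.sum(w * f(x) * H.hermeval(x, [0] * k + [1])) / factorial(k)
          for k in range(order + 1)]

mean_pce = coeffs[0]
var_pce = sum(c**2 * factorial(k) for k, c in enumerate(coeffs) if k > 0)

xi = np.random.default_rng(0).standard_normal(1_000_000)   # MC reference
print(f"mean: PCE {mean_pce:.4f} vs MC {f(xi).mean():.4f}")
print(f"var : PCE {var_pce:.4f} vs MC {f(xi).var():.4f}")
```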

  18. Co-delivery of paclitaxel and cetuximab by nanodiamond enhances mitotic catastrophe and tumor inhibition.

    Science.gov (United States)

    Lin, Yu-Wei; Raj, Emmanuel Naveen; Liao, Wei-Siang; Lin, Johnson; Liu, Kuang-Kai; Chen, Ting-Hua; Cheng, Hsiao-Chun; Wang, Chi-Ching; Li, Lily Yi; Chen, Chinpiao; Chao, Jui-I

    2017-08-29

    The poor intracellular uptake and non-specific binding of anticancer drugs in cancer cells are bottlenecks in cancer therapy. Nanocarrier platforms provide opportunities to improve drug efficacy. Here we show that a carbon-based nanomaterial, nanodiamond (ND), carrying paclitaxel (PTX), a microtubule inhibitor, and cetuximab (Cet), a specific monoclonal antibody against epidermal growth factor receptor (EGFR), induces mitotic catastrophe and tumor inhibition in human colorectal cancer (CRC). ND-PTX blocked mitotic progression and chromosomal separation and induced apoptosis in CRC cells, whereas NDs alone did not induce these effects. Conjugation of ND-PTX with Cet (ND-PTX-Cet) bound specifically to EGFR-positive CRC cells and enhanced mitotic catastrophe and apoptosis induction. Moreover, ND-PTX-Cet markedly decreased tumor size in xenografts of EGFR-expressing human CRC tumors in nude mice. ND-PTX-Cet also induced the mitotic marker protein phospho-histone 3 (Ser10) and the apoptotic protein active caspase-3, consistent with mitotic catastrophe and apoptosis. Taken together, this study demonstrates that the co-delivery of PTX and Cet by ND enhanced mitotic catastrophe and apoptosis in vitro and in vivo, which may be applied in human CRC therapy.

  19. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Biros, George [Univ. of Texas, Austin, TX (United States)

    2018-01-12

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a
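
    At the heart of such Bayesian inversion is sampling an unnormalised posterior. A minimal random-walk Metropolis sampler on a toy log-posterior shows the basic pattern; extreme-scale problems replace `log_post` with a PDE-constrained likelihood evaluation and the random walk with the structure-exploiting proposals described above:

```python
import numpy as np

rng = np.random.default_rng(4)

def log_post(theta):
    # Toy unnormalised log-posterior (standard normal)
    return -0.5 * np.sum(theta**2)

def random_walk_metropolis(n_samples, dim=2, step=0.5):
    chain = np.empty((n_samples, dim))
    theta = np.zeros(dim)
    lp = log_post(theta)
    for i in range(n_samples):
        prop = theta + step * rng.standard_normal(dim)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

chain = random_walk_metropolis(50_000)
print("posterior mean ~", chain[5_000:].mean(axis=0).round(3))  # ~ [0, 0]
```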

  20. Uncertainty of simulated groundwater levels arising from stochastic transient climate change scenarios

    Science.gov (United States)

    Goderniaux, Pascal; Brouyère, Serge; Blenkinsop, Stephen; Burton, Aidan; Fowler, Hayley; Dassargues, Alain

    2010-05-01

    applied not only to the mean of climatic variables, but also across the statistical distributions of these variables. This is important as these distributions are expected to change in the future, with more extreme rainfall events, separated by longer dry periods. (2) The novel approach used in this study can simulate transient climate change from 2010 to 2085, rather than time series representative of a stationary climate for the period 2071-2100. (3) The weather generator is used to generate a large number of equiprobable climate change scenarios for each RCM, representative of the natural variability of the weather. All of these scenarios are applied as input to the Geer basin model to assess the projected impact of climate change on groundwater levels, the uncertainty arising for different RCM projections and the uncertainty linked to natural climatic variability. Using the output results from all scenarios, 95% confidence intervals are calculated for each year and month between 2010 and 2085. The climate change scenarios for the Geer basin model predict hotter and drier summers and warmer and wetter winters. Considering the results of this study, it is very likely that groundwater levels and surface flow rates in the Geer basin will decrease by the end of the century. This is of concern because it also means that groundwater quantities available for abstraction will also decrease. However, this study also shows that the uncertainty of these projections is relatively large compared to the projected changes so that it remains difficult to confidently determine the magnitude of the decrease. The use and combination of an integrated surface - subsurface model and stochastic climate change scenarios has never been used in previous climate change impact studies on groundwater resources. It constitutes an innovation and is an important tool for helping water managers to take decisions.
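
    The per-year confidence intervals described here are straightforward to obtain once the equiprobable scenario runs exist: stack the ensemble and take percentiles per time step. A sketch with placeholder numbers standing in for the Geer basin simulations:

```python
import numpy as np

rng = np.random.default_rng(7)

# Placeholder ensemble: 30 equiprobable scenarios x 76 simulated years of
# mean groundwater level (a random declining trend stands in for the model)
years = np.arange(2010, 2086)
ensemble = 55.0 - 0.02 * (years - 2010) + rng.normal(0, 0.5, (30, years.size))

lower, median, upper = np.percentile(ensemble, [2.5, 50, 97.5], axis=0)

for y, lo, md, up in list(zip(years, lower, median, upper))[::25]:
    print(f"{y}: {md:.2f} m (95% band {lo:.2f}-{up:.2f})")
```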

  1. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  2. The shareholder wealth effects of insurance securitization: preliminary evidence from the catastrophe bond market

    OpenAIRE

    Hagendorff, Bjoern; Hagendorff, Jens; Keasey, Kevin

    2013-01-01

    Insurance securitization has long been hailed as an important tool to increase the underwriting capacity of companies exposed to catastrophe-related risks. However, global volumes of insurance securitization have remained surprisingly low to date, which raises questions over its benefits. In this paper, we examine changes in the market value of insurance and reinsurance firms which announce their engagement in insurance securitization by issuing catastrophe (Cat) bonds. Consistent with the hithert...

  3. Catastrophic disruptions as the origin of bilobate comets

    Science.gov (United States)

    Schwartz, Stephen R.; Michel, Patrick; Jutzi, Martin; Marchi, Simone; Zhang, Yun; Richardson, Derek C.

    2018-05-01

    Several comets observed at close range have bilobate shapes [1], including comet 67P/Churyumov-Gerasimenko (67P/C-G), which was imaged by the European Space Agency's Rosetta mission [2,3]. Bilobate comets are thought to be primordial because they are rich in supervolatiles (for example, N2 and CO) and have a low bulk density, which implies that their formation requires a very low-speed accretion of two bodies. However, slow accretion does not only occur during the primordial phase of the Solar System; it can also occur at later epochs as part of the reaccumulation process resulting from the collisional disruption of a larger body [4], so this cannot directly constrain the age of bilobate comets. Here, we show by numerical simulation that 67P/C-G and other elongated or bilobate comets can be formed in the wake of catastrophic collisional disruptions of larger bodies while maintaining their volatiles and low density throughout the process. Since this process can occur at any epoch of our Solar System's history, from early on through to the present day [5], there is no need for these objects to be formed primordially. These findings indicate that observed prominent geological features, such as pits and stratified surface layers [4,5], may not be primordial.

  4. Investigation for integration of the German Public Health Service in catastrophe and disaster prevention programs in Germany

    International Nuclear Information System (INIS)

    Pfenninger, E.; Koenig, S.; Himmelseher, S.

    2004-01-01

    This research project aimed at investigating the integration of the GPHS into the plans for civil defence and protection as well as catastrophe prevention of the Federal Republic of Germany. Following a comprehensive analysis of the current situation, potential proposals for an improved integrative approach will be presented. In view of the lack of topics relevant to medical care in disaster medicine in educational curricula and training programs for medical students and in postgraduate board programs for public health physicians, a working group of the Civil Protection Board of the German Federal Ministry of the Interior complained as early as 1999, in its 'Report on execution of legal rules for protection and rescue of human life as well as restitution of public health after disaster', that the integration of the GPHS into catastrophe and disaster prevention programs had been insufficiently resolved. Point by point, our project analysed the following issues:
    - Legislative acts for integration of the German Public Health Service into medical care in catastrophes and disasters to protect the civilian population of Germany, and their implementation and execution.
    - Administrative rules and directives at state and district levels that relate to the integration of the German Public Health Service into preparedness programs for catastrophe prevention and management, and their implementation and execution.
    - Education and postgraduate training options for physicians and non-physician employees of the German Public Health Service to prepare for medical care in catastrophes and disasters.
    - State of knowledge and experience of German Public Health Service personnel in emergency and disaster medicine.
    - Evaluation of the German administrative catastrophe prevention authorities with regard to their integration of the German Public Health Service into preparedness programs for catastrophe prevention and management.
    - Development of a concept to remedy the

  5. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed on an IFA-650.5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)

  6. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Although regulatory agencies have shown a special interest in incorporating best-estimate approaches into the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped in identifying the key parameters with the highest correlation indices, including the peak cladding temperature and the time required for cladding failure. The uncertainty analysis was performed on an IFA-650.5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
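
    The workflow in both versions of this record (sample the uncertain inputs, run the code once per sample, then rank inputs by correlation with the output) can be sketched generically. Below, a placeholder algebraic response stands in for a FRAPTRAN run, and the input ranges are invented:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)

def latin_hypercube(n, d, rng):
    """Stratified uniform(0,1) sample: one point per stratum per dimension."""
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n
    for j in range(d):
        u[:, j] = rng.permutation(u[:, j])
    return u

n = 500
u = latin_hypercube(n, 3, rng)
gap = 0.08 + 0.04 * u[:, 0]       # mm  (stand-in for pellet-clad gap)
clad = 0.57 + 0.06 * u[:, 1]      # mm  (stand-in for clad thickness)
power = 0.95 + 0.10 * u[:, 2]     # -   (stand-in for power uncertainty)

# Placeholder response in place of a fuel-code run for each sample:
pct = 600 + 1500 * power * gap / clad + rng.normal(0, 5, n)  # peak clad temp, C

for name, x in [("gap", gap), ("clad", clad), ("power", power)]:
    rho, _ = spearmanr(x, pct)
    print(f"rank correlation {name} vs PCT: {rho:+.2f}")
```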

  7. Child pain catastrophizing mediates the relation between parent responses to pain and disability in youth with functional abdominal pain.

    Science.gov (United States)

    Cunningham, Natoshia R; Lynch-Jordan, Anne; Barnett, Kimberly; Peugh, James; Sil, Soumitri; Goldschneider, Kenneth; Kashikar-Zuck, Susmita

    2014-12-01

    Functional abdominal pain (FAP) in youth is associated with substantial impairment in functioning, and prior research has shown that overprotective parent responses can heighten impairment. Little is known about how a range of parental behaviors (overprotection, minimizing, and/or encouragement) in response to their child's pain interact with child coping characteristics (eg, catastrophizing) to influence functioning in youth with FAP. In this study, it was hypothesized that the relation between parenting factors and child disability would be mediated by children's levels of maladaptive coping (ie, pain catastrophizing). Seventy-five patients with FAP presenting to a pediatric pain clinic and their caregivers participated in the study. Youth completed measures of pain intensity (Numeric Rating Scale), pain catastrophizing (Pain Catastrophizing Scale), and disability (Functional Disability Inventory). Caregivers completed measures of parent pain catastrophizing (Pain Catastrophizing Scale), and parent responses to child pain behaviors (Adult Responses to Child Symptoms: Protection, Minimizing, and Encouragement/Monitoring subscales). Increased functional disability was significantly related to higher child pain intensity, increased child and parent pain catastrophizing, and higher levels of encouragement/monitoring and protection. Parent minimization was not related to disability. Child pain catastrophizing fully mediated the relation between parent encouragement/monitoring and disability and partially mediated the relation between parent protectiveness and disability. The impact of parenting behaviors in response to FAP on child disability is determined, in part, by the child's coping style. Findings highlight a more nuanced understanding of the parent-child interaction in determining pain-related disability levels, which should be taken into consideration in assessing and treating youth with FAP.
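
    The mediation claims here (e.g., protectiveness -> catastrophizing -> disability) are commonly tested with a product-of-coefficients indirect effect and a bootstrap confidence interval; the study's exact procedure is not given in this record. A sketch on synthetic stand-in data:

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic stand-ins: X = parent protectiveness, M = child catastrophizing,
# Y = functional disability (n = 75 as in the study; values illustrative)
n = 75
X = rng.normal(0, 1, n)
M = 0.5 * X + rng.normal(0, 1, n)
Y = 0.6 * M + 0.1 * X + rng.normal(0, 1, n)

def indirect_effect(X, M, Y):
    a = np.polyfit(X, M, 1)[0]                        # path a: X -> M
    design = np.column_stack([np.ones_like(X), X, M])
    b = np.linalg.lstsq(design, Y, rcond=None)[0][2]  # path b: M -> Y given X
    return a * b

boot = np.array([indirect_effect(X[i], M[i], Y[i])
                 for i in rng.integers(0, n, (5_000, n))])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {indirect_effect(X, M, Y):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```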

  8. A catastrophe in quantum mechanics

    International Nuclear Information System (INIS)

    Ignatovich, V.K.

    2004-01-01

    The standard scattering theory (SST) in nonrelativistic quantum mechanics (QM) is analyzed. Self-contradictions of SST are deconstructed. A direct way to calculate scattering probability without introduction of a finite volume is discussed. Substantiation of SST in textbooks with the help of wave packets is shown to be incomplete. A complete theory of wave packet scattering on a fixed center is presented, and its similarity to the plane wave scattering is demonstrated. The neutron scattering on a monatomic gas is investigated, and several problems are pointed out. A catastrophic ambiguity of the cross section is revealed, and a way to resolve this ambiguity is discussed

  9. Uncertainty analysis of a coupled ecosystem response model simulating greenhouse gas fluxes from a temperate grassland

    Science.gov (United States)

    Liebermann, Ralf; Kraft, Philipp; Houska, Tobias; Breuer, Lutz; Müller, Christoph; Kraus, David; Haas, Edwin; Klatt, Steffen

    2015-04-01

    Among anthropogenic greenhouse gas emissions, CO2 is the dominant driver of global climate change. Next to its direct impact on the radiation budget, it also affects the climate system by triggering feedback mechanisms in terrestrial ecosystems. Such mechanisms - like stimulated photosynthesis, increased root exudations and reduced stomatal transpiration - influence both the input and the turnover of carbon and nitrogen compounds in the soil. The stabilization and decomposition of these compounds determines how increasing CO2 concentrations change the terrestrial trace gas emissions, especially CO2, N2O and CH4. To assess the potential reaction of terrestrial greenhouse gas emissions to rising tropospheric CO2 concentration, we make use of a comprehensive ecosystem model integrating known processes and fluxes of the carbon-nitrogen cycle in soil, vegetation and water. We apply a state-of-the-art ecosystem model with measurements from a long-term field experiment of CO2 enrichment. The model - a grassland realization of LandscapeDNDC - simulates soil chemistry coupled with plant physiology, microclimate and hydrology. The data - comprising biomass, greenhouse gas emissions, management practices and soil properties - have been obtained from a FACE (Free Air Carbon dioxide Enrichment) experiment running since 1997 on a temperate grassland in Giessen, Germany. Management and soil data, together with weather records, are used to drive the model, while cut biomass as well as CO2 and N2O emissions are used for calibration and validation. Starting with control data from installations without CO2 enhancement, we begin with a GLUE (Generalized Likelihood Uncertainty Estimation) assessment using Latin hypercube sampling to reduce the range of the model parameters. This is followed by a detailed sensitivity analysis, the application of DREAM-ZS for model calibration, and an estimation of the effect of input uncertainty on the simulation results. Since first results indicate problems with
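
    The GLUE step mentioned above is easy to sketch in miniature: sample parameter sets (here from uniform priors), score each run with an informal likelihood, and keep only the "behavioural" runs. A two-parameter placeholder model stands in for LandscapeDNDC, and the threshold is arbitrary:

```python
import numpy as np

rng = np.random.default_rng(11)

def model(theta, t):
    a, b = theta                    # placeholder 2-parameter model
    return a * np.exp(-b * t)

t_obs = np.linspace(0, 10, 20)
obs = model((3.0, 0.4), t_obs) + rng.normal(0, 0.1, t_obs.size)

n = 20_000
thetas = np.column_stack([rng.uniform(1, 5, n), rng.uniform(0.1, 1.0, n)])
sse = np.array([np.sum((model(th, t_obs) - obs) ** 2) for th in thetas])
likelihood = np.exp(-sse / sse.min())       # informal GLUE likelihood measure
behavioural = likelihood > 0.05             # arbitrary acceptance threshold

w = likelihood[behavioural] / likelihood[behavioural].sum()
post_mean = (thetas[behavioural] * w[:, None]).sum(axis=0)
print(f"{behavioural.sum()} behavioural runs, weighted mean theta = {post_mean.round(2)}")
```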

  10. 'spup' - An R package for uncertainty propagation in spatial environmental modelling

    NARCIS (Netherlands)

    Sawicka, K.; Heuvelink, G.B.M.

    2016-01-01

    Computer models are crucial tools in engineering and environmental sciences for simulating the behaviour of complex systems. While many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty analysis

  11. Pain Catastrophizing in Borderline Morbidly Obese and Morbidly Obese Individuals with Osteoarthritic Knee Pain

    Directory of Open Access Journals (Sweden)

    Tamara J Somers

    2008-01-01

    Full Text Available OBJECTIVE: There is limited information about how morbidly obese osteoarthritis (OA) patients cope with the pain they experience. Pain catastrophizing is an important predictor of pain and adjustment in persons with persistent pain. This may be particularly relevant in the morbidly obese (body mass index [BMI] of 40 kg/m² or greater) OA population at risk for increased pain. The present study first examined whether borderline morbidly obese and morbidly obese OA patients report higher levels of pain catastrophizing than a sample of OA patients in the overweight and obese category (BMI between 25 kg/m² and 34 kg/m²). Next, it examined how pain catastrophizing is related to important indexes of pain and adjustment in borderline morbidly obese and morbidly obese OA patients.

  12. The Application of Catastrophe Theory to Medical Image Analysis

    NARCIS (Netherlands)

    Kuijper, Arjan; Florack, L.M.J.

    2001-01-01

    In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of

  13. Predictive uncertainty in auditory sequence processing

    Directory of Open Access Journals (Sweden)

    Niels Chr. Hansen

    2014-09-01

    Full Text Available Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty - a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.

  14. Predictive uncertainty in auditory sequence processing.

    Science.gov (United States)

    Hansen, Niels Chr; Pearce, Marcus T

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty-a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music.
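
    The entropy model of predictive uncertainty used in both versions of this record reduces, for a single context, to Shannon entropy over the model's next-event distribution. A minimal sketch with invented continuation probabilities:

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (bits) of a next-event probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Next-note distributions as a variable-order Markov model might output them
low_entropy_context = [0.85, 0.05, 0.05, 0.05]    # one continuation dominates
high_entropy_context = [0.25, 0.25, 0.25, 0.25]   # maximal uncertainty

print(f"low-entropy context : {entropy_bits(low_entropy_context):.2f} bits")
print(f"high-entropy context: {entropy_bits(high_entropy_context):.2f} bits")
```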

  15. Managing uncertainty in multiple-criteria decision making related to sustainability assessment

    DEFF Research Database (Denmark)

    Dorini, Gianluca Fabio; Kapelan, Zoran; Azapagic, Adisa

    2011-01-01

    In real life, decisions are usually made by comparing different options with respect to several, often conflicting criteria. This requires subjective judgements on the importance of different criteria by DMs and increases uncertainty in decision making. This article demonstrates how uncertainty can be handled in multi-criteria decision situations using Compromise Programming, one of the Multi-criteria Decision Analysis (MCDA) techniques. Uncertainty is characterised using a probabilistic approach and propagated using a Monte Carlo simulation technique. The methodological approach is illustrated for three situations: (1) no uncertainty, (2) uncertainty in data/models and (3) uncertainty in models and decision-makers' preferences. The results show how characterising and propagating uncertainty can help increase the effectiveness of multi-criteria decision making processes and lead to more informed decisions.
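
    A compact version of the approach: score each option by its compromise-programming distance to the ideal point, and propagate uncertainty by Monte Carlo sampling of the criteria scores. All numbers below are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Three options scored on two criteria (higher is better); the scores are
# uncertain and modelled here as normal distributions (illustrative)
means = np.array([[0.8, 0.4], [0.6, 0.7], [0.5, 0.9]])
weights = np.array([0.5, 0.5])

def cp_distance(scores, weights, p=2):
    """Compromise-programming L_p distance of each option to the ideal point."""
    ideal, worst = scores.max(axis=0), scores.min(axis=0)
    span = np.where(ideal > worst, ideal - worst, 1.0)
    norm = (ideal - scores) / span
    return ((weights * norm) ** p).sum(axis=1) ** (1.0 / p)

wins = np.zeros(len(means))
for _ in range(10_000):
    sample = means + rng.normal(0, 0.08, means.shape)  # propagate uncertainty
    wins[np.argmin(cp_distance(sample, weights))] += 1

print("P(option is the best compromise):", (wins / wins.sum()).round(3))
```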

  16. Catastrophe Optics Method to Determine the Micro-Nano Size Profiles at TPL of Liquid Films on a Solid Surface

    Science.gov (United States)

    Chao, David F.; McQuillen, J. B.; Sankovic, J. M.; Zhang, Nengli

    2009-01-01

    As discovered by recent studies, what directly affects wetting and spreading is the curvature in the micro-region rather than the macroscopic contact angle, so measuring the profile of the micro-region has become an important research topic. Recently, catastrophe optics has been applied to this kind of measurement. Optical catastrophes occurring in the far field of a liquid-refracted laser beam carry a wealth of information about liquid spreading, not only for liquid drops but also for films. When a parallel laser beam passes through a liquid film on a slide glass at the three-phase line (TPL), very interesting optical image patterns occur on a screen far from the film. An analysis based on catastrophe optics discloses and interprets the formation of these optical image patterns. The analysis reveals that the caustic line, manifested as the bright-thick line on the screen, corresponds to the lowest hierarchy of optical catastrophes, the fold caustic. This optical catastrophe is produced by the inflexion line on the liquid surface at the liquid foot, which is formed not only in the spreading of drops but also in the spreading of films. The generalized catastrophe optics method makes it possible to identify the edge profiles and determine the edge foot height of liquid films. Keywords: Crossover region, Inflexion line, Liquid edge foot, Catastrophe optics, Caustic and diffraction

  17. Multi-data reservoir history matching and uncertainty quantification framework

    KAUST Repository

    Katterbauer, Klemens

    2015-11-26

    A multi-data reservoir history matching and uncertainty quantification framework is provided. The framework can utilize multiple data sets such as production, seismic, electromagnetic, gravimetric and surface deformation data for improving the history matching process. The framework can consist of a geological model that is interfaced with a reservoir simulator. The reservoir simulator can interface with seismic, electromagnetic, gravimetric and surface deformation modules to predict the corresponding observations. The observations can then be incorporated into a recursive filter that subsequently updates the model state and parameter distributions, providing a general framework to quantify, and eventually reduce with the data, the uncertainty in the estimated reservoir state and parameters.
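
    The record does not name the recursive filter; an ensemble Kalman filter is a common choice for exactly this kind of multi-data history matching. A minimal stochastic-EnKF analysis step on a toy state vector (the observation operator and sizes are invented):

```python
import numpy as np

def enkf_update(ensemble, H, obs, obs_cov, rng):
    """Stochastic ensemble Kalman filter analysis step.
    ensemble: (n_state, n_ens) forecast members
    H:        (n_obs, n_state) linear observation operator
    obs:      (n_obs,) observed data (production, seismic attributes, ...)"""
    n_obs, n_ens = len(obs), ensemble.shape[1]
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    HX = H @ ensemble
    HXp = HX - HX.mean(axis=1, keepdims=True)
    Pxy = X @ HXp.T / (n_ens - 1)                 # state-observation covariance
    Pyy = HXp @ HXp.T / (n_ens - 1) + obs_cov     # innovation covariance
    K = Pxy @ np.linalg.inv(Pyy)                  # Kalman gain
    perturbed = obs[:, None] + rng.multivariate_normal(
        np.zeros(n_obs), obs_cov, size=n_ens).T   # perturbed observations
    return ensemble + K @ (perturbed - HX)

rng = np.random.default_rng(2)
ens = rng.normal(10.0, 2.0, (4, 50))   # toy "reservoir" state ensemble
H = np.eye(2, 4)                        # observe the first two state variables
obs = np.array([12.0, 9.0])
R = 0.25 * np.eye(2)
updated = enkf_update(ens, H, obs, R, rng)
print("prior mean    :", ens.mean(axis=1).round(2))
print("posterior mean:", updated.mean(axis=1).round(2))
```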

  18. Imaging findings in the rare catastrophic variant of the primary antiphospholipid syndrome

    Energy Technology Data Exchange (ETDEWEB)

    Thuerl, Christina; Altehoefer, Carsten; Laubenberger, Joerg [Freiburg Univ. (Germany). Abt. Radiologie; Spyridonidis, Alexandros [Freiburg Univ. (DE). Abt. Innere Medizin 1 (Haematologie und Onkologie)

    2002-03-01

    We report imaging findings in a case of the rare catastrophic variant of antiphospholipid syndrome (CAPS) characterized by widespread microvascular occlusions, which may lead to multiple organ failure. We present a case of a 66-year-old woman with bone marrow necrosis, acute acalculous cholecystitis (AAC), focal liver necrosis, subtle patchy splenic infarctions, and bilateral adrenal infarction. The demonstration of multiple microvascular organ involvement (three or more) is crucial for the diagnosis of the catastrophic variant of APS. This can be performed radiologically intra-vitam. Imaging can even reveal subclinical microinfarctions, which are often only diagnosed at autopsy. (orig.)

  19. Imaging findings in the rare catastrophic variant of the primary antiphospholipid syndrome

    International Nuclear Information System (INIS)

    Thuerl, Christina; Altehoefer, Carsten; Laubenberger, Joerg

    2002-01-01

    We report imaging findings in a case of the rare catastrophic variant of antiphospholipid syndrome (CAPS) characterized by widespread microvascular occlusions, which may lead to multiple organ failure. We present a case of a 66-year-old woman with bone marrow necrosis, acute acalculous cholecystitis (AAC), focal liver necrosis, subtle patchy splenic infarctions, and bilateral adrenal infarction. The demonstration of multiple microvascular organ involvement (three or more) is crucial for the diagnosis of the catastrophic variant of APS. This can be performed radiologically intra-vitam. Imaging can even reveal subclinical microinfarctions, which are often only diagnosed at autopsy. (orig.)

  20. Self-Efficacy for Pain Communication Moderates the Relation Between Ambivalence Over Emotional Expression and Pain Catastrophizing Among Patients With Osteoarthritis.

    Science.gov (United States)

    Van Denburg, Alyssa N; Shelby, Rebecca A; Caldwell, David S; O'Sullivan, Madeline L; Keefe, Francis J

    2018-04-06

    Pain catastrophizing (ie, the tendency to focus on and magnify pain sensations and feel helpless in the face of pain) is one of the most important and consistent psychological predictors of the pain experience. The present study examined, in 60 patients with osteoarthritis pain who were married or partnered: 1) the degree to which ambivalence over emotional expression and negative network orientation were associated with pain catastrophizing, and 2) whether self-efficacy for pain communication moderated these relations. Hierarchical multiple linear regression analyses revealed a significant main effect for the association between ambivalence over emotional expression and pain catastrophizing; as ambivalence over emotional expression increased, the degree of pain catastrophizing increased. In addition, the interaction between ambivalence over emotional expression and self-efficacy for pain communication was significant, such that as self-efficacy for pain communication increased, the association between ambivalence over emotional expression and pain catastrophizing became weaker. Negative network orientation was not significantly associated with pain catastrophizing. Findings suggest that higher levels of self-efficacy for pain communication may help weaken the effects of ambivalence over emotional expression on pain catastrophizing. In light of these results, patients may benefit from interventions that target pain communication processes and emotion regulation. This article examines interpersonal processes involved in pain catastrophizing. This study has the potential to lead to better understanding of maladaptive pain coping strategies and possibly better prevention and treatment strategies.