Optimal natural resources management under uncertainty with catastrophic risk
Energy Technology Data Exchange (ETDEWEB)
Motoh, Tsujimura [Graduate School of Economics, Kyoto University, Yoshida-honmochi, Sakyo-ku, Kyoto 606-8501 (Japan)]
2004-05-01
We examine an optimal natural resources management problem under uncertainty with catastrophic risk and investigate the optimal rate of use of a natural resource. For this purpose, we use stochastic control theory. We assume that, until a catastrophic event occurs, the stock of the natural resource is governed by a stochastic differential equation. We describe the catastrophic phenomenon as a Poisson process. From this analysis, we show the optimal rate of use of the natural resource in explicit form. Furthermore, we present comparative static results for the optimal rate of use of the natural resource.
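The setup described in this abstract — a resource stock governed by a stochastic differential equation until a Poisson-timed catastrophe — can be sketched numerically. This is a minimal illustration, not the paper's actual model: the geometric-Brownian dynamics, the constant harvest rate, and all parameter values are assumptions made for the example.

```python
import math
import random

def simulate_stock(s0=100.0, mu=0.02, sigma=0.1, harvest=0.03,
                   lam=0.05, dt=0.01, t_max=50.0, seed=1):
    """Euler-Maruyama path of dS = (mu - harvest) * S dt + sigma * S dW,
    stopped at the first jump of a rate-`lam` Poisson process
    (the catastrophic event). All parameter values are illustrative."""
    rng = random.Random(seed)
    # time of the first Poisson event is Exponential(lam)-distributed
    t_cat = rng.expovariate(lam)
    s, t = s0, 0.0
    while t < min(t_cat, t_max) and s > 0.0:
        dw = rng.gauss(0.0, math.sqrt(dt))
        s += (mu - harvest) * s * dt + sigma * s * dw
        t += dt
    return t, max(s, 0.0)

t_end, s_end = simulate_stock()
```

In the paper's framework the harvest rate would itself be the control variable chosen optimally; here it is fixed only to show the dynamics.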
Sketching Uncertainty into Simulations.
Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E
2012-12-01
In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided has allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.
International Nuclear Information System (INIS)
Depres, B.; Dossantos-Uzarralde, P.
2009-01-01
More than 150 researchers and engineers from universities and industry met to discuss the new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.
The shape of uncertainty: underwriting decisions in the face of catastrophic risk
International Nuclear Information System (INIS)
Keykhah, M.
1998-01-01
This paper will explore how insurance and re-insurance underwriters price catastrophe risk from natural perils. It will first describe the theoretical nature of pricing risk, and outline studies of underwriting that propose analyzing decision making from a behavioral rather than a rational-choice perspective. The paper then argues that, in order to provide the appropriate context for probability (which is the focus of the studies on decision making under uncertainty), it may be helpful to look at the nature of choice within a market and organizational context. Moreover, the nature of probability itself is explored with a view to constructing a broader analysis. Finally, it will be argued that the causal framework of the underwriter, in addition to inductive reasoning, gives a shape to uncertainty. (author)
Uncertainty Quantification in Aerodynamics Simulations, Phase I
National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...
Cryer, Patricia
1988-01-01
Develops models for participants' behaviors in games, simulations, and workshops based on Catastrophe Theory and Herzberg's two-factor theory of motivation. Examples are given of how these models can be used, both for describing and understanding the behaviors of individuals, and for eliciting insights into why participants behave as they do. (11…
Penetration of n-hexadecane and water into wood under conditions simulating catastrophic floods
Ganna Baglayeva; Wayne S. Seames; Charles R. Frihart; Jane O'Dell; Evguenii I. Kozliak
2017-01-01
To simulate fuel oil spills occurring during catastrophic floods, short-term absorption of two chemicals, n-hexadecane (representative of semivolatile organic compounds in fuel oil) and water, into southern yellow pine was gravimetrically monitored as a function of time at ambient conditions. Different scenarios were run on the basis of (1) the...
Quantification of uncertainties of modeling and simulation
International Nuclear Information System (INIS)
Ma Zhibo; Yin Jianwei
2012-01-01
The principles of Modeling and Simulation (M and S) are interpreted through a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, with the aim of building a framework for quantifying the uncertainties of M and S. (authors)
Assessment of SFR Wire Wrap Simulation Uncertainties
Energy Technology Data Exchange (ETDEWEB)
Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-09-30
Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts on safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results.
Quantification of Uncertainty in Thermal Building Simulation
DEFF Research Database (Denmark)
Brohus, Henrik; Haghighat, F.; Frier, Christian
In order to quantify uncertainty in thermal building simulation, stochastic modelling is applied to a building model. An application of stochastic differential equations is presented in Part 1, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine...
Simulation of the catastrophic floods caused by extreme rainfall events - Uh River basin case study
Pekárová, Pavla; Halmová, Dana; Mitková, Veronika
2005-01-01
The extreme rainfall events in Central and Eastern Europe in August 2002 raised the question of how other basins would respond to such rainfall situations. Exploring this question helps in arranging, in advance, the activities needed in the basin to reduce the consequences of the assumed disaster. The aim of the study is to determine the reaction of the Uh River basin (Slovakia, Ukraine) to the simulated catastrophic rainfall events of August 2002. Two precipitation scenarios, sc1 and sc2, were created. Th...
Structural Uncertainty in Antarctic sea ice simulations
Schneider, D. P.
2016-12-01
The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out of sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) have a strong dependence on the mean state of and changes in the Antarctic sea ice cover. This problem is not unique to
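The anomaly construction mentioned in this abstract — removing the ensemble-mean trend, commonly interpreted as the forced response, to expose each member's internal variability — amounts to a one-line operation. The member trend values below are invented for illustration.

```python
def detrended_members(trends):
    """Remove the ensemble-mean trend (the estimated forced response)
    from each member's trend, leaving internal-variability anomalies."""
    mean = sum(trends) / len(trends)
    return [t - mean for t in trends]

# hypothetical sea-ice trends, all negative as in the biased ensemble
member_trends = [-0.8, -0.5, -1.2, -0.3, -0.9]
anoms = detrended_members(member_trends)
```

After subtracting the ensemble mean, some anomalies are positive even though every raw trend is negative, mirroring the abstract's observation that some members show an increase once the forced response is removed.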
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
Systematic uncertainties on Monte Carlo simulation of lead based ADS
International Nuclear Information System (INIS)
Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.
1999-01-01
Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainty, arising both from the nuclear data and from the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options for both the data and the methodology and evaluating the relevance of the different uncertainties. (author)
Uncertainty in simulating wheat yields under climate change
DEFF Research Database (Denmark)
Asseng, A; Ewert, F; Rosenzweig, C
2013-01-01
of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models...... than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi...
Uncertainties in the simulation of groundwater recharge at different scales
Directory of Open Access Journals (Sweden)
H. Bogena
2005-01-01
Digital spatial data always imply some kind of uncertainty. The source of this uncertainty can be found in their compilation as well as in the conceptual design, which yields a more or less exact abstraction of the real world depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties accumulate. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainties on the calculated groundwater recharge. Distributed input errors are determined for the parameters slope and aspect using a Monte Carlo approach. Land-cover classification uncertainties are analysed by using the conditional probabilities of a remote sensing classification procedure. The uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.
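The Gaussian (first-order) error propagation method credited in the abstract above can be sketched generically. The recharge function below is a deliberately crude water-balance stand-in, not the GROWA model, and the input values and standard deviations are invented.

```python
def propagate(f, x, sigmas, h=1e-6):
    """First-order Gaussian error propagation for independent inputs:
    var(f) ~ sum_i (df/dx_i)^2 * sigma_i^2, with the partial
    derivatives taken by central finite differences."""
    var = 0.0
    for i, s in enumerate(sigmas):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdx = (f(xp) - f(xm)) / (2 * h)
        var += (dfdx * s) ** 2
    return var ** 0.5

# toy recharge [mm/yr] = precipitation - evapotranspiration - runoff
recharge = lambda v: v[0] - v[1] - v[2]
sigma_r = propagate(recharge, [800.0, 450.0, 120.0], [40.0, 20.0, 10.0])
```

For this linear function the result reduces to the root-sum-square of the input standard deviations, which is consistent with the abstract's finding that the largest input uncertainty (precipitation, here) dominates the recharge error.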
Uncertainty in Simulating Wheat Yields Under Climate Change
Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.;
2013-01-01
Projections of climate change impacts on crop yields are inherently uncertain1. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate2. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models1,3 are difficult4. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
Indian Academy of Sciences (India)
To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle, and its feedback into climate, to be ...
The use of sequential indicator simulation to characterize geostatistical uncertainty
International Nuclear Information System (INIS)
Hansen, K.M.
1992-10-01
Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors such as the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds.
Uncertainty in Simulating Wheat Yields Under Climate Change
Energy Technology Data Exchange (ETDEWEB)
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.
2013-09-01
Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments1,2. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature3,4, while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized5. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas, and improved quantification of uncertainty through multi-model ensembles, are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.
Li, Sichen; Liao, Zhixian; Luo, Xiaoshu; Wei, Duqu; Jiang, Pinqun; Jiang, Qinghong
2018-02-01
The value of the output capacitance (C) should be carefully considered when designing a photovoltaic (PV) inverter, since it can distort the working state of the circuit, and the circuit produces nonlinear dynamic behavior. From Kirchhoff's laws and the characteristics of an ideal operational amplifier, a strict piecewise-linear state equation is obtained, and a circuit simulation model is constructed to study the system parameters (time, C) through the current passing through an inductor of inductance L and the voltage across the capacitor of capacitance C. The developed simulation model uses Runge-Kutta methods to solve the state equations. This study focuses on predicting faults of the circuit from the two aspects of harmonic distortion and simulation results. Moreover, the presented model is also used to study the working state of the system in the case of a load capacitance catastrophe. The nonlinear dynamic behaviors in the inverter are simulated and verified.
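A classic fourth-order Runge-Kutta integration of a linear circuit state equation, in the spirit of (but far simpler than) the simulation model described above, might look like this. The series-RLC system and its component values are illustrative assumptions, not the paper's PV inverter circuit.

```python
def rk4_step(f, y, t, h):
    """Classic fourth-order Runge-Kutta step for the system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

# state y = [i_L, v_C] for a series RLC driven by a constant source V
# (all component values are hypothetical placeholders)
L_H, C_F, R, V = 1e-3, 1e-6, 10.0, 5.0

def rlc(t, y):
    i_l, v_c = y
    return [(V - R * i_l - v_c) / L_H,  # L di/dt = V - R i - v_C
            i_l / C_F]                  # C dv/dt = i

y, t, h = [0.0, 0.0], 0.0, 1e-7
for _ in range(1000):
    y = rk4_step(rlc, y, t, h)
    t += h
```

With these values the circuit is underdamped, so after 100 microseconds the capacitor voltage overshoots the 5 V source on its way to steady state; sweeping C (as the paper does) changes the damping and hence the transient shape.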
Wave Energy Converter Annual Energy Production Uncertainty Using Simulations
Directory of Open Access Journals (Sweden)
Clayton E. Hiles
2016-09-01
Critical to evaluating the economic viability of a wave energy project are: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with that estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantifying wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for its calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of 2 WECs of different operating principle at 4 different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2%–20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
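The performance-matrix MAEP calculation, plus a Monte Carlo resampling of the wave climate, can be sketched as follows. The power matrix and sea-state occurrence counts are made-up illustrative numbers, and the bootstrap scheme is only one plausible reading of the approach described in the abstract, not the paper's exact method.

```python
import random

# WEC power matrix [kW]; rows = wave-height bins, cols = period bins (invented)
POWER = [[10, 20, 30],
         [20, 45, 60],
         [30, 70, 95]]
HOURS_PER_YEAR = 8766.0

def maep(occurrence):
    """MAEP [kWh] = sum_ij p_ij * P_ij * hours, where p_ij is the
    fraction of the year spent in sea state (i, j)."""
    total = sum(sum(row) for row in occurrence)
    return sum(occurrence[i][j] / total * POWER[i][j] * HOURS_PER_YEAR
               for i in range(3) for j in range(3))

def monte_carlo_maep(occurrence, n=200, seed=7):
    """Bootstrap the sea-state record to estimate MAEP variability."""
    rng = random.Random(seed)
    states = [(i, j) for i in range(3) for j in range(3)
              for _ in range(occurrence[i][j])]
    estimates = []
    for _ in range(n):
        occ = [[0] * 3 for _ in range(3)]
        for i, j in (rng.choice(states) for _ in range(len(states))):
            occ[i][j] += 1
        estimates.append(maep(occ))
    return estimates

occ = [[120, 300, 80], [200, 500, 150], [60, 180, 40]]
estimates = monte_carlo_maep(occ)
```

The spread of `estimates` plays the role of the paper's MAEP uncertainty; varying the length of the sea-state record changes that spread, matching the reported sensitivity to input data-set length.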
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2013-02-01
This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
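An ensemble simulation scheme of the kind the abstract describes can be illustrated with a toy linear-reservoir model. This is emphatically not HBV: the bucket model, the sampled parameter range, and the rainfall series are all assumptions made for the sketch.

```python
import random

def bucket_runoff(precip, k, storage=0.0):
    """Toy linear reservoir: each step, rainfall fills the store and a
    fraction k of the store drains as runoff."""
    flows = []
    for p in precip:
        storage += p
        q = k * storage
        storage -= q
        flows.append(q)
    return flows

def ensemble(precip, n=100, seed=3):
    """Sample the recession coefficient k uniformly to build an
    ensemble of runoff simulations (parameter uncertainty only)."""
    rng = random.Random(seed)
    return [bucket_runoff(precip, rng.uniform(0.1, 0.5)) for _ in range(n)]

rain = [5.0, 0.0, 12.0, 3.0, 0.0, 0.0]
runs = ensemble(rain)
band = [(min(q), max(q)) for q in zip(*runs)]  # per-step uncertainty band
```

The min-max band is the simplest visual a student could plot; the toolbox's ensemble scheme serves the same pedagogical purpose of showing how parameter uncertainty widens into simulation uncertainty.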
Measurement, simulation and uncertainty assessment of implant heating during MRI
International Nuclear Information System (INIS)
Neufeld, E; Kuehn, S; Kuster, N; Szekely, G
2009-01-01
The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.
DEFF Research Database (Denmark)
Kublitz, Anja
2013-01-01
to a distant past but takes place in the present. They use the term Nakba not only to refer to the catastrophe of 1948 but also to designate current catastrophes, such as the Danish Muhammad cartoons affair in 2005 and the Israeli invasion of Gaza in 2008. Through an analysis of the 60th commemoration...
Effect of monthly areal rainfall uncertainty on streamflow simulation
Ndiritu, J. G.; Mkhize, N.
2017-08-01
Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall, and rainfall uncertainty was therefore highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation and the respective average ranges using stochastic
Michel, P.; Benz, W.; Richardson, D. C.
2005-08-01
Recent simulations of asteroid break-ups, including both the fragmentation of the parent body and the gravitational interactions of the fragments, have made it possible to reproduce successfully the main properties of asteroid families formed in different regimes of impact energy. Here, using the same kind of simulations, we concentrate on a single regime of impact energy, the so-called catastrophic threshold, usually designated by Qcrit, which results in the escape of half of the target's mass. Considering a wide range of diameter values and two kinds of internal structures of the parent body, monolithic and pre-shattered, we analyse their potential influences on the value of Qcrit and on the collisional outcome, limited here to the fragment size and ejection speed distributions, which are the main outcome properties used by collisional models to study the evolution of the different populations of small bodies. For all the considered diameters and both internal structures of the parent body, we confirm that the process of gravitational reaccumulation is at the origin of the largest remnant's mass. We then find that, for a given diameter of the parent body, the impact energy corresponding to the catastrophic disruption threshold is highly dependent on the internal structure of the parent body. In particular, a pre-shattered parent body containing only damaged zones but no macroscopic voids is easier to disrupt than a monolithic one. Other kinds of internal properties that can also characterize small bodies in real populations will be investigated in a future work.
Uncertainty of input data for room acoustic simulations
DEFF Research Database (Denmark)
Jeong, Cheol-Ho; Marbjerg, Gerd; Brunskog, Jonas
2016-01-01
Although many room acoustic simulation models have been well established, simulation results will never be accurate with inaccurate and uncertain input data. This study addresses inappropriateness and uncertainty of input data for room acoustic simulations. Firstly, the random incidence absorption...... and scattering coefficients are insufficient when simulating highly non-diffuse rooms. More detailed information, such as the phase and angle dependence, can greatly improve the simulation results of pressure-based geometrical and wave-based models at frequencies well below the Schroeder frequency. Phase...... summarizes potential advanced absorption measurement techniques that can improve the quality of input data for room acoustic simulations. Lastly, plenty of uncertain input data are copied from unreliable sources. Software developers and users should be careful when spreading such uncertain input data. More...
Parameter Uncertainty on AGCM-simulated Tropical Cyclones
He, F.
2015-12-01
This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, illustrated here with the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. Comparison between the OAT and LHS ensemble runs shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
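The Latin-Hypercube Sampling step can be sketched in a few lines; the parameter count and ranges below are placeholders, not the actual CAM parameter values:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin-Hypercube Sampling: one stratum per sample in each dimension,
    a uniform draw within each stratum, and an independent permutation per
    dimension so strata are paired at random across parameters."""
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)  # shape (n_dims, 2)
    n_dims = bounds.shape[0]
    # row i initially sits in stratum i for every dimension...
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    # ...then each column is permuted independently
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_samples), j]
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# e.g. 8 perturbed parameters, each scaled to an illustrative range
bounds = [[0.5, 2.0]] * 8
samples = latin_hypercube(100, bounds, rng=0)
```

Unlike independent random sampling, every one of the 100 strata of each parameter is visited exactly once, which is what makes LHS efficient for small perturbed-physics ensembles.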
Michel, Patrick; Jutzi, M.; Richardson, D. C.; Benz, W.
2010-10-01
Asteroids of dark (e.g. C, D) taxonomic classes as well as Kuiper Belt objects and comets are believed to have high porosity, not only in the form of large voids but also in the form of micro-pores. The presence of such microscale porosity introduces additional physics in the impact process. We have enhanced our 3D SPH hydrocode, used to simulate catastrophic breakups, with a model of porosity [1] and validated it at small scale by comparison with impact experiments on pumice targets [2]. Our model is now ready to be applied to a large range of problems. In particular, accounting for the gravitational phase of an impact, we can study the formation of dark-type asteroid families, such as Veritas, and Kuiper-Belt families, such as Haumea. Recently we characterized for the first time the catastrophic impact energy threshold, usually called Q*D, as a function of the target's diameter, porosity, material strength and impact speed [3]. Regarding the mentioned families, our preliminary results show that accounting for porosity leads to different outcomes that may better represent their properties and constrain their definition. In particular, for Veritas, we find that its membership may need some revision [4]. The parameter space is still large, many interesting families need to be investigated and our model will be applied to a large range of cases. PM, MJ and DCR acknowledge financial support from the French Programme National de Planétologie, NASA PG&G "Small Bodies and Planetary Collisions" and NASA under Grant No. NNX08AM39G issued through the Office of Space Science, respectively. [1] Jutzi et al. 2008. Icarus 198, 242-255; [2] Jutzi et al. 2009. Icarus 201, 802-813; [3] Jutzi et al. 2010. Fragment properties at the catastrophic disruption threshold: The effect of the parent body's internal structure, Icarus 207, 54-65; [4] Michel et al. 2010. Icarus, submitted.
Quantifying chemical uncertainties in simulations of the ISM
Glover, Simon
2018-06-01
The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.
Uncertainty quantification in ion–solid interaction simulations
Energy Technology Data Exchange (ETDEWEB)
Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von
2017-02-15
Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but provides simultaneously a quantitative measure for which combinations of inputs have the most important impact on the result. It is applied to SDTRIM-simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition) and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
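A minimal non-intrusive polynomial chaos expansion of the kind described, reduced to a single Gaussian input for illustration (the test function is an assumption; SDTRIM itself is not called):

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def pce_coefficients(f, degree, quad_order=40):
    """Project f(X), X ~ N(0,1), onto probabilists' Hermite polynomials:
    c_k = E[f(X) He_k(X)] / k!, evaluated by Gauss-Hermite quadrature."""
    x, w = hermegauss(quad_order)        # nodes/weights for exp(-x^2/2)
    norm = sqrt(2 * pi)                   # integral of the weight function
    coeffs = []
    for k in range(degree + 1):
        basis = hermeval(x, [0] * k + [1])  # He_k at the quadrature nodes
        # division by k! = E[He_k(X)^2] normalizes the projection
        coeffs.append(np.sum(w * f(x) * basis) / (norm * factorial(k)))
    return coeffs

# illustrative response surface: f(x) = x^2
c = pce_coefficients(lambda x: x**2, degree=4)
mean = c[0]                               # PCE mean is the 0th coefficient
var = sum(c[k]**2 * factorial(k) for k in range(1, len(c)))
```

The same quadrature-based projection, done on a sparse grid over several inputs, is what gives the spectral surrogate its two benefits: far fewer code evaluations than Monte Carlo, and variance contributions (Sobol'-type sensitivities) read off directly from the coefficients.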
Budyko, Mikhail
1999-05-01
Climate catastrophes, which occurred many times in the geological past, caused the extinction of large or small populations of animals and plants. Changes in the terrestrial and marine biota caused by the catastrophic climate changes undoubtedly resulted in considerable fluctuations in the global carbon cycle and atmospheric gas composition. Primarily, carbon dioxide and other greenhouse gas contents were affected. The study of these catastrophes supports the conclusion that the climate system is very sensitive to relatively small changes in climate-forcing factors (transparency of the atmosphere, changes in large glaciations, etc.). It is important to take this conclusion into account when estimating the possible consequences of the anthropogenic warming now occurring, caused by the increase in greenhouse gas concentrations in the atmosphere.
Uncertainty analysis of NDA waste measurements using computer simulations
International Nuclear Information System (INIS)
Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.
2000-01-01
Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values give the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
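Step (4) of the outline, the regression models for bias and precision, can be sketched on synthetic data; the bias factor and mass-dependent noise model below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 'gold standard' masses and NDA measurements with a
# multiplicative bias and mass-dependent noise (illustrative values only).
true_mass = rng.uniform(1.0, 50.0, size=200)
measured = 1.1 * true_mass + rng.normal(0.0, 0.05 * true_mass)

# (4a) bias model: regress the measurements on the gold-standard values;
# a fitted slope above 1 indicates a proportional over-report.
slope, intercept = np.polyfit(true_mass, measured, deg=1)

# (4b) precision model: regress |residual| on mass to capture the
# mass-dependent spread of the measurement errors.
resid = measured - (slope * true_mass + intercept)
prec_slope, prec_icept = np.polyfit(true_mass, np.abs(resid), deg=1)
```

Replacing the synthetic arrays with outputs of the waste-population and measurement-system models reproduces the tractable simulation-based variant described in the abstract.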
Parameter uncertainty in simulations of extreme precipitation and attribution studies.
Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.
2017-12-01
The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. However, the climate models used, such as the Community Atmosphere Model (CAM), employ approximate physics that gives rise to "parameter uncertainty": uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations for convective processes are well known to be influential in the simulation of precipitation extremes. Towards examining the impact of this source of uncertainty on attribution studies, we investigate the importance of components, through their associated tuning parameters, of parameterisations relating to deep and shallow convection, and cloud and aerosol microphysics in CAM. We hypothesise that as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions. For example, the relative influence of deep convection should diminish as resolution approaches that at which convection can be resolved numerically (~10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. In order to mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution, both in the context of attribution studies and in progression towards models that fully resolve convection.
Optimizing Grippers for Compensating Pose Uncertainties by Dynamic Simulation
DEFF Research Database (Denmark)
Wolniakowski, Adam; Kramberger, Aljaž; Gams, Andrej
2017-01-01
Gripper design process is one of the interesting challenges in the context of grasping within industry. Typically, simple parallel-finger grippers, which are easy to install and maintain, are used in platforms for robotic grasping. The context switches in these platforms require frequent exchange......, we have presented a method to automatically compute the optimal finger shapes for defined task contexts in simulation. In this paper, we show the performance of our method in an industrial grasping scenario. We first analyze the uncertainties of the used vision system, which are the major source...
Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area
Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.
2016-02-01
We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for
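One of the air-temperature-based diagnostics can be illustrated with a frost number F = √DDF / (√DDF + √DDT), with permafrost diagnosed where F exceeds 0.5. The formula and threshold follow the common Nelson-Outcalt-style frost index; the degree-day inputs below are synthetic:

```python
import numpy as np

def frost_number(ddf, ddt):
    """Frost number F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)), where
    DDF/DDT are the annual freezing/thawing degree-day sums (deg C * day).
    F > 0.5 is the usual threshold for diagnosing permafrost presence."""
    ddf, ddt = np.asarray(ddf, float), np.asarray(ddt, float)
    return np.sqrt(ddf) / (np.sqrt(ddf) + np.sqrt(ddt))

# three synthetic grid cells, from cold to warm
ddf = np.array([4000.0, 1500.0, 500.0])
ddt = np.array([500.0, 1500.0, 4000.0])
F = frost_number(ddf, ddt)
permafrost = F > 0.5
```

Because this diagnostic needs only air temperature, it sidesteps the soil-temperature biases that make the ground-temperature-based methods diverge in the intercomparison.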
On sociological catastrophe analysis
International Nuclear Information System (INIS)
Clausen, L.
1974-01-01
The present paper deals with the existing standard terms of sociological catastrophe theory, collective behaviour during the catastrophe, and consequences for empirical catastrophe sociology. (RW)
DEFF Research Database (Denmark)
Toledo, Luis; Neelsen, Kai John; Lukas, Jiri
2017-01-01
Proliferating cells rely on the so-called DNA replication checkpoint to ensure orderly completion of genome duplication, and its malfunction may lead to catastrophic genome disruption, including unscheduled firing of replication origins, stalling and collapse of replication forks, massive DNA...... breakage, and, ultimately, cell death. Despite many years of intensive research into the molecular underpinnings of the eukaryotic replication checkpoint, the mechanisms underlying the dismal consequences of its failure remain enigmatic. A recent development offers a unifying model in which the replication...... checkpoint guards against global exhaustion of rate-limiting replication regulators. Here we discuss how such a mechanism can prevent catastrophic genome disruption and suggest how to harness this knowledge to advance therapeutic strategies to eliminate cancer cells that inherently proliferate under...
Propagation of radar rainfall uncertainty in urban flood simulations
Liguori, Sara; Rico-Ramirez, Miguel
2013-04-01
hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km2. The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is out of the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41 761 Doi: 10.1088/0034-4885/41/5/003 [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. 
[3] Villarini G, Krajewski WF
Institute of Scientific and Technical Information of China (English)
徐岩; 胡斌
2012-01-01
The evolution of partners' strategies in multi-firm strategic alliances is considered from an evolutionary game theory perspective. A deterministic dynamical equation is developed; on this basis, Gaussian white noise is introduced to represent disturbances, and a stochastic dynamical equation is created. The catastrophe of strategic alliances, in which behaviour shifts from cooperation to betrayal, is analyzed by means of stochastic catastrophe theory. The catastrophe set of the control variables is found, which explains and forecasts the catastrophe of strategic alliances, and hence the unplanned dissolution or sudden cooperation failure of alliances under uncertainty. To validate the correctness of the model, numerical simulations are given for different scenarios; the illustrations show that the behavior of the strategic alliances undergoes catastrophe near the catastrophe set.
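A stochastic dynamical equation of the kind described can be illustrated with an Euler-Maruyama integration of a one-dimensional replicator equation perturbed by Gaussian white noise; the payoff function below is hypothetical, not the one from the paper:

```python
import numpy as np

def simulate_alliance(x0, payoff_gain, sigma, dt=0.01, steps=5000, rng=None):
    """Euler-Maruyama integration of
        dx = x(1-x) * payoff_gain(x) dt + sigma * x(1-x) dW,
    where x is the share of cooperating firms and payoff_gain(x) is the
    cooperation-minus-defection payoff difference."""
    rng = np.random.default_rng(rng)
    x = np.empty(steps + 1)
    x[0] = x0
    for t in range(steps):
        drift = x[t] * (1 - x[t]) * payoff_gain(x[t])
        noise = sigma * x[t] * (1 - x[t]) * rng.normal(0.0, np.sqrt(dt))
        x[t + 1] = np.clip(x[t] + drift * dt + noise, 0.0, 1.0)
    return x

# hypothetical payoffs: cooperation pays once enough partners cooperate,
# so the drift changes sign at x = 0.25
gain = lambda x: 2.0 * x - 0.5
path = simulate_alliance(x0=0.6, payoff_gain=gain, sigma=0.2, rng=0)
```

Sweeping the control parameters (here the payoff coefficients and the noise intensity sigma) and recording where the long-run state of such paths jumps between basins is how a catastrophe set is located numerically.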
An Uncertainty Structure Matrix for Models and Simulations
Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.
2008-01-01
Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.
Simulation codes and the impact of validation/uncertainty requirements
International Nuclear Information System (INIS)
Sills, H.E.
1995-01-01
Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)
Catastrophe medicine; Medecine de catastrophe
Energy Technology Data Exchange (ETDEWEB)
Lebreton, A. [Service Technique de l`Energie Electrique et des Grands Barrages (STEEGB), (France)
1996-12-31
The 'Catastrophe Medicine' congress, which took place in Amiens (France) from December 5 to 7, 1996, was devoted to the assessment and management of risks and hazards in natural and artificial systems. The methods of risk evaluation and prediction were discussed in the context of dam accidents, with analysis of experience feedback and lessons gained from the organisation of emergency plans. Three round-table conferences were devoted to the importance of psychological aspects during such major crises. (J.S.)
Wheeler, J. Craig
2014-08-01
Preface; 1. Setting the stage: star formation and hydrogen burning in single stars; 2. Stellar death: the inexorable grip of gravity; 3. Dancing with stars: binary stellar evolution; 4. Accretion disks: flat stars; 5. White Dwarfs: quantum dots; 6. Supernovae: stellar catastrophes; 7. Supernova 1987A: lessons and enigmas; 8. Neutron stars: atoms with attitude; 9. Black holes in theory: into the abyss; 10. Black holes in fact: exploring the reality; 11. Gamma-ray bursts, black holes and the universe: long, long ago and far, far away; 12. Supernovae and the universe; 13. Worm holes and time machines: tunnels in space and time; 14. Beyond: the frontiers; Index.
Derivative-free optimization under uncertainty applied to costly simulators
International Nuclear Information System (INIS)
Pauwels, Benoit
2016-01-01
The modeling of complex phenomena encountered in industrial problems can lead to the study of numerical simulation codes. These simulators may require extensive execution time (from hours to days), involve uncertain parameters and even be intrinsically stochastic. Importantly, within the context of simulation-based optimization, the derivatives of the outputs with respect to the inputs may be nonexistent, inaccessible or too costly to approximate reliably. This thesis is organized in four chapters. The first chapter discusses the state of the art in derivative-free optimization and uncertainty modeling. The next three chapters introduce three independent - although connected - contributions to the field of derivative-free optimization in the presence of uncertainty. The second chapter addresses the emulation of costly stochastic simulation codes - stochastic in the sense that simulations run with the same input parameters may lead to distinct outputs. Such was the subject of the CODESTOCH project carried out at the Summer mathematical research center on scientific computing and its applications (CEMRACS) during the summer of 2013, together with two Ph.D. students from Electricity of France (EDF) and the Atomic Energy and Alternative Energies Commission (CEA). We designed four methods to build emulators for functions whose values are probability density functions. These methods were tested on two toy functions and applied to industrial simulation codes concerned with three complex phenomena: the spatial distribution of molecules in a hydrocarbon system (IFPEN), the life cycle of large electric transformers (EDF) and the repercussions of a hypothetical accident in a nuclear plant (CEA). Emulation was a preliminary step towards optimization in the first two cases. In the third chapter we consider the influence of inaccurate objective function evaluations on direct search, a classical derivative-free optimization method. In real settings inaccuracy may never vanish
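Direct search of the kind considered in the third chapter can be sketched as compass (coordinate) search, which uses only function values and no derivatives:

```python
import numpy as np

def compass_search(f, x0, step=1.0, shrink=0.5, tol=1e-8, max_evals=10000):
    """Compass/coordinate direct search: poll the 2n axis directions,
    move to the first improving point, otherwise shrink the step size."""
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(x.size):
            for s in (+step, -step):
                y = x.copy()
                y[i] += s
                fy = f(y)
                evals += 1
                if fy < fx:          # accept the first improvement found
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= shrink           # no improving direction: refine mesh
    return x, fx

xmin, fmin = compass_search(lambda z: (z[0] - 1)**2 + (z[1] + 2)**2,
                            [0.0, 0.0])
```

Because acceptance depends only on the comparison fy < fx, it is exactly this comparison that noisy or inaccurate evaluations corrupt, which is the failure mode the third chapter analyzes.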
Farrance, Ian; Frenkel, Robert
2014-01-01
The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional
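The spreadsheet MCS procedure translates directly into code; here is a hedged sketch propagating normally distributed input uncertainties through an illustrative ratio-type functional relationship (all numerical values are invented, not laboratory data):

```python
import numpy as np

rng = np.random.default_rng(123)
N = 200_000

# Input quantities: mean and standard uncertainty, as would be taken
# from IQC data; the normal distribution is the usual IQC assumption.
a = rng.normal(100.0, 2.0, N)    # e.g. an analyte concentration
b = rng.normal(10.0, 0.5, N)     # e.g. a second measured quantity

y = a / b                        # the functional relationship y = f(a, b)

y_mean = y.mean()
y_sd = y.std(ddof=1)             # MCS estimate of the combined uncertainty
lo, hi = np.percentile(y, [2.5, 97.5])   # 95% coverage interval
```

The empirical standard deviation and percentile interval replace the partial-derivative algebra of GUM modelling, which is precisely the simplification the article advocates for spreadsheet users.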
Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease
Marsden, Alison
2009-11-01
Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
International Nuclear Information System (INIS)
Morales Prieto, M.; Ortega Saiz, P.
2011-01-01
Analysis of the analytical uncertainties of the process-simulation methodology used to obtain the final isotopic inventory of spent fuel; the ARIANE experiment explores the burnup-simulation part.
Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...
International Nuclear Information System (INIS)
Silva, T.A. da
1988-01-01
A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt
Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions
International Nuclear Information System (INIS)
Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei
2017-01-01
Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.
Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions
Energy Technology Data Exchange (ETDEWEB)
Wan, Chenghui [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Cao, Liangzhi, E-mail: caolz@mail.xjtu.edu.cn [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Wu, Hongchun [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Shen, Wei [School of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an 710049 (China); Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)
2017-04-15
Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.
Uncertainty in simulating wheat yields under climate change : Letter
Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Supit, I.
2013-01-01
Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic
DEFF Research Database (Denmark)
Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan
2009-01-01
The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predict...
Directory of Open Access Journals (Sweden)
Przemysław Czapliński
2015-01-01
The principal notion of the article, a "backward catastrophe", stands for a catastrophe which occurs unseen until it becomes recognized and which broadens its destructive activity until it has been recognized. In the article this concept is applied to the Shoah. The main thesis is that the recognition of the actual influence of the Holocaust began in Polish culture in the mid-1980s (largely it started with the film by Claude Lanzmann, Shoah, and the essay by Jan Błoński, Biedni Polacy patrzą na getto ["The Poor Poles Look at the Ghetto"]), that is, when the question "What happened to the Jews?" assumed the form "Did the things that happened to the Jews also happen to the Poles?". This cognitive and ethical reorientation leads to the revealing of the hidden consequences of the Holocaust, reaching as far as the present day and undermining the foundations of collective identity. In order to understand this situation (and adopt potentially preventive actions), Polish society should be recognized as a postcatastrophic one.
Modeling, design, and simulation of systems with uncertainties
Rauh, Andreas
2011-01-01
This three-fold contribution to the field covers theory and current research in algorithmic approaches to uncertainty handling, real-life applications such as robotics and biomedical engineering, and fresh approaches to reliably implementing software.
International Nuclear Information System (INIS)
Velichenko, V.V.
1994-01-01
The problem of catastrophe control is discussed. Catastrophe control aims to withdraw critical engineering structures from the path of catastrophe. A mathematical framework for catastrophe control systems is constructed. It determines the principles for filling the systems with concrete physical content and, at the same time, permits the use of modern control methods for the synthesis of an optimal withdrawal strategy for protected objects.
Evidence-based quantification of uncertainties induced via simulation-based modeling
International Nuclear Information System (INIS)
Riley, Matthew E.
2015-01-01
The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
Uncertainty in prediction and simulation of flow in sewer systems
DEFF Research Database (Denmark)
Breinholt, Anders
[…] the uncertainty in the state variables. Additionally, the observation noise is accounted for by a separate observation noise term. This approach is also referred to as stochastic grey-box modelling. A state-dependent diffusion term was developed using a Lamperti transformation of the states, and implemented […] performance beyond the one-step. The reliability was satisfied for the one-step prediction but was increasingly biased as the prediction horizon was expanded, particularly in rainy periods. GLUE was applied for estimating uncertainty in such a way that the selection of behavioural parameter sets continued […] Conversely, the parameter estimates of the stochastic approach are physically meaningful. This thesis has contributed to developing simplified rainfall-runoff models that are suitable for model predictive control of urban drainage systems that takes uncertainty into account.
Fukushinobyl, the impossible catastrophe
International Nuclear Information System (INIS)
Boceno, Laurent
2012-01-01
With the emergence of a variety of health and environmental crises and catastrophes (Seveso, Bhopal, Chernobyl, AIDS, contaminated blood, mad cow disease, influenzas), the author proposes thoughts about the fact that we no longer seem to be in the era of industrial societies, but in that of risk societies. He focuses more particularly on Chernobyl and Fukushima to analyse how a social framework is built up to integrate forms of institutionalisation of multifaceted vulnerability, these institutional logics becoming latent social pathologies. In this respect, he more particularly discusses the catastrophic dimension of nuclear power. He shows how what can be considered a risk is socialised and, as a matter of priority, concealed, and then addresses the management of the consequences of Chernobyl and how it is used to address the present Japanese situation. He notably outlines a kind of collusion between the WHO and the IAEA on nuclear issues. In this respect, he recalls a statement by the WHO saying that, from a mental health point of view, the most satisfying solution for the future of peaceful uses of nuclear energy would be the emergence of a new generation who would have learned to cope with ignorance and uncertainty
Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation
Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.
2008-01-01
By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was
Wit, de A.J.W.; Bruin, de S.
2006-01-01
Previous analyses of the effects of uncertainty in precipitation fields on the output of EU Crop Growth Monitoring System (CGMS) demonstrated that the influence on simulated crop yield was limited at national scale, but considerable at local and regional scales. We aim to propagate uncertainty due
The validation of evacuation simulation models through the analysis of behavioural uncertainty
International Nuclear Information System (INIS)
Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino
2014-01-01
Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study
CSIR Research Space (South Africa)
Bidgood, Peter M
2017-01-01
The estimation of balance uncertainty using conventional statistical and error propagation methods has been found to be both approximate and laborious to the point of being untenable. Direct Simulation by Monte Carlo (DSMC) has been shown...
Error and Uncertainty Analysis for Ecological Modeling and Simulation
2001-12-01
Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations
Energy Technology Data Exchange (ETDEWEB)
Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban; DeHart, Mark; Goluoglu, Sedat
2017-03-01
Advanced modeling techniques and current computational capacity make full-core TREAT simulations possible, the goal of such simulations being to understand the pre-test core and minimize the number of required calibrations. However, in order to simulate TREAT with a high degree of precision, the reactor materials and geometry must also be modeled with a high degree of precision. This paper examines how uncertainty in the reported values of boron and graphite affects simulations of TREAT.
ASSESSING ASTROPHYSICAL UNCERTAINTIES IN DIRECT DETECTION WITH GALAXY SIMULATIONS
International Nuclear Information System (INIS)
Sloane, Jonathan D.; Buckley, Matthew R.; Brooks, Alyson M.; Governato, Fabio
2016-01-01
We study the local dark matter velocity distribution in simulated Milky Way-mass galaxies, generated at high resolution with both dark matter and baryons. We find that the dark matter in the solar neighborhood is influenced appreciably by the inclusion of baryons, increasing the speed of dark matter particles compared to dark matter-only simulations. The gravitational potential due to the presence of a baryonic disk increases the amount of high velocity dark matter, resulting in velocity distributions that are more similar to the Maxwellian Standard Halo Model than predicted from dark matter-only simulations. Furthermore, the velocity structures present in baryonic simulations possess a greater diversity than expected from dark matter-only simulations. We show the impact on the direct detection experiments LUX, DAMA/Libra, and CoGeNT of using our simulated velocity distributions, and explore how resolution and halo mass within the Milky Way’s estimated mass range impact the results. A Maxwellian fit to the velocity distribution tends to overpredict the amount of dark matter in the high velocity tail, even with baryons, and thus leads to overly optimistic direct detection bounds on models that are dependent on this region of phase space for an experimental signal. Our work further demonstrates that it is critical to transform simulated velocity distributions to the lab frame of reference, due to the fact that velocity structure in the solar neighborhood appears when baryons are included. There is more velocity structure present when baryons are included than in dark matter-only simulations. Even when baryons are included, the importance of the velocity structure is not as apparent in the Galactic frame of reference as in the Earth frame.
ASSESSING ASTROPHYSICAL UNCERTAINTIES IN DIRECT DETECTION WITH GALAXY SIMULATIONS
Energy Technology Data Exchange (ETDEWEB)
Sloane, Jonathan D.; Buckley, Matthew R.; Brooks, Alyson M. [Department of Physics and Astronomy, Rutgers University, Piscataway, NJ 08854 (United States); Governato, Fabio [Astronomy Department, University of Washington, Box 351580, Seattle, WA 98195-1580 (United States)
2016-11-01
We study the local dark matter velocity distribution in simulated Milky Way-mass galaxies, generated at high resolution with both dark matter and baryons. We find that the dark matter in the solar neighborhood is influenced appreciably by the inclusion of baryons, increasing the speed of dark matter particles compared to dark matter-only simulations. The gravitational potential due to the presence of a baryonic disk increases the amount of high velocity dark matter, resulting in velocity distributions that are more similar to the Maxwellian Standard Halo Model than predicted from dark matter-only simulations. Furthermore, the velocity structures present in baryonic simulations possess a greater diversity than expected from dark matter-only simulations. We show the impact on the direct detection experiments LUX, DAMA/Libra, and CoGeNT of using our simulated velocity distributions, and explore how resolution and halo mass within the Milky Way’s estimated mass range impact the results. A Maxwellian fit to the velocity distribution tends to overpredict the amount of dark matter in the high velocity tail, even with baryons, and thus leads to overly optimistic direct detection bounds on models that are dependent on this region of phase space for an experimental signal. Our work further demonstrates that it is critical to transform simulated velocity distributions to the lab frame of reference, due to the fact that velocity structure in the solar neighborhood appears when baryons are included. There is more velocity structure present when baryons are included than in dark matter-only simulations. Even when baryons are included, the importance of the velocity structure is not as apparent in the Galactic frame of reference as in the Earth frame.
Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V
2009-01-01
The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standard Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty increasing both their economical cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (micro(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
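The Monte Carlo plus SRC workflow described in the abstract can be sketched compactly. This is a toy illustration, not the BSM1 model: the three "parameters", their ranges, and the output function below are invented stand-ins for the ASM1 bio-kinetic inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(X):
    # Toy stand-in for a plant-performance output: a nearly linear
    # combination of three uncertain parameters (names hypothetical,
    # echoing mu_A, eta_g and eta_h from ASM1).
    mu_A, eta_g, eta_h = X.T
    return 5.0 * mu_A + 1.5 * eta_g + 0.5 * eta_h + 0.2 * mu_A * eta_g

n = 5000
# Monte Carlo sampling of the uncertain inputs over assumed ranges
X = rng.uniform([0.5, 0.6, 0.3], [1.0, 1.0, 0.7], size=(n, 3))
y = model(X)

# Standardize inputs and output; the slopes of a linear regression on
# the standardized data are the Standard Regression Coefficients (SRC)
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

ranking = np.argsort(-np.abs(src))  # most influential parameter first
```

The sum of squared SRCs approximates the regression R²; when it is close to 1, as here, the linear sensitivity measure is trustworthy, which is the usual validity check for the SRC method.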
Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes
International Nuclear Information System (INIS)
Garcia J, T.; Cardenas V, J.
2015-09-01
A methodology was implemented for the analysis of uncertainty in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation type, carrying the uncertainty of the input parameters through to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of ordered statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that include 95% of the population at a confidence level of 95% is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty-analysis process for transients with the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first with the simulation of a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; while in the second analysis, the simulation of a station blackout (SBO, total loss of power) accident was carried out with the MELCOR code, obtaining the uncertainty band for the
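The Wilks-formula step mentioned above is easy to reproduce. The sketch below computes the minimum number of code executions for a 95% coverage / 95% confidence tolerance limit; the figure of 93 runs quoted in the abstract corresponds to the two-sided first-order case.

```python
def wilks_one_sided(coverage=0.95, confidence=0.95):
    """Smallest n such that the sample maximum bounds at least
    `coverage` of the output population with probability `confidence`
    (one-sided first-order Wilks formula)."""
    n = 1
    while 1 - coverage**n < confidence:
        n += 1
    return n

def wilks_two_sided(coverage=0.95, confidence=0.95):
    """Two-sided first-order Wilks formula: the sample minimum and
    maximum together form the tolerance interval."""
    n = 2
    while 1 - coverage**n - n * (1 - coverage) * coverage**(n - 1) < confidence:
        n += 1
    return n

print(wilks_one_sided())  # 59
print(wilks_two_sided())  # 93, matching the number of executions in the study
```

As the abstract notes, neither count depends on how many uncertain input parameters are sampled, which is the main attraction of the Wilks approach for expensive codes.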
Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations
Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide
2017-01-01
Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations by parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the ratio of signal to noise (SNR) for average river flow during 1971-2000, simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 globally, with varied magnitudes for different basins) than those in the NOSOC, which is particularly significant in most areas of Asia and in areas north of the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in each basin. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts raise the urgent need for GHM development towards a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed as potential directions of improvement for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs for not only
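The SNR-based comparison can be illustrated with a minimal sketch. The flow values below are invented, and the SNR definition (ensemble mean over between-model standard deviation) is an assumption for illustration; the paper's exact formulation may differ.

```python
import statistics

def snr(values):
    """Signal-to-noise ratio across a model ensemble: ensemble mean
    divided by between-model standard deviation (illustrative definition)."""
    return statistics.fmean(values) / statistics.stdev(values)

# Hypothetical annual river flows (km3/yr) from four GHMs for one basin,
# without (NOSOC) and with (VARSOC) human-impact parameterizations
nosoc = [100.0, 104.0, 98.0, 102.0]
varsoc = [90.0, 101.0, 80.0, 95.0]

# A negative difference indicates that adding human-impact
# parameterizations increased between-model uncertainty
snr_diff = snr(varsoc) - snr(nosoc)
```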
Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2
Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.;
2016-01-01
Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency in using water to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to reduced growing seasons. The uncertainties in simulated crop WU, in particular those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperatures and with high temperatures in combination with elevated atmospheric [CO2] concentrations. Hence the simulation of crop WU, and in particular crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.
Event-by-event simulation of single-neutron experiments to test uncertainty relations
International Nuclear Information System (INIS)
Raedt, H De; Michielsen, K
2014-01-01
Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)
Evaluation of uncertainties in regional climate change simulations
DEFF Research Database (Denmark)
Pan, Z.; Christensen, J. H.; Arritt, R. W.
2001-01-01
[…] an atmosphere-ocean coupled general circulation model (GCM) current climate, and a future scenario of transient climate change. Common precipitation climatology features simulated by both models included realistic orographic precipitation, east-west transcontinental gradients, and reasonable annual cycles over […] to different subgrid scale processes in individual models. The ratio of climate change to biases, which we use as one measure of confidence in projected climate changes, is substantially larger than 1 in several seasons and regions, while the ratios are always less than 1 in summer. The largest ratios among all […] regions are in California. Spatial correlation coefficients of precipitation were computed between simulation pairs in the 2x3 set. The climate change correlation is highest and the RCM performance correlation is lowest, while boundary forcing and intermodel correlations are intermediate. The high spatial
Neural network stochastic simulation applied for quantifying uncertainties
Directory of Open Access Journals (Sweden)
N Foudil-Bey
2016-09-01
Generally, geostatistical simulation methods are used to generate several realizations of physical properties in the subsurface; these methods are based on variogram analysis and limited to measuring correlation between variables at two locations only. In this paper, we propose a simulation of properties based on supervised neural network training on the existing drilling data set. The major advantage is that this method does not require a preliminary geostatistical study and takes several points into account. As a result, geological information and diverse geophysical data can be combined easily. To do this, we used a neural network with a multi-layer perceptron feed-forward architecture, and the back-propagation algorithm with a conjugate gradient technique to minimize the error of the network output. The learning process can create links between different variables; this relationship can be used for interpolation of the properties on the one hand, or to generate several possible distributions of the physical properties on the other hand, changing each time the random values of the input neurons, which are kept constant during the learning period. This method was tested on real data to simulate multiple realizations of the density and the magnetic susceptibility in three dimensions at the mining camp of Val d'Or, Québec (Canada).
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Directory of Open Access Journals (Sweden)
Artem Yankov
2012-01-01
Full Text Available For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
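The stochastic-sampling idea behind the XSUSA-style approach can be illustrated on a one-group infinite-medium toy model: sample perturbed cross sections from an assumed covariance and propagate each sample through the multiplication-factor formula. All values and the covariance below are invented for illustration and are not from any evaluated nuclear data library:

```python
import numpy as np

rng = np.random.default_rng(1)

# Nominal one-group data (illustrative values only)
nu = 2.43                 # neutrons per fission
sigma_f = 0.05            # fission cross section (1/cm)
sigma_a = 0.12            # absorption cross section (1/cm)

# Assumed relative covariance between sigma_f and sigma_a (made up:
# 2% and 3% relative std, correlation 0.5)
rel_cov = np.array([[0.02**2, 0.5 * 0.02 * 0.03],
                    [0.5 * 0.02 * 0.03, 0.03**2]])

# Stochastic sampling of perturbed cross-section sets
n = 10_000
pert = rng.multivariate_normal([0.0, 0.0], rel_cov, size=n)
sf = sigma_f * (1 + pert[:, 0])
sa = sigma_a * (1 + pert[:, 1])

# Propagate each sample through the "core simulator" (here: k-infinity)
k_inf = nu * sf / sa
print(k_inf.mean(), k_inf.std())
```

In the benchmark itself the forward model is a full core simulator rather than a closed-form expression, but the sampling and statistics steps are the same.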
Monte Carlo Simulation of Influence of Input Parameters Uncertainty on Output Data
International Nuclear Information System (INIS)
Sobek, Lukas
2010-01-01
Input parameters of a complex system in a probabilistic simulation are treated by means of probability density functions (PDFs). The results of the simulation therefore also have a probabilistic character. Monte Carlo simulation is widely used to obtain predictions concerning the probability of the risk. Here, the Monte Carlo method was used to calculate histograms of the PDF of the release rate given the uncertainty in the distribution coefficients of the radionuclides ¹³⁵Cs and ²³⁵U.
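A minimal sketch of the described procedure, assuming a lognormal distribution coefficient Kd and a simple retardation-factor release model; every parameter value here is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical uncertainty in the sorption distribution coefficient Kd
# (m3/kg): lognormal spread around an invented nominal value.
kd = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=50_000)

rho_b, theta = 1500.0, 0.3         # bulk density (kg/m3), porosity
darcy_flux = 1e-3                  # advective release scale (arbitrary units)

# Retardation factor and resulting (relative) release rate per sample
R = 1 + (rho_b / theta) * kd
release = darcy_flux / R

# Histogram approximating the PDF of the release rate
hist, edges = np.histogram(release, bins=50, density=True)
print(release.mean(), release.std())
```

The same loop run with a second Kd distribution would give the second radionuclide's histogram.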
Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations
Niemeier, Wolfgang; Tengen, Dieter
2017-06-01
In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology but rarely adopted within geodesy. The second step consists of Monte Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud that represents the uncertainty of the estimated coordinates; a confidence region can be assigned to this point cloud as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix, and it allows a new way of assessing uncertainty in accordance with the GUM concept for uncertainty modeling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
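The two-step idea, a GUM-style uncertainty budget for the observations followed by Monte Carlo re-adjustment of the whole processing chain, can be sketched on a toy leveling network. The network geometry, standard deviation and "true" heights below are invented; the point is the comparison between the Monte Carlo point cloud and classical covariance propagation:

```python
import numpy as np

rng = np.random.default_rng(3)

# Tiny leveling network: unknown heights h1, h2 (reference point fixed at 0).
# Rows of A map the unknowns to the observed height differences.
A = np.array([[ 1.0, 0.0],    # h1 - h0
              [-1.0, 1.0],    # h2 - h1
              [ 0.0, 1.0]])   # h2 - h0
h_true = np.array([1.20, 2.05])          # "true" heights (invented)
sigma = 0.002                            # observation std from a GUM-style budget
l_true = A @ h_true

# Classical result: cov(h) = sigma^2 (A^T A)^{-1}
cov = sigma ** 2 * np.linalg.inv(A.T @ A)

# Monte Carlo: generate many realizations of the raw data and re-adjust
# each one by least squares -> point cloud of coordinate solutions
n = 20_000
L = l_true + rng.normal(0, sigma, size=(n, 3))
sims = L @ np.linalg.pinv(A).T
print(np.sqrt(np.diag(cov)), sims.std(axis=0))   # the two should agree
```

With only Gaussian observation noise the two answers coincide; the benefit of the MC route appears once non-Gaussian influencing factors and pre-processing steps enter the chain.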
Epistemic uncertainty in California-wide synthetic seismicity simulations
Pollitz, Fred F.
2011-01-01
The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ∼12 square kilometers in size, has been rediscretized into smaller patches, each of ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs on input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.
Range uncertainties in proton therapy and the role of Monte Carlo simulations
International Nuclear Information System (INIS)
Paganetti, Harald
2012-01-01
The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)
Bukharov, Dmitriy; Aleksey, Kucherik; Tatyana, Trifonova
2014-05-01
Recently, the contribution of groundwater to catastrophic floods has come under discussion [1,2]. The principal problem in such an approach is to analyze the transport pathways of groundwater in dynamics and, especially, the reasons for its exit onto the land surface. Fracturing, a characteristic property of all rocks, should be associated with this process within the unified dynamic system that a river water basin represents, taking into account the fundamental phenomena of 3D crack-network development and modification (up to faults) as a groundwater transport system [3]. 2. In a system of fractal cracks connected with the main groundwater channel, the formation of an extreme flow is possible, i.e., a devastating case occurring by an instantaneous-flash mechanism. The development of such a process is related to two factors. First, the motion within the main channel of groundwater propagation is turbulent. In accordance with Kolmogorov's theory [4], we assume that this turbulence is isotropic, meaning that both the velocity and pressure fields in the water flow have pulsations related to nonlinear energy transfer between the vortices. This approach allows us to determine both the maximum possible size of the vortices, defined by the characteristic dimensions of the underground channel, and their minimum size, set by dissipation. Energy transfer in the eddies formed near a boundary is a complex nonlinear process, which we describe using a modernized Prandtl semi-empirical model [5]. Second, the mechanism of groundwater propagation in the system of cracks extending from the main underground channel is described within the framework of fractal geometry [6]. This approach allows us to determine the degree of similarity in the crack system, i.e., the ratio of the mean diameters and lengths of cracks/faults at each step of decomposition, which yields integrated quantitative characteristics of the 3D network as a whole, by fractal
Uncertainty in simulated groundwater-quality trends in transient flow
Starn, J. Jeffrey; Bagtzoglou, Amvrossios; Robbins, Gary A.
2013-01-01
In numerical modeling of groundwater flow, the result of a given solution method is affected by the way in which transient flow conditions and geologic heterogeneity are simulated. An algorithm is demonstrated that simulates breakthrough curves at a pumping well by convolution-based particle tracking in a transient flow field for several synthetic basin-scale aquifers. In comparison to grid-based (Eulerian) methods, the particle (Lagrangian) method is better able to capture multimodal breakthrough caused by changes in pumping at the well, although the particle method may be apparently nonlinear because of the discrete nature of particle arrival times. Trial-and-error choice of number of particles and release times can perhaps overcome the apparent nonlinearity. Heterogeneous aquifer properties tend to smooth the effects of transient pumping, making it difficult to separate their effects in parameter estimation. Porosity, a new parameter added for advective transport, can be accurately estimated using both grid-based and particle-based methods, but predictions can be highly uncertain, even in the simple, nonreactive case.
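The convolution-based breakthrough idea can be illustrated by convolving a unit travel-time density with a transient source history: changes in the loading (or pumping) history then appear as separate modes in the breakthrough curve. The inverse-Gaussian-like response and the two loading pulses below are assumptions for illustration, not the paper's aquifer models:

```python
import numpy as np

# Unit impulse response of solute arrival at the well (illustrative:
# an inverse-Gaussian-like travel-time density for advection-dispersion).
t = np.arange(1, 200.0)
mu, lam = 40.0, 200.0
g = np.sqrt(lam / (2 * np.pi * t**3)) * np.exp(-lam * (t - mu)**2 / (2 * mu**2 * t))
g /= g.sum()                       # normalize to unit mass

# Source loading history with a change partway through (two pulses)
src = np.zeros(400)
src[10:40] = 1.0                   # early loading period
src[150:180] = 0.8                 # later loading after a pumping change

# Breakthrough at the well = convolution of loading with travel-time density
btc = np.convolve(src, g)[:400]
print(btc.argmax())
```

In the particle (Lagrangian) version, g is assembled from discrete particle arrival times instead of a closed-form density, which is where the apparent nonlinearity noted in the abstract comes from.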
Energy Technology Data Exchange (ETDEWEB)
Stewart, G.; Lackner, M.; Haid, L.; Matha, D.; Jonkman, J.; Robertson, A.
2013-07-01
With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.
Rajabi, Mohammad Mahdi; Ketabchi, Hamed
2017-12-01
Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
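A minimal Gaussian-process emulator standing in for the expensive SWI simulator can be written directly in NumPy: fit the GP to a small design of real simulator runs, then do the Monte Carlo on the cheap emulator. The one-dimensional "simulator", kernel length scale and input distribution below are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

def expensive_simulator(x):
    # Stand-in for a costly seawater-intrusion model: scalar output
    # (e.g. interface toe position) as a function of one uncertain input.
    return np.sin(3 * x) + 0.5 * x

# 1. Run the real simulator a small number of times (the design)
X_train = np.linspace(0, 2, 12)
y_train = expensive_simulator(X_train)

# 2. Build a noiseless GP emulator with a squared-exponential kernel
def kernel(a, b, ell=0.4, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

K = kernel(X_train, X_train) + 1e-8 * np.eye(len(X_train))  # jitter
alpha = np.linalg.solve(K, y_train)

def emulate(x):
    return kernel(np.atleast_1d(x), X_train) @ alpha

# 3. Monte Carlo through the emulator instead of the simulator
samples = rng.uniform(0, 2, size=100_000)
mc_mean = emulate(samples).mean()
print(mc_mean)
```

The 100,000 emulator calls here cost roughly as much as a dozen simulator runs, which is the speed-up the paper exploits inside the ant-colony optimization loop.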
Directory of Open Access Journals (Sweden)
Daniel Cancelli Romero
2017-10-01
Full Text Available ABSTRACT Analytical results are widely used to assess batch-by-batch conformity and pharmaceutical equivalence, as well as in the development of drug products. Despite this, few papers describing the estimation of the measurement uncertainty associated with these results were found in the literature. Here, we describe a simple procedure used for estimating the measurement uncertainty associated with the dissolution test of acetaminophen tablets. A fractional factorial design was used to define a mathematical model that explains the amount of acetaminophen dissolved (%) as a function of dissolution time (from 20 to 40 minutes), volume of dissolution media (from 800 to 1000 mL), pH of dissolution media (from 2.0 to 6.8), and rotation speed (from 40 to 60 rpm). Using Monte Carlo simulations, we estimated the measurement uncertainty for the dissolution test of acetaminophen tablets (95.2 ± 1.0%), with a 95% confidence level. Rotation speed was the most important source of uncertainty, contributing about 96.2% of the overall uncertainty. Finally, it is important to note that the uncertainty calculated in this paper reflects the expected uncertainty of the dissolution test and does not consider variations in the content of acetaminophen.
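The Monte Carlo step can be sketched as follows. Only the factor ranges come from the abstract; the linear-model coefficients are invented placeholders, not the paper's fitted factorial model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical first-order model from a fractional factorial design:
# dissolved(%) = b0 + b1*time + b2*volume + b3*pH + b4*rpm
# (coefficients invented for illustration)
b0, b1, b2, b3, b4 = 70.0, 0.30, 0.010, 0.40, 0.10

# Sample each factor over its studied range
n = 100_000
time   = rng.uniform(20, 40, n)      # minutes
volume = rng.uniform(800, 1000, n)   # mL
pH     = rng.uniform(2.0, 6.8, n)
rpm    = rng.uniform(40, 60, n)

dissolved = b0 + b1 * time + b2 * volume + b3 * pH + b4 * rpm

mean = dissolved.mean()
lo, hi = np.percentile(dissolved, [2.5, 97.5])
print(f"{mean:.1f}% (95% interval {lo:.1f}-{hi:.1f}%)")
```

Replacing the uniform ranges with the measured factor uncertainties and the invented coefficients with the fitted model reproduces the paper's procedure; the share of variance explained by each factor then identifies the dominant source (rotation speed, in the paper).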
Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish
2018-06-01
Every model to characterise a real world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. By way of a recently developed attribution metric, this study is aimed at developing a method for analysing variability in model inputs together with model structure variability to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments is used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai
2017-10-01
Uncertainties in structural properties can result in different responses in hybrid simulations. Quantifying the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points and the method for calculating the coefficients. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases as the actuator delay increases.
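A non-intrusive PCE for a scalar response of one standard-normal input can be sketched with probabilists' Hermite polynomials: fit the coefficients by least squares on sample evaluations, then read the mean and variance off the coefficients. The single-parameter "structure" below is a stand-in, not the paper's delayed SDOF model:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(6)

def response(xi):
    # Stand-in for a displacement response: a SDOF-like structure whose
    # stiffness depends on a standard-normal parameter xi (hypothetical).
    k = 1.0e3 * (1 + 0.1 * xi)        # uncertain stiffness
    return 50.0 / k                   # displacement under a fixed load

# Non-intrusive PCE: least-squares fit of HermiteE coefficients on samples
deg = 4
xi = rng.standard_normal(2000)
Psi = np.stack([He.hermeval(xi, np.eye(deg + 1)[i]) for i in range(deg + 1)],
               axis=1)
coef, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

# Moments follow from orthogonality: E[He_n^2] = n! for probabilists' Hermite
pce_mean = coef[0]
pce_var = sum(coef[n] ** 2 * math.factorial(n) for n in range(1, deg + 1))

# Cross-check against plain Monte Carlo
mc = response(rng.standard_normal(200_000))
print(pce_mean, mc.mean(), pce_var ** 0.5, mc.std())
```

With several random inputs, the same coefficients also yield Sobol indices by grouping the variance contributions per input, which is how the paper separates single-variable from coupling effects.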
Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.
2015-12-01
The uncertainty in future climate predictions is as large or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic, variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, means that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.
Michel, Patrick; Richardson, D. C.
2007-10-01
We have made major improvements in simulations of asteroid disruption by explicitly computing aggregate formation during the gravitational reaccumulation of small fragments, allowing us to obtain information on their spins and shapes. First results will be presented taking as examples asteroid families that we reproduced successfully with previous, less sophisticated simulations. In recent years, we have successfully simulated the formation of asteroid families using an SPH hydrocode to compute the fragmentation following the impact of a projectile on the parent body, and the N-body code pkdgrav to compute the mutual interactions of the fragments. We found that fragments generated by the disruption of a km-size asteroid can have large enough masses to be attracted by each other during their ejection. Consequently, many reaccumulations take place. Eventually most large fragments correspond to gravitational aggregates formed by reaccumulation of smaller ones. Moreover, satellites form around the largest and other big remnants. In those previous simulations, when fragments reaccumulated, they were merged into a single sphere whose mass is the sum of their masses; thus, no information was obtained on the actual shape of the aggregates, their spin, ... For the first time, we have now simulated the disruption of a family parent body by explicitly computing the formation of aggregates, along with the above-mentioned properties. Once formed, these aggregates can interact and/or collide with each other and break up during their evolution. We will present these first simulations and their possible implications for the properties of asteroids generated by disruption. Results can, for instance, be compared with data provided by the Japanese space mission Hayabusa on the asteroid Itokawa, a body now understood to be a reaccumulated fragment of a larger parent body. Acknowledgments: PM and DCR acknowledge support from the French Programme National de Planétologie and grants
The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei
Directory of Open Access Journals (Sweden)
L. A. Lee
2013-09-01
Full Text Available Aerosol–cloud interaction effects are a major source of uncertainty in climate models so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN. Twenty-eight model parameters covering essentially all important aerosol processes, emissions and representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally between about ±30% over some marine regions to ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud-processing. The results lead to several recommendations for research that would result in improved modelling of cloud–active aerosol on a global scale.
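First-order variance-based (Sobol) indices of the kind reported here can be estimated with the Saltelli two-matrix sampling scheme. The additive two-input toy model below, whose analytic indices are 16/17 and 1/17, stands in for the per-grid-cell emulator; it is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    # Toy stand-in for an emulated CCN response: additive in two inputs,
    # so the first-order Sobol indices are known analytically.
    return 4.0 * x[:, 0] + x[:, 1]

n, d = 200_000, 2
A = rng.uniform(0, 1, (n, d))          # first sample matrix
B = rng.uniform(0, 1, (n, d))          # second, independent sample matrix
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S1 = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                # A with column i taken from B
    # Saltelli (2010) first-order estimator
    S1.append(np.mean(fB * (model(ABi) - fA)) / var)
print(S1)
```

In the study the expensive aerosol model is replaced by an emulator in exactly this role, making the Monte Carlo sampling affordable per grid cell.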
Arnaud, Patrick; Cantet, Philippe; Odry, Jean
2017-11-01
Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
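The bootstrap step for the single hydrological parameter can be sketched on invented calibration data with a one-parameter runoff model (not the SHYREG model itself): resample the calibration events with replacement, re-calibrate each time, and take percentile bounds:

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic calibration data: event rainfall P and peak flow Q with noise,
# linked by a single runoff parameter c (true value 0.6) -- all invented.
P = rng.uniform(10, 100, 60)
c_true = 0.6
Q = c_true * P + rng.normal(0, 3, 60)

def calibrate(P, Q):
    # Least-squares estimate of the single model parameter c in Q = c*P
    return np.sum(P * Q) / np.sum(P * P)

c_hat = calibrate(P, Q)

# Bootstrap: resample calibration events with replacement, re-calibrate
boot = np.array([
    calibrate(P[idx], Q[idx])
    for idx in (rng.integers(0, len(P), len(P)) for _ in range(5000))
])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"c = {c_hat:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

Propagating each bootstrap replicate of the parameter through the flood-quantile computation, together with sampled rainfall-generator parameters, gives the combined confidence intervals discussed in the abstract.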
DEFF Research Database (Denmark)
Kublitz, Anja
2015-01-01
as camps. Based on fieldwork among Palestinians in the Danish camps, this article explores why my interlocutors describe their current lives as a catastrophe. Al-Nakba literally means the catastrophe and, in Palestinian national discourse, it is used to designate the event of 1948, when the Palestinians...
Assessing the Uncertainty of Tropical Cyclone Simulations in NCAR's Community Atmosphere Model
Directory of Open Access Journals (Sweden)
Kevin A Reed
2011-08-01
Full Text Available The paper explores the impact of initial-data, parameter and structural model uncertainty on the simulation of a tropical cyclone-like vortex in the National Center for Atmospheric Research's (NCAR) Community Atmosphere Model (CAM). An analytic technique is used to initialize the model with an idealized weak vortex that develops into a tropical cyclone over ten simulation days. A total of 78 ensemble simulations are performed at horizontal grid spacings of 1.0°, 0.5° and 0.25° using two recently released versions of the model, CAM 4 and CAM 5. The ensemble members represent simulations with random small-amplitude perturbations of the initial conditions, small shifts in the longitudinal position of the initial vortex and runs with slightly altered model parameters. The main distinction between CAM 4 and CAM 5 lies within the physical parameterization suite, and the simulations with both CAM versions at the varying resolutions assess the structural model uncertainty. At all resolutions storms are produced with many tropical cyclone-like characteristics. The CAM 5 simulations exhibit more intense storms than CAM 4 by day 10 at the 0.5° and 0.25° grid spacings, while the CAM 4 storm at 1.0° is stronger. There are also distinct differences in the shapes and vertical profiles of the storms in the two variants of CAM. The ensemble members show no distinction between the initial-data and parameter uncertainty simulations. At day 10 they produce ensemble root-mean-square deviations from an unperturbed control simulation on the order of 1-5 m/s for the maximum low-level wind speed and 2-10 hPa for the minimum surface pressure. However, there are large differences between the two CAM versions at identical horizontal resolutions, suggesting that the structural uncertainty is more dominant than the initial-data and parameter uncertainties in this study. The uncertainty among the ensemble members is assessed and quantified.
Evaluating uncertainties in regional climate simulations over South America at the seasonal scale
Energy Technology Data Exchange (ETDEWEB)
Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)
2012-07-15
This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions reduces dramatically the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern compared with the internal variability, except for the mean sea level pressure field, though its magnitude is larger all over the model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu
2017-04-01
We use functional, Fréchet, derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions as opposed to its parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard–Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
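The "predict without re-running" idea can be illustrated by re-evaluating stored configurations under a perturbed potential. The synthetic pair distances below stand in for an actual MD trajectory, and the sketch deliberately omits the Fréchet-derivative machinery and any ensemble re-weighting:

```python
import numpy as np

rng = np.random.default_rng(9)

def lj(r, eps=1.0, sig=1.0):
    # Lennard-Jones pair potential
    return 4 * eps * ((sig / r) ** 12 - (sig / r) ** 6)

# Synthetic pair distances standing in for snapshots of an MD trajectory
# generated with the reference potential (200 "snapshots" x 500 pairs).
r = rng.uniform(0.95, 2.5, size=(200, 500))

U_ref = lj(r).sum(axis=1)                # per-snapshot potential energy

# Perturbed potential: slightly deeper well (eps 1.0 -> 1.05, hypothetical)
dV = lj(r, eps=1.05) - lj(r)

# Predict the average energy under the new potential by re-evaluating the
# stored configurations -- no new simulation is run.
U_new_pred = U_ref.mean() + dV.sum(axis=1).mean()
U_new_direct = lj(r, eps=1.05).sum(axis=1).mean()
print(U_new_pred, U_new_direct)
```

On these fixed samples the two numbers agree by construction; the first-order character of the real method lies in neglecting how the perturbation shifts the sampled ensemble, which, as the abstract notes, is valid only while the explored region of phase space is essentially unchanged.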
Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison
2017-11-01
Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.
Advanced Approach to Consider Aleatory and Epistemic Uncertainties for Integral Accident Simulations
International Nuclear Information System (INIS)
Peschke, Joerg; Kloos, Martina
2013-01-01
The use of best-estimate codes together with realistic input data generally requires that all potentially important epistemic uncertainties which may affect the code prediction be considered, in order to obtain an adequate quantification of the epistemic uncertainty of the prediction as an expression of the existing imprecise knowledge. To facilitate the required epistemic uncertainty analyses, methods and corresponding software tools are available, such as the GRS tool SUSA (Software for Uncertainty and Sensitivity Analysis). However, for risk-informed decision-making, the restriction to epistemic uncertainties alone is not enough. Transients and accident scenarios are also affected by aleatory uncertainties, which are due to the unpredictable nature of phenomena. It is essential that aleatory uncertainties are taken into account as well, not only in a simplified and supposedly conservative way, but as realistically as possible. The additional consideration of aleatory uncertainties, for instance, in the behavior of the technical system, the performance of plant operators, or the behavior of the physical process provides a quantification of probabilistically significant accident sequences. Only if a safety analysis is able to account for both epistemic and aleatory uncertainties in a realistic manner can it provide a well-founded risk-informed answer for decision-making. At GRS, an advanced probabilistic dynamics method was developed to address this problem and to provide a more realistic modeling and assessment of transients and accident scenarios. This method allows for an integral simulation of complex dynamic processes, particularly taking into account interactions between the plant dynamics as simulated by a best-estimate code, the dynamics of operator actions, and the influence of epistemic and aleatory uncertainties. In this paper, the GRS method MCDET (Monte Carlo Dynamic Event Tree) for probabilistic dynamics analysis is explained
Energy Technology Data Exchange (ETDEWEB)
Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-07-18
We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.
Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows
Energy Technology Data Exchange (ETDEWEB)
Templeton, Jeremy Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Blaylock, Myra L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hewson, John C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kumar, Pritvi Raj [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ling, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ruiz, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stewart, Alessia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wagner, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-09-01
The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost versus accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.
International Nuclear Information System (INIS)
Juang, Kai-Wei; Chen, Yue-Shin; Lee, Dar-Yuan
2004-01-01
Mapping the spatial distribution of soil pollutants is essential for delineating contaminated areas. Currently, geostatistical interpolation, kriging, is increasingly used to estimate pollutant concentrations in soils. A kriging-based approach, indicator kriging (IK), may be used to model the uncertainty of mapping. However, a smoothing effect is usually produced when kriging is used in pollutant mapping, and the detailed spatial patterns of pollutants can therefore be lost. The local uncertainty of mapping pollutants derived by the IK technique is referred to as the conditional cumulative distribution function (ccdf) for one specific location (i.e. single-location uncertainty). The local uncertainty information obtained by IK is not sufficient, as the uncertainty of mapping at several locations simultaneously (i.e. multi-location uncertainty or spatial uncertainty) is required to assess the reliability of the delineation of contaminated areas. The simulation approach, sequential indicator simulation (SIS), which can model not only single-location but also multi-location uncertainties, was used in this study to assess the uncertainty of the delineation of heavy metal contaminated soils. To illustrate this, a data set of Cu concentrations in soil from Taiwan was used. The results show that contour maps of Cu concentrations generated by the SIS realizations reproduced the detailed spatial patterns of Cu concentrations without the smoothing effect found when using the kriging method. Based on the SIS realizations, the local uncertainty of Cu concentrations at a specific location x' refers to the probability of the Cu concentration z(x') being higher than the defined threshold level of contamination (z_c). This can be written as Prob_SIS[z(x') > z_c], representing the probability of contamination. The probability map of Prob_SIS[z(x') > z_c] can then be used for delineating contaminated areas. In addition, the multi-location uncertainty of an area A
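A minimal sketch of the exceedance-probability mapping this abstract describes, with synthetic lognormal fields standing in for actual SIS realizations (the grid size, threshold z_c, and distribution are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for sequential indicator simulation output:
# 100 equiprobable realizations of Cu concentration on a 50x50 grid.
realizations = rng.lognormal(mean=3.0, sigma=0.5, size=(100, 50, 50))

z_c = 30.0  # hypothetical contamination threshold (mg/kg)

# Local (single-location) uncertainty: Prob_SIS[z(x') > z_c],
# estimated as the fraction of realizations exceeding the threshold.
prob_map = (realizations > z_c).mean(axis=0)

# Delineate the contaminated area at a chosen probability cutoff.
contaminated = prob_map > 0.5
print(prob_map.shape, contaminated.sum())
```

Because the probability is evaluated cell by cell over the same realizations, the joint (multi-location) uncertainty of any area can be obtained from the same array by counting realizations in which all cells of the area exceed z_c.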
Evaluation and uncertainties of global climate models as simulated in East Asia and China
International Nuclear Information System (INIS)
Zhao, Z.C.
1994-01-01
The assessments and uncertainties of general circulation models (GCMs) as simulated in East Asia and China (15-60 N, 70-140 E) have been investigated using seven GCMs. Four methods of assessment were chosen. The variables used to validate the GCMs include the annual, seasonal and monthly mean temperatures and precipitation. The assessments indicated that: (1) the simulations of the seven GCMs for temperature are much better than those for precipitation; (2) the simulations in winter are much better than those in summer; (3) the simulations in eastern parts are much better than those in western parts for both temperature and precipitation; (4) the best GCM for simulated temperature is the GISS model, and the best GCM for simulated precipitation is the UKMO-H model. The means of the seven GCMs for both simulated temperature and precipitation provided good results. The range of uncertainties in East Asia and China due to human activities is presented. The differences between the GCMs for temperature and precipitation before the year 2050 are much smaller than those after the year 2050
Wasklewicz, Thad; Zhu, Zhen; Gares, Paul
2017-12-01
Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist to date when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed or will continue to change in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy data to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed from anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor
Xue, Lianqing; Yang, Fan; Yang, Changbing; Wei, Guanghui; Li, Wenqian; He, Xinlin
2018-01-11
Understanding the mechanisms of complicated hydrological processes is important for the sustainable management of water resources in arid areas. This paper carried out simulations of water movement for the Manas River Basin (MRB) using an improved semi-distributed topographic hydrologic model (TOPMODEL) with a snowmelt model and a topographic index algorithm. A new algorithm is proposed to calculate the curve of the topographic index using an internal tangent circle on a conical surface. Building on the traditional model, an improved temperature indicator that accounts for solar radiation is used to calculate the amount of snowmelt. The uncertainty of the parameters of the TOPMODEL model was analyzed using the generalized likelihood uncertainty estimation (GLUE) method. The proposed model shows that the distribution of the topographic index is concentrated in the high mountains, and the accuracy of the runoff simulation is somewhat enhanced by considering radiation. Our results revealed that the performance of the improved TOPMODEL is acceptable and comparable for runoff simulation in the MRB. The uncertainty of the simulations results from the parameters and structure of the model, as well as from climatic and anthropogenic factors. This study is expected to serve as a valuable complement to the wider application of TOPMODEL and to help identify the mechanisms of hydrological processes in arid areas.
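The GLUE procedure used here for parameter uncertainty can be sketched as follows, with a toy linear rainfall-runoff model standing in for TOPMODEL; the uniform priors, the Nash-Sutcliffe likelihood measure, and the behavioural threshold of 0.5 are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as the informal likelihood."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy "model": runoff = a * rainfall + b, standing in for TOPMODEL.
rain = rng.gamma(2.0, 2.0, size=200)
obs = 0.6 * rain + 1.0 + rng.normal(0.0, 0.3, size=200)

# 1. Monte Carlo sampling of parameters from uniform priors.
a = rng.uniform(0.0, 1.5, 5000)
b = rng.uniform(0.0, 3.0, 5000)
sims = a[:, None] * rain + b[:, None]

# 2. Score every parameter set; keep "behavioural" sets above the threshold.
scores = np.array([nse(obs, s) for s in sims])
behavioural = scores > 0.5

# 3. 5-95% prediction bounds from the behavioural simulations
#    (equal weights for brevity; GLUE proper weights by likelihood).
lo, hi = np.percentile(sims[behavioural], [5, 95], axis=0)
print(behavioural.sum(), lo.mean(), hi.mean())
```

The spread between the bounds, rather than a single calibrated optimum, is the GLUE expression of parameter uncertainty.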
Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5
International Nuclear Information System (INIS)
Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia
2013-01-01
A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. In order to avoid errors due to the net area calculation, the simulated PHD is processed with the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low energy range of the efficiency curve. In the high energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)
Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5
Energy Technology Data Exchange (ETDEWEB)
Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose, E-mail: sergalbe@upv.es [Universitat Politecnica de Valencia, Valencia, (Spain). Instituto de Seguridad Industrial, Radiofisica y Medioambiental (ISIRYM); Ortiz, J. [Universitat Politecnica de Valencia, Valencia, (Spain). Servicio de Radiaciones. Lab. de Radiactividad Ambiental; Pereira, Claubia [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear
2013-07-01
A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. In order to avoid errors due to the net area calculation, the simulated PHD is processed with the GammaVision software. Furthermore, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis has been focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low energy range of the efficiency curve. In the high energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)
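The efficiency-curve step described in both records reduces to a simple ratio once net peak areas are known. The sketch below uses made-up peak areas, certified activities, and emission probabilities; only the idea (full-energy-peak efficiency with a Poisson counting uncertainty) is taken from the abstract:

```python
import numpy as np

# Hypothetical net peak areas (counts) from a measured or simulated spectrum.
energy_keV = np.array([59.5, 661.7, 1173.2, 1332.5, 1460.8])
net_area = np.array([12500.0, 8200.0, 4100.0, 3700.0, 900.0])

live_time_s = 3600.0                                  # counting time (assumed)
activity_Bq = np.array([5e3, 4e3, 3e3, 3e3, 1e3])     # certified activities (assumed)
gamma_yield = np.array([0.359, 0.851, 0.999, 1.000, 0.107])  # emission probabilities

# Full-energy-peak efficiency: counts / (decays * emission probability).
efficiency = net_area / (activity_Bq * gamma_yield * live_time_s)

# Relative 1-sigma counting uncertainty from Poisson statistics alone;
# the dead-layer and active-volume contributions discussed above add to this.
rel_unc = 1.0 / np.sqrt(net_area)
print(efficiency, rel_unc)
```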
Ahmadalipour, Ali; Moradkhani, Hamid
2017-12-01
Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and their implications for drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing dataset to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of the observed datasets at monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. This uncertainty results in large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a greater reliance on anthropogenic water storage and irrigation systems.
Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John
2013-11-01
We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES, 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2.05, Reθ = 6,500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.
DEFF Research Database (Denmark)
He, Xiulan
parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... be compensated by model parameters, e.g. when hydraulic heads are considered. However, geological structure is the primary source of uncertainty with respect to simulations of groundwater age and capture zone. Operational MPS based software has been on stage for just around ten years; yet, issues regarding...... geological structures of these three sites provided appropriate conditions for testing the methods. Our study documented that MPS is an efficient approach for simulating geological heterogeneity, especially for non-stationary system. The high resolution of geophysical data such as SkyTEM is valuable both...
Tompkins, A. M.; Thomson, M. C.
2017-12-01
Simulations of the impact of climate variations on a vector-borne disease such as malaria are subject to a number of sources of uncertainty. These include the model structure and parameter settings in addition to errors in the climate data and the neglect of their spatial heterogeneity, especially over complex terrain. We use a constrained genetic algorithm to confront these two sources of uncertainty for malaria transmission in the highlands of Kenya. The technique calibrates the parameter settings of a process-based, mathematical model of malaria transmission to vary within their assessed level of uncertainty and also allows the calibration of the driving climate data. The simulations show that in highland settings close to the threshold for sustained transmission, the uncertainty in climate is more important to address than the malaria model uncertainty. Applications of the coupled climate-malaria modelling system are briefly presented.
Catastrophic Antiphospholipid Syndrome
Directory of Open Access Journals (Sweden)
Rawhya R. El-Shereef
2016-01-01
Full Text Available This paper reports one case of a successfully treated patient suffering from a rare entity, the catastrophic antiphospholipid syndrome (CAPS). The management of this patient is discussed in detail.
Catastrophe Theory and Caustics
DEFF Research Database (Denmark)
Gravesen, Jens
1983-01-01
It is shown by elementary methods that in codimension two and under the assumption that light rays are straight lines, a caustic is the catastrophe set for a time function. The general case is also discussed....
2015-12-02
of completely new nonlinear Malliavin calculus. This type of calculus is important for the analysis and simulation of stationary and/or "causal...been limited by the fact that it requires the solution of an optimization problem with noisy gradients. When using deterministic optimization schemes...under uncertainty. We tested new developments on nonlinear Malliavin calculus, combining reduced basis methods with ANOVA, model validation, on
Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.
2017-12-01
POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud resolving model (CRM) output and subject model data and observations to the same retrievals, analysis and visualization. This framework not only enables validation of bulk microphysical model simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature, there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as to calculate a confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications. With
International Nuclear Information System (INIS)
Karanki, D.R.; Rahman, S.; Dang, V.N.; Zerkak, O.
2017-01-01
The coupling of plant simulation models and stochastic models representing failure events in Dynamic Event Trees (DET) is a framework used to model the dynamic interactions among physical processes, equipment failures, and operator responses. The integration of physical and stochastic models may additionally enhance the treatment of uncertainties. Probabilistic Safety Assessments as currently implemented propagate the (epistemic) uncertainties in failure probabilities, rates, and frequencies, while the uncertainties in the physical model (parameters) are not propagated. The coupling of deterministic (physical) and probabilistic models in integrated simulations such as DET allows both types of uncertainties to be considered. However, integrated accident simulations with epistemic uncertainties will challenge even today's high performance computing infrastructure, especially for simulations of inherently complex nuclear or chemical plants. Conversely, intentionally limiting computations for practical reasons would compromise the accuracy of results. This work investigates how to trade off accuracy against computation to quantify risk in light of both uncertainties and accident dynamics. A simple depleting tank problem that can be solved analytically is considered to examine the adequacy of a discrete DET approach. The results show that optimal allocation of computational resources between epistemic and aleatory calculations by means of convergence studies ensures accuracy within a limited budget. - Highlights: • Accident simulations considering uncertainties require intensive computations. • Trading off accuracy against computational cost in accident simulations is a challenge. • Optimal allocation between epistemic and aleatory computations achieves the tradeoff. • Online convergence gives an early indication of computational requirements. • Uncertainty propagation in DDET is examined on a tank problem solved analytically.
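A minimal sketch of a discrete dynamic event tree for a depleting-tank problem of the kind this abstract uses. The physics (linear drain), the aleatory event (a hypothetical valve sticking open at a Poisson rate and doubling the outflow), and all numbers are illustrative assumptions; refining the branching interval in a convergence study illustrates the optimal-allocation idea:

```python
import math

# Depleting tank: level falls at rate q until time T; a hypothetical valve
# failure (aleatory, Poisson rate lam per hour) doubles the outflow.
L0, q, lam, T = 10.0, 1.0, 0.2, 8.0

def det_mean_level(dt):
    """Expected final level from a discrete dynamic event tree with step dt."""
    mean, t, p_survive = 0.0, 0.0, 1.0
    while t < T:
        # Branch on "failure occurs in [t, t + dt)".
        p_branch = p_survive * (1.0 - math.exp(-lam * dt))
        # Physics along this branch: nominal drain until t, doubled after.
        level = max(0.0, L0 - q * t - 2.0 * q * (T - t))
        mean += p_branch * level
        p_survive *= math.exp(-lam * dt)
        t += dt
    mean += p_survive * max(0.0, L0 - q * T)  # no-failure branch
    return mean

# Convergence study: refining the branching interval bounds the discretization
# error, indicating how much computation the aleatory dimension requires.
print(det_mean_level(1.0), det_mean_level(0.01))
```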
Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS
International Nuclear Information System (INIS)
Brown, C.S.; Zhang, Hongbin
2016-01-01
VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform these analyses. A 2 × 2 fuel assembly model was developed and simulated with VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
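The three correlation measures named in this record can be computed directly from UQ samples. The sketch below uses synthetic samples for two hypothetical inputs and one figure of merit; the variable names and the assumed linear response are illustrative, not VERA-CS output:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic UQ samples: two inputs and one figure of merit (e.g. MDNBR).
t_inlet = rng.normal(290.0, 2.0, 500)   # coolant inlet temperature (assumed)
power = rng.normal(100.0, 1.5, 500)     # core power (assumed)
mdnbr = (3.0 - 0.04 * (t_inlet - 290.0) - 0.01 * (power - 100.0)
         + rng.normal(0.0, 0.02, 500))

def rank(x):
    """Map samples to their ranks (ties ignored for continuous data)."""
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(len(x), dtype=float)
    return r

def residual(y, x):
    """Residual of y after a linear regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

pearson = np.corrcoef(t_inlet, mdnbr)[0, 1]
spearman = np.corrcoef(rank(t_inlet), rank(mdnbr))[0, 1]
# Partial correlation: correlate residuals after removing the other input.
partial = np.corrcoef(residual(t_inlet, power), residual(mdnbr, power))[0, 1]
print(pearson, spearman, partial)
```

All three coefficients come out strongly negative here, mirroring the abstract's finding that inlet temperature dominates the MDNBR response.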
Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria
2012-01-01
Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation (i.e. brain conductivity distributions) and stimulation (e.g. coil positioning and orientations) setup have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to the induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed to be "ground truth". This framework can be leveraged in order to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance to all applications of TMS, both investigative and therapeutic.
Varouchakis, Emmanouil; Hristopulos, Dionissios
2015-04-01
Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements in 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs
Value at risk (VaR) in uncertainty: Analysis with parametric method and Black & Scholes simulations
Directory of Open Access Journals (Sweden)
Humberto Banda Ortiz
2014-07-01
Full Text Available VaR is the most accepted risk measure worldwide and the leading reference in any risk management assessment. However, its methodology has important limitations which make it unreliable in contexts of crisis or high uncertainty. For this reason, the aim of this work is to test the accuracy of VaR when it is employed in contexts of volatility, for which we compare the VaR outcomes in scenarios of both stability and uncertainty, using the parametric method and a historical simulation based on data generated with the Black & Scholes model. The main objective of VaR is the prediction of the highest expected loss for any given portfolio, but even though it is considered a useful tool for risk management under conditions of market stability, we found that it is substantially inaccurate in contexts of crisis or high uncertainty. In addition, we found that the Black & Scholes simulations lead to underestimating the expected losses in comparison with the parametric method, and that those disparities increase substantially in times of crisis. In the first section of this work we present a brief context of risk management in finance. In Section II we present the existing literature on the VaR concept, its methods and applications. In Section III we describe the methodology and assumptions used in this work. Section IV is dedicated to presenting the findings. Finally, in Section V we present our conclusions.
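The two VaR estimates the article compares can be sketched side by side: a parametric (variance-covariance) VaR against a Monte Carlo VaR from geometric Brownian motion, i.e. Black & Scholes price dynamics. The portfolio value, drift, volatility, and horizon below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

V0 = 1_000_000.0          # portfolio value (assumed)
mu, sigma = 0.08, 0.25    # annualised drift and volatility (assumed)
h = 1.0 / 252             # one-day horizon
alpha = 0.99

# Parametric (variance-covariance) one-day VaR at 99%:
z = 2.3263478740408408    # Phi^{-1}(0.99), hard-coded to stay dependency-free
var_param = V0 * (z * sigma * np.sqrt(h) - mu * h)

# Monte Carlo VaR from geometric Brownian motion (Black & Scholes dynamics):
eps = rng.standard_normal(100_000)
V1 = V0 * np.exp((mu - 0.5 * sigma ** 2) * h + sigma * np.sqrt(h) * eps)
losses = V0 - V1
var_mc = np.quantile(losses, alpha)
print(var_param, var_mc)
```

At a one-day horizon the two estimates nearly coincide; the article's point is that this agreement breaks down under crisis conditions, when returns depart from the normal and lognormal assumptions built into both methods.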
Jie, M.; Zhang, J.; Guo, B. B.
2017-12-01
As a typical distributed hydrological model, the SWAT model also poses a challenge in calibrating parameters and analyzing their uncertainty. This paper chooses the Chaohe River Basin, China, as the study area. After the SWAT model is established and the DEM data of the Chaohe river basin are loaded, the watershed is automatically divided into several sub-basins. Land use, soil and slope are analyzed on the basis of the sub-basins, the hydrological response units (HRUs) of the study area are calculated, and, after running the SWAT model, the simulated runoff values in the watershed are obtained. On this basis, weather data and the known daily runoff of three hydrological stations, combined with the SWAT-CUP automatic program and the manual adjustment method, are used for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. Through the sensitivity analysis, calibration and uncertainty study of SWAT, the results indicate that the parameterization of the hydrological characteristics of the Chaohe river is successful and feasible and can be used to simulate the Chaohe river basin.
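The GLUE (generalized likelihood uncertainty estimation) step can be illustrated on a toy model, since SWAT itself is far too heavy to sample here: Monte Carlo parameter sampling, a Nash-Sutcliffe likelihood score, a behavioural threshold, and percentile prediction bounds. All numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_model(params, t):
    """Stand-in for a rainfall-runoff simulator with two uncertain parameters."""
    a, b = params
    return a * np.exp(-b * t)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, a common GLUE likelihood score."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.linspace(0.0, 5.0, 50)
obs = toy_model((2.0, 0.7), t) + rng.normal(0.0, 0.05, t.size)  # synthetic obs

# GLUE: Monte Carlo sampling of the parameter space, keep 'behavioural' sets
samples = rng.uniform([0.5, 0.1], [4.0, 2.0], size=(5000, 2))
scores = np.array([nse(obs, toy_model(p, t)) for p in samples])
behavioural = samples[scores > 0.8]        # threshold is a modelling choice

# prediction bounds from the behavioural ensemble
ens = np.array([toy_model(p, t) for p in behavioural])
lower, upper = np.percentile(ens, [5, 95], axis=0)
```

The width of the 5-95% band is the GLUE uncertainty estimate; tightening the behavioural threshold narrows it at the cost of fewer retained parameter sets.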
Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.
2016-02-01
The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show that APHRODITE has high accuracy at a monthly scale compared with the other products, and that GSMAP-MVK+ shows a clear advantage over TRMM3B42 in relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute to discharge uncertainty with a magnitude similar to that of the hydrological models. A
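The statistical scores named above are standard; a minimal sketch of RB, NSE, RMSE and CC for paired observed/simulated series, assuming only that both are plain arrays:

```python
import numpy as np

def relative_bias(obs, sim):
    """RB: fractional over/underestimation of the total amount."""
    return (sim.sum() - obs.sum()) / obs.sum()

def nse(obs, sim):
    """Nash-Sutcliffe coefficient of efficiency (1 is a perfect fit)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rmse(obs, sim):
    """Root mean square error, in the units of the data."""
    return np.sqrt(np.mean((obs - sim) ** 2))

def cc(obs, sim):
    """Pearson correlation coefficient."""
    return np.corrcoef(obs, sim)[0, 1]

# illustrative observed and simulated discharge series
obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
sim = np.array([1.2, 2.8, 2.1, 4.6, 4.3])
```

For these toy series the totals happen to match (RB = 0) while the point-by-point scores still register the misfit, which is why several complementary metrics are reported together.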
Swinburne, Thomas D.; Perez, Danny
2018-05-01
A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
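The Bayesian residence-time estimate can be sketched with a conjugate model: if escape times from the known state space are exponential, a Gamma prior on the escape rate yields a Gamma posterior given the observed event count and simulation time, from which residence-time uncertainty follows by sampling. The counts, times, and prior constants below are hypothetical, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical observations: n escape events over total accelerated-MD time T.
n_events, total_time = 12, 3.0e-9           # counts, seconds

# With exponential escape times, a Gamma(alpha0, beta0) prior on the escape
# rate k gives the conjugate posterior Gamma(alpha0 + n, beta0 + T).
alpha0, beta0 = 1.0, 1.0e-10
post_rate = rng.gamma(alpha0 + n_events, 1.0 / (beta0 + total_time),
                      size=100_000)

tau = 1.0 / post_rate                        # residence-time samples
tau_mean = tau.mean()
lo, hi = np.percentile(tau, [2.5, 97.5])     # 95% credible interval
```

The credible interval on the residence time is exactly the kind of uncertainty statement that a higher-scale kinetic Monte Carlo step can consume.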
Chiadamrong, N.; Piyathanavong, V.
2017-12-01
Models that aim to optimize the design of supply chain networks have gained increasing interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that run until the difference between subsequent solutions satisfies pre-determined termination criteria. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal, with much shorter solving times, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.
Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations
International Nuclear Information System (INIS)
Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.
2017-01-01
Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of the LFR. • For a 10% change in the core inlet flow, the response in thermal power is 0.58%. • For a 2.5% change in the inlet lead temperature, the response is 1.87% in power. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations with sample sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variation. The Advanced Lead Fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters due to effects of the mass flow and temperature of liquid lead. The ALFRED core mathematical model developed in this work is fully transient and takes into account the heat transfer in an annular fuel pellet design, the thermo-fluid behavior in the core, and the neutronic processes, which are modeled with point kinetics including fuel temperature and expansion feedback effects. The sensitivity evaluated in terms of the relative standard deviation (RSD) showed that for a 10% change in the core inlet flow, the response in thermal power change is 0.58%, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with mass flow rate as the predictor variable showed statistically valid cubic correlations for neutron flux, and a linear relationship for neutron flux as a function of the lead temperature. No statistically valid correlation was observed for the reactivity as a function of the mass flow rate or of the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.
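The single-input-variable sensitivity in terms of relative standard deviation can be sketched as follows. The linear surrogate for the core response is a made-up stand-in calibrated to the 0.58%-per-10% figure quoted above, not the actual ALFRED model.

```python
import numpy as np

rng = np.random.default_rng(3)

def core_power(inlet_flow, inlet_temp):
    """Toy linear surrogate for the coupled thermal response (illustrative)."""
    return 300.0 * (1.0
                    + 0.058 * (inlet_flow - 1.0)
                    - 0.748 * (inlet_temp - 673.0) / 673.0)

# single-input-variable variation: perturb the flow, hold temperature fixed
flows = rng.normal(1.0, 0.10, 2000)      # 10% (1-sigma) relative variation
power = core_power(flows, 673.0)

rsd_in = flows.std() / flows.mean()
rsd_out = power.std() / power.mean()
sensitivity = rsd_out / rsd_in           # relative response per unit input RSD
```

With the surrogate's slope set to 0.058, a 10% input RSD maps to roughly a 0.58% output RSD, mirroring the figure in the abstract.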
Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu
2005-01-01
Geostatistical stochastic simulation is often combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running times of spatially explicit forest models, a consequence of their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
Simulation of corn yields and parameter uncertainty analysis in Hebei and Sichuan, China
Fu, A.; Xue, Y.; Hartman, M. D.; Chandran, A.; Qiu, B.; Liu, Y.
2016-12-01
Corn is one of the most important agricultural products in China. Research on the impacts of climate change and human activities on corn yields is important for understanding and mitigating the negative effects of environmental factors on corn yields and maintaining stable corn production. Using climatic data, including daily temperature, precipitation, and solar radiation from 1948 to 2010, soil properties, observed corn yields, and farmland management information, corn yields in Sichuan and Hebei Provinces of China over the past 63 years were simulated using the Daycent model, and the results were evaluated using root mean square error, bias, simulation efficiency, and standard deviation. The primary climatic factors influencing corn yields were examined, the uncertainties of climatic factors were analyzed, and the uncertainties of human activity parameters were also studied by changing fertilization levels and cultivation methods. The results showed that: (1) the Daycent model is capable of simulating corn yields in Sichuan and Hebei provinces of China; observed and simulated corn yields show a similar increasing trend with time. (2) The minimum daily temperature is the primary factor influencing corn yields in Sichuan; in Hebei Province, daily temperature, precipitation and wind speed significantly affect corn yields. (3) When the global warming trend was removed from the original data, simulated corn yields were lower than before, decreasing by about 687 kg/hm2 from 1992 to 2010; when the fertilization level and cultivation method were increased and decreased by 50% and 75%, respectively, in the schedule file of the Daycent model, the simulated corn yields increased by 1206 kg/hm2 and 776 kg/hm2, respectively, with the enhancement of the fertilization level and the improvement of the cultivation method. This study provides a scientific basis for selecting a suitable fertilization level and cultivation method in corn fields in China.
Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations
International Nuclear Information System (INIS)
Shaukata, Nadeem; Shim, Hyung Jin
2015-01-01
In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the initial number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code; in this mode, a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has also been used in the TART code for dynamic criticality calculations, where, for super-critical systems, the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is included in TDMC at the end of each time boundary, and a scale factor is introduced to tally the desired neutron density there. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of
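Uniform combing with a compensating scale factor can be sketched as below; the bank size and the unit weights are illustrative, and real codes comb weighted particle banks in the same spirit.

```python
import numpy as np

rng = np.random.default_rng(5)

def comb_population(weights, target_size):
    """Uniformly comb a grown neutron bank back to target_size survivors.
    The returned scale factor preserves the total weight in later tallies."""
    total = weights.sum()
    scale = total / target_size
    # equally spaced comb teeth with one random offset over cumulative weight
    teeth = (rng.random() + np.arange(target_size)) * scale
    survivors = np.searchsorted(np.cumsum(weights), teeth)
    return survivors, scale

# super-critical growth: the bank has swollen from 1000 to 2700 neutrons
bank = np.ones(2700)
keep, scale = comb_population(bank, 1000)
```

Multiplying subsequent tallies by the scale factor (2.7 here) keeps the neutron-density estimate unbiased; the variance introduced by this factor is the propagated uncertainty the abstract studies.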
Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations
Energy Technology Data Exchange (ETDEWEB)
Shaukata, Nadeem; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)
2015-10-15
In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons: randomly selected neutrons are discarded until the size of the neutron population matches the initial number of neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code; in this mode, a sequential population control mechanism has been proposed for modeling prompt super-critical systems. A Monte Carlo method has also been used in the TART code for dynamic criticality calculations, where, for super-critical systems, the neutron population is allowed to grow over a period of time and is then uniformly combed to return it to the population size at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems there is an exponential growth of the neutron population in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth, a conventional time cut-off population control strategy is included in TDMC at the end of each time boundary, and a scale factor is introduced to tally the desired neutron density there. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems; this uncertainty is caused by the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of
Iskandarani, Mohamed; Wang, Shitao; Srinivasan, Ashwanth; Carlisle Thacker, W.; Winokur, Justin; Knio, Omar
2016-01-01
We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
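Of the four proxy techniques, Gaussian Process Regression is the simplest to sketch. Below is a minimal squared-exponential GPR used as a model proxy over one uncertain input; the training function and hyperparameters are illustrative stand-ins for expensive plume-model runs.

```python
import numpy as np

def gpr_fit_predict(X, y, Xs, length=1.0, sigma_f=1.0, noise=1e-6):
    """Gaussian Process Regression with a squared-exponential kernel,
    used as a cheap proxy for an expensive simulator."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sigma_f**2 * np.exp(-0.5 * (d / length) ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)                  # posterior mean
    var = sigma_f**2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# train on a handful of 'expensive' runs, then sample the proxy densely
X = np.linspace(0.0, 1.0, 8)
y = np.sin(2 * np.pi * X)                  # stand-in simulator output
Xs = np.linspace(0.0, 1.0, 200)
mean, var = gpr_fit_predict(X, y, Xs, length=0.2)
```

Once fitted, evaluating the proxy at thousands of input samples costs almost nothing, which is what makes reliable output statistics affordable compared with a 50,000-member Monte Carlo ensemble.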
Iskandarani, Mohamed
2016-04-22
We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.
2018-01-01
Although high performance computers and advanced numerical methods have made the application of fully-integrated surface and subsurface flow and transport models such as HydroGeoSphere common place, run times for large complex basin models can still be on the order of days to weeks, thus, limiting the usefulness of traditional workhorse algorithms for uncertainty quantification (UQ) such as Latin Hypercube simulation (LHS) or Monte Carlo simulation (MCS), which generally require thousands of simulations to achieve an acceptable level of accuracy. In this paper we investigate non-intrusive polynomial chaos for uncertainty quantification, which in contrast to random sampling methods (e.g., LHS and MCS), represents a model response of interest as a weighted sum of polynomials over the random inputs. Once a chaos expansion has been constructed, approximating the mean, covariance, probability density function, cumulative distribution function, and other common statistics as well as local and global sensitivity measures is straightforward and computationally inexpensive, thus making PCE an attractive UQ method for hydrologic models with long run times. Our polynomial chaos implementation was validated through comparison with analytical solutions as well as solutions obtained via LHS for simple numerical problems. It was then used to quantify parametric uncertainty in a series of numerical problems with increasing complexity, including a two-dimensional fully-saturated, steady flow and transient transport problem with six uncertain parameters and one quantity of interest; a one-dimensional variably-saturated column test involving transient flow and transport, four uncertain parameters, and two quantities of interest at 101 spatial locations and five different times each (1010 total); and a three-dimensional fully-integrated surface and subsurface flow and transport problem for a small test catchment involving seven uncertain parameters and three quantities of interest at
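A minimal non-intrusive PCE over one standard-normal input shows how mean and variance drop directly out of the expansion coefficients, which is the property that makes the statistics "straightforward and computationally inexpensive"; the toy model and truncation order below are illustrative.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(9)

def model(x):
    """Stand-in for a long-running simulator with one Gaussian input."""
    return np.exp(0.3 * x)

# non-intrusive PCE: regress sampled outputs on probabilists' Hermite polys
order = 6
x = rng.standard_normal(400)
V = np.stack([He.hermeval(x, np.eye(order + 1)[k])   # He_k(x) design columns
              for k in range(order + 1)], axis=1)
coef, *_ = np.linalg.lstsq(V, model(x), rcond=None)

# mean and variance follow from the coefficients, since E[He_k^2] = k!
pce_mean = coef[0]
pce_var = sum(coef[k] ** 2 * math.factorial(k) for k in range(1, order + 1))
```

For this model the exact mean is exp(0.045) ≈ 1.046 and the exact variance ≈ 0.103, so the truncated expansion recovers both essentially for free once the coefficients are fitted.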
An Indirect Simulation-Optimization Model for Determining Optimal TMDL Allocation under Uncertainty
Directory of Open Access Journals (Sweden)
Feng Zhou
2015-11-01
Full Text Available An indirect simulation-optimization model framework with enhanced computational efficiency and risk-based decision-making capability was developed to determine optimal total maximum daily load (TMDL) allocation under uncertainty. To convert the traditional direct simulation-optimization model into our indirect equivalent model framework, we proposed a two-step strategy: (1) application of interval regression equations derived by a Bayesian recursive regression tree (BRRT v2) algorithm, which approximates the original hydrodynamic and water-quality simulation models and accurately quantifies the inherent nonlinear relationship between nutrient load reductions and the credible interval of algal biomass with a given confidence interval; and (2) incorporation of the calibrated interval regression equations into an uncertain optimization framework, which is further converted to our indirect equivalent framework by the enhanced-interval linear programming (EILP) method and provides approximate-optimal solutions at various risk levels. The proposed strategy was applied to the Swift Creek Reservoir's nutrient TMDL allocation (Chesterfield County, VA) to identify the minimum nutrient load allocations required from eight sub-watersheds to ensure compliance with user-specified chlorophyll criteria. Our results indicated that the BRRT-EILP model could identify critical sub-watersheds faster than the traditional model and requires lower reductions of nutrient loadings compared to traditional stochastic simulation and trial-and-error (TAE) approaches. This suggests that our proposed framework performs better in optimal TMDL development compared to the traditional simulation-optimization models and provides extreme and non-extreme tradeoff analysis under uncertainty for risk-based decision making.
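The interval-programming ingredient can be illustrated at its simplest: bounding an objective whose coefficients are intervals at a fixed allocation. This is a far smaller exercise than the EILP formulation in the paper; the per-sub-watershed costs and the candidate allocation are hypothetical.

```python
import numpy as np

def interval_objective_bounds(c_lo, c_hi, x):
    """Best/worst-case cost when objective coefficients lie in the
    intervals [c_lo, c_hi] and the decision vector x is fixed."""
    x = np.asarray(x, dtype=float)
    best = np.where(x >= 0, c_lo, c_hi) @ x    # cheapest admissible coefficients
    worst = np.where(x >= 0, c_hi, c_lo) @ x   # dearest admissible coefficients
    return best, worst

# hypothetical unit costs of nutrient load reduction, as intervals
c_lo = np.array([2.0, 3.0, 1.5])
c_hi = np.array([2.6, 4.1, 2.0])
x = np.array([10.0, 5.0, 8.0])                 # candidate allocation
best, worst = interval_objective_bounds(c_lo, c_hi, x)
```

EILP effectively pairs such best/worst-case sub-problems with the interval regression constraints, so each risk level corresponds to a different point between the two bounds.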
Morino, Yu; Ohara, Toshimasa; Yumimoto, Keiya
2014-05-01
Chemical transport models (CTM) played key roles in understanding the atmospheric behaviors and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant (FDNPP) after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. In this study, we assessed uncertainties of atmospheric simulations by comparing observed and simulated deposition of radiocesium (137Cs) and radioiodine (131I). Airborne monitoring survey data were used to assess the model performance for 137Cs deposition patterns. We found that simulation using emissions estimated with a regional-scale (~500 km) CTM better reproduced the observed 137Cs deposition pattern in eastern Japan than simulation using emissions estimated with a local-scale (~50 km) or global-scale CTM. In addition, we estimated the emission amount of 137Cs from FDNPP by combining a CTM, a priori source term, and observed deposition data. This is the first use of airborne survey data of 137Cs deposition (more than 16,000 data points) as the observational constraints in inverse modeling. The model simulation driven by the a posteriori source term achieved better agreement with 137Cs depositions measured by aircraft survey and at in-situ stations over eastern Japan. The wet deposition module was also evaluated. Simulation using a process-based wet deposition module reproduced the observations well, whereas simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed 137Cs deposition rates in high-deposition areas (≥10 kBq m-2) within one order of magnitude. Recently, a 131I deposition map was released, which helped to evaluate model performance for 131I deposition patterns. The observed 131I/137Cs deposition ratio is higher in areas southwest of FDNPP than northwest of it, and this behavior was roughly reproduced by a CTM if we assume that released 131I is more in gas phase
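The scavenging-coefficient approach mentioned above amounts to first-order removal of the airborne tracer during rain. A minimal sketch, with an illustrative concentration and an assumed (empirical) coefficient:

```python
import numpy as np

def washout(conc, scav_coeff, dt):
    """First-order scavenging dC/dt = -Lambda * C over a time step dt;
    returns the remaining air concentration and the deposited amount."""
    removed = conc * (1.0 - np.exp(-scav_coeff * dt))
    return conc - removed, removed

c0 = 100.0              # 137Cs air concentration, Bq m-3 (illustrative)
lam = 1.0e-4            # scavenging coefficient, s-1 (empirical choice)
c1, dep = washout(c0, lam, 3600.0)   # one hour of rain
```

Because Lambda is an empirical fit (often a power law of rain rate), small changes in it translate directly into large deposition differences, which is the uncertainty the process-based module avoids.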
A framework to quantify uncertainty in simulations of oil transport in the ocean
Gonçalves, Rafael C.
2016-03-02
An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.
Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.
2018-01-01
An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of meteorological projections disclose that incremental trends of temperature (e.g., minimum and maximum values) and precipitation exist. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including target and shortage); (ii) water shortage would be enhanced from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates are. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and helping identify appropriate water resources management strategies hedging against drought.
Energy Technology Data Exchange (ETDEWEB)
Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)
2013-07-01
Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity, using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant converges to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
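A stripped-down version of hierarchical interpolation with linear hat functions in one dimension can look like the sketch below: refinement stops when the largest hierarchical surplus (the correction the new points would add) falls under a threshold. Unlike the paper's method, this sketch refines uniformly rather than adaptively and has no Smolyak construction; the test function is illustrative.

```python
import numpy as np

def hat_surrogate(f, lo, hi, tol=1e-3, max_level=12):
    """Hierarchical surrogate on equidistant (Newton-Cotes) points with a
    linear hat basis: refine until the largest hierarchical surplus < tol."""
    xs, ys = [lo, hi], [f(lo), f(hi)]
    for _ in range(max_level):
        order = np.argsort(xs)
        gx, gy = np.array(xs)[order], np.array(ys)[order]
        new_x = 0.5 * (gx[:-1] + gx[1:])               # midpoints of all cells
        fy = f(new_x)
        surplus = fy - 0.5 * (gy[:-1] + gy[1:])        # hierarchical surplus
        if np.max(np.abs(surplus)) < tol:
            break
        xs.extend(new_x.tolist())
        ys.extend(fy.tolist())
    order = np.argsort(xs)
    gx, gy = np.array(xs)[order], np.array(ys)[order]
    return lambda x: np.interp(x, gx, gy)              # piecewise-linear proxy

surrogate = hat_surrogate(lambda x: np.sin(x), 0.0, np.pi)
```

The surplus doubles as an error indicator, which is what the true adaptive scheme exploits to refine only where the code's response is rough.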
A framework to quantify uncertainty in simulations of oil transport in the ocean
Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar
2016-01-01
An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.
Energy Technology Data Exchange (ETDEWEB)
Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science
2017-05-05
The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
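The agree-on-the-overlap update can be sketched on a toy 1D Laplace problem with two overlapping subdomains: many concurrent samples carry uncertain interface values, and averaging them each sweep plays the role of the statistical update that shrinks the uncertainty. Sizes, noise levels, and the exact update rule are arbitrary choices, not the project's actual scheme.

```python
import numpy as np

rng = np.random.default_rng(11)

def solve_subdomain(left, right, n):
    """Exact solve of u'' = 0 on a subdomain given its boundary values."""
    return np.linspace(left, right, n)

# Global problem: u'' = 0 on [0, 1], u(0) = 0, u(1) = 1 (solution u = x).
# Left subdomain covers [0, 0.6], right subdomain covers [0.4, 1].
n_samples = 64
a = rng.normal(0.6, 0.2, n_samples)   # uncertain right BC of left subdomain
b = rng.normal(0.4, 0.2, n_samples)   # uncertain left BC of right subdomain
for sweep in range(30):
    u_left = np.array([solve_subdomain(0.0, ai, 31) for ai in a])
    u_right = np.array([solve_subdomain(bi, 1.0, 31) for bi in b])
    # neighbouring solutions must agree on the overlap: average the samples
    a = np.full(n_samples, u_right[:, 10].mean())  # value at x = 0.6
    b = np.full(n_samples, u_left[:, 20].mean())   # value at x = 0.4

interface_a, interface_b = a[0], b[0]
```

The iteration contracts toward the exact interface values u(0.6) = 0.6 and u(0.4) = 0.4 regardless of the initial boundary-condition noise, illustrating how agreement constraints plus averaging recover the global solution.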
Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred
2018-01-01
Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
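The Bayesian calibration step can be sketched with a random-walk Metropolis sampler on a toy one-parameter GPP model; Biome-BGC itself is not reproduced here, and the observation noise, prior bounds, and proposal width are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulator(param, t):
    """Toy stand-in for Biome-BGC: one parameter scaling a seasonal GPP cycle."""
    return param * np.maximum(np.sin(2 * np.pi * t / 365.0), 0.0)

t = np.arange(365.0)
obs = simulator(8.0, t) + rng.normal(0.0, 0.5, t.size)   # 'flux tower' GPP

def log_post(p, sigma=0.5):
    if not (0.0 < p < 20.0):                 # flat prior on (0, 20)
        return -np.inf
    resid = obs - simulator(p, t)
    return -0.5 * np.sum((resid / sigma) ** 2)

# random-walk Metropolis over the single parameter
chain, p = [], 2.0
lp = log_post(p)
for _ in range(5000):
    q = p + rng.normal(0.0, 0.2)             # proposal step
    lq = log_post(q)
    if np.log(rng.random()) < lq - lp:       # accept/reject
        p, lp = q, lq
    chain.append(p)
post = np.array(chain[1000:])                # discard burn-in
```

The posterior sample quantifies parameter uncertainty directly, and pushing those samples back through the simulator gives the uncertainty on simulated GPP that the study reports.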
Directory of Open Access Journals (Sweden)
R. Raj
2018-01-01
Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP) data. We used GPP partitioned from flux tower measurements of a net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and carbon and nitrogen allocation in the forest. The calibration improved the root mean square error and enhanced Nash–Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle for flux tower GPP was not reproduced exactly and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
International Nuclear Information System (INIS)
Stscherbak, J.
1988-01-01
In unusually frank terms the author, a journalist and epidemiologist, describes the catastrophe of Chernobyl as the 'most pathetic and important' experience of the Soviet people after World War II. Documents, interviews and statements of persons concerned trace the disaster of those days that surpasses imagination and describe how individual persons witnessed the coming true of visions of terror. (orig./HSCH) [de
Directory of Open Access Journals (Sweden)
R. Uijlenhoet
2008-03-01
As rainfall constitutes the main source of water for the terrestrial hydrological processes, accurate and reliable measurement and prediction of its spatial and temporal distribution over a wide range of scales is an important goal for hydrology. We investigate the potential of ground-based weather radar to provide such measurements through a theoretical analysis of some of the associated observation uncertainties. A stochastic model of range profiles of raindrop size distributions is employed in a Monte Carlo simulation experiment to investigate the rainfall retrieval uncertainties associated with weather radars operating at X-, C-, and S-band. We focus in particular on the errors and uncertainties associated with rain-induced signal attenuation and its correction for incoherent, non-polarimetric, single-frequency, operational weather radars. The performance of two attenuation correction schemes, the (forward) Hitschfeld-Bordan algorithm and the (backward) Marzoug-Amayenc algorithm, is analyzed for both moderate (assuming a 50 km path length) and intense Mediterranean rainfall (for a 30 km path). A comparison shows that the backward correction algorithm is more stable and accurate than the forward algorithm (with a bias in the order of a few percent for the former, compared to tens of percent for the latter), provided reliable estimates of the total path-integrated attenuation are available. Moreover, the bias and root mean square error associated with each algorithm are quantified as a function of path-averaged rain rate and distance from the radar in order to provide a plausible order of magnitude for the uncertainty in radar-retrieved rain rates for hydrological applications.
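A minimal gate-by-gate forward correction in the spirit of the (forward) Hitschfeld-Bordan scheme can be sketched as follows; the k-Z power-law coefficients, gate spacing, and rain profile are all hypothetical, and with exact coefficients and no noise the scheme recovers the profile, while its well-known instability appears only for large path-integrated attenuation or mis-specified coefficients.

```python
import numpy as np

# Toy forward attenuation correction along a single ray.
a, b = 3e-4, 0.7          # k = a * Z**b  [dB/km]; illustrative values
dr = 0.5                  # gate spacing [km]
n_gates = 60              # 30 km path

z_true = np.full(n_gates, 1e4)            # uniform 40 dBZ rain
k_true = a * z_true ** b                  # specific attenuation [dB/km]
# Two-way path-integrated attenuation up to (not including) each gate.
pia = 2.0 * dr * np.concatenate(([0.0], np.cumsum(k_true)[:-1]))
z_meas_db = 10 * np.log10(z_true) - pia   # attenuated measurement [dBZ]

# Forward correction: march outward, rebuilding the path-integrated
# attenuation from the already-corrected gates. (This scheme becomes
# unstable for large total attenuation, the weakness the backward
# Marzoug-Amayenc formulation avoids.)
est_pia = 0.0
z_corr_db = np.empty(n_gates)
for i in range(n_gates):
    z_corr_db[i] = z_meas_db[i] + est_pia
    k_est = a * (10 ** (z_corr_db[i] / 10)) ** b
    est_pia += 2.0 * dr * k_est

print(np.max(np.abs(z_corr_db - 40.0)))   # ~0 with exact coefficients
```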
Effects of Input Data Content on the Uncertainty of Simulating Water Resources
Directory of Open Access Journals (Sweden)
Carla Camargos
2018-05-01
The widely used, partly-deterministic Soil and Water Assessment Tool (SWAT) requires a large amount of spatial input data, such as a digital elevation model (DEM), land use, and soil maps. Modelers make an effort to apply the most specific data possible for the study area to reflect the heterogeneous characteristics of landscapes. Regional data, especially with fine resolution, is often preferred. However, such data is not always available and can be computationally demanding. Despite being coarser, global data are usually free and available to the public. Previous studies have revealed the importance of the choice of input maps for individual investigations. However, it remains unknown whether more detailed data necessarily lead to more reliable results. This study investigates how global and regional input datasets affect parameter uncertainty when estimating river discharges. We analyze eight different setups of the SWAT model for a catchment in Luxembourg, combining different land-use, elevation, and soil input data. The Metropolis–Hastings Markov Chain Monte Carlo (MCMC) algorithm is used to infer posterior model parameter uncertainty. We conclude that our higher-resolved DEM improves the general model performance in reproducing low flows by 10%. The less detailed soil map improved the fit of low flows by 25%. In addition, more detailed land-use maps reduce the bias of the model discharge simulations by 50%. Also, despite presenting similar parameter uncertainty (P-factor ranging from 0.34 to 0.41 and R-factor from 0.41 to 0.45) for all setups, the results show a disparate parameter posterior distribution. This indicates that, when not all sources of uncertainty are assessed simultaneously, the unassessed sources are compensated for by the fitted parameter values. We conclude that our results can give some guidance for future SWAT applications in the selection of the degree of detail for input data.
International Nuclear Information System (INIS)
Kristof, Marian; Kliment, Tomas; Petruzzi, Alessandro; Lipka, Jozef
2009-01-01
Licensing calculations in a majority of countries worldwide still rely on a combined approach that uses a best-estimate computer code without evaluation of the uncertainty of the code models, together with conservative initial and boundary conditions, conservative assumptions on the availability of systems and components, and additional conservative assumptions. However, the best estimate plus uncertainty (BEPU) approach, representing the state of the art in the area of safety analysis, has a clear potential to replace the currently used combined approach. There are several applications of the BEPU approach in the area of licensing calculations, but some questions are still under discussion, notably from the regulatory point of view. In order to find a proper solution to these questions and to support the BEPU approach in becoming a standard approach for licensing calculations, a broad comparison of both approaches for various transients is necessary. Results of one such comparison, on the example of the VVER-440/213 NPP pressurizer surge line break event, are described in this paper. A Kv-scaled simulation based on the PH4-SLB experiment from the PMK-2 integral test facility, applying its volume and power scaling factor, is performed for qualitative assessment of the RELAP5 computer code calculation using the VVER-440/213 plant model. Existing hardware differences are identified and explained. The CIAU method is adopted for performing the uncertainty evaluation. Results using the combined and BEPU approaches are in agreement with the experimental values from the PMK-2 facility. Only a minimal difference between the combined and BEPU approaches has been observed in the evaluation of the safety margins for the peak cladding temperature. Benefits of the CIAU uncertainty method are highlighted.
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
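The analysis-of-variance decomposition used in this study, over a fully crossed ensemble of forcing realizations and parameter sets, can be sketched as follows; the numbers are synthetic stand-ins for the streamflow simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fully crossed ensemble: n_f rainfall realizations x n_p parameter
# sets, one simulated flow statistic per combination (synthetic data
# built with a strong forcing effect and a weaker parameter effect).
n_f, n_p = 20, 30
forcing_eff = rng.normal(0, 2.0, n_f)
param_eff = rng.normal(0, 1.0, n_p)
inter = rng.normal(0, 0.3, (n_f, n_p))    # small interaction term
q = 10.0 + forcing_eff[:, None] + param_eff[None, :] + inter

grand = q.mean()
ss_total = ((q - grand) ** 2).sum()
ss_f = n_p * ((q.mean(axis=1) - grand) ** 2).sum()   # forcing main effect
ss_p = n_f * ((q.mean(axis=0) - grand) ** 2).sum()   # parameter main effect
ss_int = ss_total - ss_f - ss_p                      # interaction remainder

for name, ss in [("forcing", ss_f), ("parameters", ss_p),
                 ("interaction", ss_int)]:
    print(f"{name}: {ss / ss_total:.1%} of total variance")
```

The three sums of squares partition the total ensemble variance into the contributions the abstract lists: forcing data, parameter identification, and their interaction.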
muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations
Rosen, Paul
2016-05-23
In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthening our collaboration with the biomedical engineers attempting to understand the phenomenon.
muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations
Rosen, Paul; Burton, Brett; Potter, Kristin; Johnson, Chris R.
2016-01-01
In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthening our collaboration with the biomedical engineers attempting to understand the phenomenon.
Effects of Uncertainties in Electric Field Boundary Conditions for Ring Current Simulations
Chen, Margaret W.; O'Brien, T. Paul; Lemon, Colby L.; Guild, Timothy B.
2018-01-01
Physics-based simulation results can vary widely depending on the applied boundary conditions. As a first step toward assessing the effect of boundary conditions on ring current simulations, we analyze the uncertainty of the cross-polar cap potential (CPCP) electric field boundary conditions applied to the Rice Convection Model-Equilibrium (RCM-E). The empirical Weimer model of CPCP is chosen as the reference model and Defense Meteorological Satellite Program CPCP measurements as the reference data. Using temporal correlations from a statistical analysis of the "errors" between the reference model and data, we construct a Monte Carlo CPCP discrete time series model that can be generalized to other model boundary conditions. RCM-E simulations using electric field boundary conditions from the reference model and from 20 randomly generated Monte Carlo discrete time series of CPCP are performed for two large storms. During the 10 August 2000 storm main phase, the proton density at 10 RE at midnight was observed to be low, and the observed Dst index is bounded by the simulated Dst values. In contrast, the simulated Dst values during the recovery phases of the 10 August 2000 and 31 August 2005 storms tend to underestimate systematically the observed late Dst recovery. This suggests a need to improve the accuracy of particle loss calculations in the RCM-E model. Application of this technique can help modelers make efficient choices between investing more effort in improving the specification of boundary conditions and improving the descriptions of physical processes.
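Generating Monte Carlo discrete time series with prescribed temporal correlation, as done here for the CPCP boundary condition, can be sketched with an AR(1) error model; the reference series, correlation, and error magnitude below are illustrative assumptions, not the statistics estimated in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

def ar1_series(n, rho, sigma, rng):
    """Discrete time series with lag-1 autocorrelation rho and
    stationary standard deviation sigma (AR(1) process)."""
    e = np.empty(n)
    e[0] = rng.normal(0.0, sigma)
    innov_sd = sigma * np.sqrt(1.0 - rho ** 2)
    for t in range(1, n):
        e[t] = rho * e[t - 1] + rng.normal(0.0, innov_sd)
    return e

# Hypothetical reference CPCP history [kV] (stand-in for Weimer).
n_steps = 5000
cpcp_ref = 60 + 20 * np.sin(np.linspace(0, 20 * np.pi, n_steps))

# 20 Monte Carlo realizations of temporally correlated "error" series,
# each producing an alternative boundary-condition history.
rho, sigma = 0.9, 8.0
ensemble = np.array([cpcp_ref + ar1_series(n_steps, rho, sigma, rng)
                     for _ in range(20)])

errs = ensemble - cpcp_ref
lag1 = np.mean([np.corrcoef(e[:-1], e[1:])[0, 1] for e in errs])
print(f"mean sample lag-1 autocorrelation: {lag1:.2f}")  # near 0.9
```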
Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.
2011-12-01
Distributed hydrological models are important tools in water management as they account for the spatial variability of the hydrological data, as well as being able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte-Carlo simulations, and the region of space containing the parameter sets that were considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
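The Monte Carlo selection of behavioral parameter sets can be illustrated with a GLUE-style filter on a two-parameter toy bucket model; the model equations, rainfall statistics, and efficiency threshold are assumptions for illustration and not WASMOD.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal two-parameter bucket model (illustrative only): a fraction
# `a` of rainfall fills a storage that drains at rate `c`.
def bucket(a, c, rain):
    s = 0.0
    q = np.empty(rain.size)
    for t, p in enumerate(rain):
        s += a * p
        q[t] = c * s
        s -= q[t]
    return q

rain = rng.gamma(0.5, 4.0, 365)          # synthetic daily rainfall
truth = bucket(0.6, 0.2, rain)           # "observed" discharge
q_obs = truth + rng.normal(0.0, 0.02 * truth.std(), truth.size)

def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Monte Carlo exploration: sample the parameter space uniformly and
# keep the "behavioral" sets exceeding an efficiency threshold.
samples = rng.uniform([0.0, 0.0], [1.0, 1.0], (2000, 2))
scores = np.array([nse(bucket(a, c, rain), q_obs) for a, c in samples])
behavioral = samples[scores > 0.8]
print(behavioral.shape[0], behavioral.mean(axis=0))
```

The retained cloud of behavioral sets is the kind of region the paper then delimits geometrically with alpha-shapes.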
Directory of Open Access Journals (Sweden)
S. Arnold
2009-10-01
In this paper we develop and apply a conceptual ecohydrological model to investigate the effects of model structure and parameter uncertainty on the simulation of vegetation structure and hydrological dynamics. The model is applied for a typical water-limited riparian ecosystem along an ephemeral river: the middle section of the Kuiseb River in Namibia. We modelled this system by coupling an ecological model with a conceptual hydrological model. The hydrological model is storage based with stochastic forcing from the flood. The ecosystem is modelled with a population model, and represents three dominating riparian plant populations. In appreciation of uncertainty about population dynamics, we applied three model versions with increasing complexity. Population parameters were found by Latin hypercube sampling of the parameter space and with the constraint that three species should coexist as observed. Two of the three models were able to reproduce the observed coexistence. However, both models relied on different coexistence mechanisms, and reacted differently to a change of long-term memory in the flood forcing. The coexistence requirement strongly constrained the parameter space for both successful models. Only very few parameter sets (0.5% of 150,000 samples) allowed for coexistence in a representative number of repeated simulations (at least 10 out of 100), and the success of the coexistence mechanism was controlled by the combination of population parameters. The ensemble statistics of average values of hydrologic variables like transpiration and depth to ground water were similar for both models, suggesting that they were mainly controlled by the applied hydrological model. The ensemble statistics of the fluctuations of depth to groundwater and transpiration, however, differed significantly, suggesting that they were controlled by the applied ecological model and coexistence mechanisms. Our study emphasizes that uncertainty about ecosystem
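Latin hypercube sampling of a parameter space followed by constraint filtering, as used above, can be sketched as follows; the four-dimensional space and the stand-in "coexistence" constraint are hypothetical, not the population-model constraint of the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def latin_hypercube(n, dims, rng):
    """n stratified samples in [0, 1)^dims: one sample per stratum
    along each coordinate, in random order (standard LHS)."""
    strata = np.tile(np.arange(n), (dims, 1))
    u = (rng.permuted(strata, axis=1).T + rng.uniform(size=(n, dims))) / n
    return u

# Sample a hypothetical 4-parameter space and keep only sets that
# satisfy a stand-in feasibility constraint (in the paper, the
# constraint is that all three species persist in repeated runs).
n = 150_000
samples = latin_hypercube(n, 4, rng)
feasible = (samples[:, 0] > samples[:, 1]) \
    & (samples[:, 2] * samples[:, 3] < 0.05)
kept = samples[feasible]
print(f"{kept.shape[0] / n:.2%} of samples satisfy the constraint")
```

Stratification guarantees that every marginal is covered evenly, which is why LHS explores high-dimensional parameter spaces more efficiently than plain random sampling at the same budget.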
Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow
Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca
2017-11-01
The performance characterization of complex engineering systems often relies on accurate, but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite the great improvement in recent years, even the most advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
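The idea of leveraging low/high-fidelity correlation can be sketched with a simple control-variate estimator, the building block underlying multi-level and multi-fidelity MC; both models below are analytic stand-ins, not turbulence solvers.

```python
import numpy as np

rng = np.random.default_rng(11)

# High-fidelity model (expensive in practice, a stand-in here) and a
# cheap, correlated low-fidelity model of the same quantity.
def hf(x):
    return np.sin(x) + 0.1 * x ** 2

def lf(x):
    return np.sin(x)            # correlated but biased surrogate

n_hf, n_lf = 100, 10000        # few HF runs, many LF runs
x_hf = rng.uniform(0, 1, n_hf)
x_lf = rng.uniform(0, 1, n_lf)

y_hf, y_lf_paired = hf(x_hf), lf(x_hf)   # LF evaluated at the HF inputs
y_lf = lf(x_lf)

# Control-variate estimator: correct the small HF sample mean with the
# difference between the well-resolved and the paired LF means.
cov = np.cov(y_hf, y_lf_paired)
alpha = cov[0, 1] / cov[1, 1]            # estimated optimal CV weight
est = y_hf.mean() + alpha * (y_lf.mean() - y_lf_paired.mean())

exact = (1 - np.cos(1.0)) + 0.1 / 3      # E[sin x] + 0.1 E[x^2], U(0,1)
print(abs(est - exact))                  # small error from 100 HF runs
```

Because the LF model absorbs most of the variance, the estimator's error is driven mainly by the cheap LF sample, exactly the acceleration the abstract describes, without adding HF realizations.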
Direct catastrophic injury in sports.
Boden, Barry P
2005-11-01
Catastrophic sports injuries are rare but tragic events. Direct (traumatic) catastrophic injury results from participating in the skills of a sport, such as a collision in football. Football is associated with the greatest number of direct catastrophic injuries for all major team sports in the United States. Pole vaulting, gymnastics, ice hockey, and football have the highest incidence of direct catastrophic injuries for sports in which males participate. In most sports, the rate of catastrophic injury is higher at the collegiate than at the high school level. Cheerleading is associated with the highest number of direct catastrophic injuries for all sports in which females participate. Indirect (nontraumatic) injury is caused by systemic failure as a result of exertion while participating in a sport. Cardiovascular conditions, heat illness, exertional hyponatremia, and dehydration can cause indirect catastrophic injury. Understanding the common mechanisms of injury and prevention strategies for direct catastrophic injuries is critical in caring for athletes.
Janardhanan, S.; Datta, B.
2011-12-01
Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial-intelligence-based models are most often used for this purpose, where they are trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain. Under these circumstances the application of such approximation surrogates becomes limited. In our study we develop a surrogate-model-based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and the aquifer recharge are considered uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of
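An ensemble of surrogates trained on bootstrap resamples of simulator runs can be sketched as follows, with polynomial fits standing in for the genetic-programming surrogates and an analytic function standing in for FEMWATER; everything is illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical stand-in for an expensive simulator response, e.g. a
# salinity-like response to a pumping rate.
def expensive_model(x):
    return np.exp(-x) * np.sin(3 * x) + 0.5 * x

# Training data from a moderate number of simulator runs.
x_train = rng.uniform(0, 2, 80)
y_train = expensive_model(x_train)

# Ensemble of surrogates: each member is fit to a non-parametric
# bootstrap resample of the original run set.
n_members = 50
members = []
for _ in range(n_members):
    idx = rng.integers(0, x_train.size, x_train.size)  # resample w/ repl.
    members.append(np.polyfit(x_train[idx], y_train[idx], deg=7))

x_new = np.linspace(0.1, 1.9, 7)
preds = np.array([np.polyval(c, x_new) for c in members])
mean, spread = preds.mean(axis=0), preds.std(axis=0)
print(np.max(np.abs(mean - expensive_model(x_new))))
```

The member-to-member spread gives a cheap indication of surrogate uncertainty across the decision/parameter space, which is what makes the ensemble useful inside the optimization loop.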
Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel
2013-06-01
To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
Energy Technology Data Exchange (ETDEWEB)
Turinsky, Paul J [North Carolina State Univ., Raleigh, NC (United States); Abdel-Khalik, Hany S [North Carolina State Univ., Raleigh, NC (United States); Stover, Tracy E [North Carolina State Univ., Raleigh, NC (United States)
2011-03-01
to the design concept is quantitatively determined. A technique is then established to assimilate this data and produce a posteriori uncertainties on key attributes and responses of the design concept. Several experiment perturbations based on engineering judgment are used to demonstrate these methods and also serve as an initial generation of the optimization problem. Finally, an optimization technique is developed which will simultaneously arrive at an optimized experiment to produce an optimized reactor design. Solution of this problem is made possible by the use of the simulated annealing algorithm. The optimization examined in this work is based on maximizing the reactor cost savings associated with the modified design made possible by using the design margin gained through reduced basic nuclear data uncertainties. Cost values for experiment design specifications and reactor design specifications are established and used to compute a total savings by comparing the a posteriori reactor cost to the a priori cost plus the cost of the experiment. The optimized solution arrives at a maximized cost savings.
Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining
2017-11-01
Assessment of the impact of climate change on crop production while accounting for uncertainties is essential for properly identifying sustainable agricultural practices and supporting decision-making. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios, representing climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model, representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with the uncertainties. The results of the ensemble simulations showed that maize yield reductions were less than 5 % in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to the ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed the increasing uncertainty in the yield simulation in the future periods.
A Study on Data Base for the Pyroprocessing Material Flow and MUF Uncertainty Simulation
International Nuclear Information System (INIS)
Sitompul, Yos Panagaman; Shin, Heesung; Han, Boyoung; Kim, Hodong
2011-01-01
The database for the pyroprocessing material flow and MUF uncertainty simulation has been implemented well. There are no errors in the database processing, and it is relatively fast thanks to OLEDB and MySQL. The important issue is the database size: in OLEDB, the database size is limited to 2 GB. To reduce the database size, we give users an option to filter the input nuclides based on their masses and activities. A simulation program called PYMUS has been developed to study the pyroprocessing material flow and MUF. In the program, there is a database system that controls the data processing in the simulation. The database system consists of an input database, data processing, and an output database. The database system has been designed to be efficient; one example is the use of OLEDB and MySQL. The database system is explained in detail in this paper. The results show that the database system works well in the simulation.
Seo, John; Mahul, Olivier
2009-01-01
Catastrophe risk models allow insurers, reinsurers and governments to assess the risk of loss from catastrophic events, such as hurricanes. These models rely on computer technology and the latest earth and meteorological science information to generate thousands if not millions of simulated events. Recently observed hurricane activity, particularly in the 2004 and 2005 hurricane seasons, i...
International Nuclear Information System (INIS)
Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe
2017-01-01
Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the ‘Uncertainty and Sensitivity’ platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by means of a sensitivity analysis.
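Statistical code-uncertainty propagation of this kind commonly sizes the number of code runs with Wilks' order-statistics formula (an assumption here; the record does not spell out its sample-size rule). A sketch of the sample-size computation:

```python
import math

def wilks_runs(coverage, confidence, order=1):
    """Smallest number of code runs N such that the order-th largest
    output bounds the `coverage` quantile with probability at least
    `confidence` (one-sided Wilks' formula). For order=1 this solves
    1 - coverage**N >= confidence."""
    n = order
    while True:
        # P(fewer than `order` of the N runs exceed the quantile)
        beta = sum(math.comb(n, k) * coverage**k * (1 - coverage)**(n - k)
                   for k in range(n - order + 1))
        if beta >= confidence:
            return n
        n += 1

# Classic 95%/95% values used in BEPU-style statistical analyses:
print(wilks_runs(0.95, 0.95))           # -> 59 runs (first order)
print(wilks_runs(0.95, 0.95, order=2))  # -> 93 runs (second order)
```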
International Nuclear Information System (INIS)
Su Rui; Wang Ju; Chen Weiming; Zong Zihua; Zhao Honggang
2008-01-01
The CRP-GEORC concept model is an artificial geological disposal system for high-level radioactive waste. Sensitivity analysis and uncertainty simulation of the migration of the radionuclides Se-79 and I-129 in the far field of this system have been conducted using the GoldSim code. It can be seen from the simulation results that the variables used to describe the geological features and the characterization of groundwater flow are sensitive variables of the whole geological disposal system. The uncertainties of the parameters have a remarkable influence on the simulation results. (authors)
How model and input uncertainty impact maize yield simulations in West Africa
Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli
2015-02-01
Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties, however, exist not only in the model design and model parameters, but also, and maybe even more importantly, in the soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models’ responses to different levels of input information, from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models’ ability to represent the observed spatial (between locations) and temporal (between years) variability in crop yields. We found that the resolution of soil, climate and management information influences the simulated crop yields in both models. However, the difference between models is larger than between input datasets, and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well by both models even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors, e.g., investment in agricultural research and development between 1987 and 1991 in Burkina Faso, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.
Microtubule catastrophe and rescue.
Gardner, Melissa K; Zanic, Marija; Howard, Jonathon
2013-02-01
Microtubules are long cylindrical polymers composed of tubulin subunits. In cells, microtubules play an essential role in architecture and motility. For example, microtubules give shape to cells, serve as intracellular transport tracks, and act as key elements in important cellular structures such as axonemes and mitotic spindles. To accomplish these varied functions, networks of microtubules in cells are very dynamic, continuously remodeling through stochastic length fluctuations at the ends of individual microtubules. The dynamic behavior at the end of an individual microtubule is termed 'dynamic instability'. This behavior manifests itself as periods of persistent microtubule growth interrupted by occasional switching to rapid shrinkage (called microtubule 'catastrophe'), followed by switching back from shrinkage to growth (called microtubule 'rescue'). In this review, we summarize recent findings which provide new insights into the mechanisms of microtubule catastrophe and rescue, and discuss the impact of these findings with regard to the role of microtubule dynamics inside cells. Copyright © 2012 Elsevier Ltd. All rights reserved.
Catastrophic primary antiphospholipid syndrome
International Nuclear Information System (INIS)
Kim, Dong Hun; Byun, Joo Nam; Ryu, Sang Wan
2006-01-01
Catastrophic antiphospholipid syndrome (CAPLS) was diagnosed in a 64-year-old male who was admitted to our hospital with dyspnea. The clinical and radiological examinations showed pulmonary thromboembolism, so a thromboembolectomy was performed. Abdominal distension rapidly developed several days later, and abdominal computed tomography (CT) revealed a thrombus within the superior mesenteric artery, with small bowel and gall bladder distension. Cholecystectomy and jejunoileostomy were performed, and gall bladder necrosis and small bowel infarction were confirmed. The anticardiolipin antibody test was positive. Anticoagulant agents and steroids were administered, but the patient expired 4 weeks after surgery due to acute respiratory distress syndrome (ARDS). We report here on a case of catastrophic APLS with manifestations of pulmonary thromboembolism and rapidly progressing gall bladder necrosis and bowel infarction
Catastrophic primary antiphospholipid syndrome
Energy Technology Data Exchange (ETDEWEB)
Kim, Dong Hun; Byun, Joo Nam [Chosun University Hospital, Gwangju (Korea, Republic of); Ryu, Sang Wan [Miraero21 Medical Center, Gwangju (Korea, Republic of)
2006-09-15
Catastrophic antiphospholipid syndrome (CAPLS) was diagnosed in a 64-year-old male who was admitted to our hospital with dyspnea. The clinical and radiological examinations showed pulmonary thromboembolism, so a thromboembolectomy was performed. Abdominal distension rapidly developed several days later, and abdominal computed tomography (CT) revealed a thrombus within the superior mesenteric artery, with small bowel and gall bladder distension. Cholecystectomy and jejunoileostomy were performed, and gall bladder necrosis and small bowel infarction were confirmed. The anticardiolipin antibody test was positive. Anticoagulant agents and steroids were administered, but the patient expired 4 weeks after surgery due to acute respiratory distress syndrome (ARDS). We report here on a case of catastrophic APLS with manifestations of pulmonary thromboembolism and rapidly progressing gall bladder necrosis and bowel infarction.
Ehrhardt, Fiona; Soussana, Jean-François; Bellocchi, Gianni; Grace, Peter; McAuliffe, Russel; Recous, Sylvie; Sándor, Renáta; Smith, Pete; Snow, Val; de Antoni Migliorati, Massimiliano; Basso, Bruno; Bhatia, Arti; Brilli, Lorenzo; Doltra, Jordi; Dorich, Christopher D; Doro, Luca; Fitton, Nuala; Giacomini, Sandro J; Grant, Brian; Harrison, Matthew T; Jones, Stephanie K; Kirschbaum, Miko U F; Klumpp, Katja; Laville, Patricia; Léonard, Joël; Liebig, Mark; Lieffering, Mark; Martin, Raphaël; Massad, Raia S; Meier, Elizabeth; Merbold, Lutz; Moore, Andrew D; Myrgiotis, Vasileios; Newton, Paul; Pattey, Elizabeth; Rolinski, Susanne; Sharp, Joanna; Smith, Ward N; Wu, Lianhai; Zhang, Qing
2018-02-01
Simulation models are extensively used to predict agricultural productivity and greenhouse gas emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation within multi-species agricultural contexts. We report an international model comparison and benchmarking exercise, showing the potential of multi-model ensembles to predict productivity and nitrous oxide (N2O) emissions for wheat, maize, rice and temperate grasslands. Using a multi-stage modelling protocol, from blind simulations (stage 1) to partial (stages 2-4) and full calibration (stage 5), 24 process-based biogeochemical models were assessed individually or as an ensemble against long-term experimental data from four temperate grassland and five arable crop rotation sites spanning four continents. Comparisons were performed by reference to the experimental uncertainties of observed yields and N2O emissions. Results showed that across sites and crop/grassland types, 23%-40% of the uncalibrated individual models were within two standard deviations (SD) of observed yields, while 42% (rice) to 96% (grasslands) of the models were within 1 SD of observed N2O emissions. At stage 1, ensembles formed by the three models with the lowest prediction errors predicted both yields and N2O emissions within experimental uncertainties for 44% and 33% of the crop and grassland growth cycles, respectively. Partial model calibration (stages 2-4) markedly reduced the prediction errors of the full model ensemble E-median for crop grain yields (from 36% at stage 1 down to 4% on average) and grassland productivity (from 44% to 27%), and to a lesser and more variable extent for N2O emissions. Yield-scaled N2O emissions (N2O emissions divided by crop yields) were ranked accurately by three-model ensembles across crop species and field sites. The potential of using process-based model ensembles to predict jointly
Uncertainty quantification tools for multiphase gas-solid flow simulations using MFIX
Energy Technology Data Exchange (ETDEWEB)
Fox, Rodney O. [Iowa State Univ., Ames, IA (United States); Passalacqua, Alberto [Iowa State Univ., Ames, IA (United States)
2016-02-01
Computational fluid dynamics (CFD) has been widely studied and used in the scientific community and in industry. Various models have been proposed to solve problems in different areas. However, all models deviate from reality. The uncertainty quantification (UQ) process evaluates the overall uncertainties associated with the prediction of quantities of interest. In particular, it studies the propagation of input uncertainties to the outputs of the models, so that confidence intervals can be provided for the simulation results. In the present work, a non-intrusive quadrature-based uncertainty quantification (QBUQ) approach is proposed. The probability density function (PDF) of the system response can then be reconstructed using the extended quadrature method of moments (EQMOM) and the extended conditional quadrature method of moments (ECQMOM). The report first explains the theory of the QBUQ approach, including methods to generate samples for problems with single or multiple uncertain input parameters, low-order statistics, and the required number of samples. Methods for univariate PDF reconstruction (EQMOM) and multivariate PDF reconstruction (ECQMOM) are then explained. The implementation of the QBUQ approach in the open-source CFD code MFIX is discussed next. Finally, the QBUQ approach is demonstrated in several applications. The method is first applied to two examples: a developing flow in a channel with uncertain viscosity, and an oblique-shock problem with uncertain upstream Mach number. The error in the prediction of the moment response is studied as a function of the number of samples, and the accuracy of the moments required to reconstruct the PDF of the system response is discussed. The QBUQ approach is then demonstrated on a bubbling fluidized bed as an example application. The mean particle size is assumed to be the uncertain input parameter. The system is simulated with a standard two-fluid model with kinetic-theory closures for the particulate phase implemented into
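The non-intrusive idea behind QBUQ — running the deterministic solver at carefully chosen samples of the uncertain input and computing moments of the response — can be sketched as follows. The CFD model is replaced here by a toy scalar function of viscosity, and a Gauss-Hermite rule stands in for the report's sampling strategy; everything except the general quadrature recipe is an illustrative assumption, not MFIX code.

```python
import numpy as np

def model(viscosity):
    # Hypothetical stand-in for a CFD quantity of interest,
    # e.g. a channel velocity as a function of viscosity.
    return 1.0 / (1.0 + viscosity)

# Uncertain input: viscosity ~ Normal(mu, sigma)
mu, sigma = 0.5, 0.05

# Gauss-Hermite nodes/weights (physicists' convention);
# transform nodes and weights for an expectation under a Normal density.
nodes, weights = np.polynomial.hermite.hermgauss(8)
samples = mu + np.sqrt(2.0) * sigma * nodes
w = weights / np.sqrt(np.pi)

# Low-order moments of the response from a handful of deterministic runs
vals = np.array([model(x) for x in samples])
mean = np.sum(w * vals)
var = np.sum(w * (vals - mean) ** 2)
print(mean, var)
```

With the moments in hand, a PDF-reconstruction step such as EQMOM would then fit a distribution to them; that step is omitted here.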
International Nuclear Information System (INIS)
Mallet, Vivien
2005-01-01
The thesis deals with the evaluation of a chemistry-transport model, not primarily through classical comparisons to observations, but through the estimation of its a priori uncertainties due to input data, model formulation and numerical approximations. These three uncertainty sources are studied on the basis of Monte Carlo simulations, multi-model simulations and numerical scheme inter-comparisons, respectively. A high uncertainty is found in output ozone concentrations. One solution for overcoming the limitations due to this uncertainty is ensemble forecasting: by combining several models (up to forty-eight) on the basis of past observations, the forecast can be significantly improved. This work has also led to the development of the innovative modelling system Polyphemus. (author) [fr]
Wells, J. R.; Kim, J. B.
2011-12-01
Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from published literature, and often a parameter value is estimated from a single published value. Further, the parameters are "tuned" using somewhat arbitrary, "trial-and-error" methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFT and system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from published literature and, where those were not available, by using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run over the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
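A minimal simulated-annealing loop of the kind described — bounded parameters, random perturbations, Boltzmann acceptance of uphill moves, geometric cooling — might look like this. The two-parameter quadratic misfit is a stand-in for the map-comparison objective; the bounds and cooling schedule are hypothetical, not BIOMAP's.

```python
import math
import random

def misfit(params):
    # Toy stand-in for "distance between simulated and observed
    # vegetation maps"; minimum at (0.3, 0.7).
    a, b = params
    return (a - 0.3) ** 2 + (b - 0.7) ** 2

bounds = [(0.0, 1.0), (0.0, 1.0)]  # e.g. literature min/max per parameter
random.seed(1)

current = [random.uniform(lo, hi) for lo, hi in bounds]
f_cur = misfit(current)
best, best_f = list(current), f_cur
T = 1.0
for step in range(5000):
    T *= 0.999  # geometric cooling schedule
    cand = [min(hi, max(lo, x + random.gauss(0, 0.1)))
            for x, (lo, hi) in zip(current, bounds)]
    f_cand = misfit(cand)
    # Accept downhill moves always, uphill moves with Boltzmann probability
    if f_cand < f_cur or random.random() < math.exp(-(f_cand - f_cur) / T):
        current, f_cur = cand, f_cand
        if f_cur < best_f:
            best, best_f = list(current), f_cur
print(best, best_f)
```

Because uphill moves are occasionally accepted while the temperature is high, the search can escape local minima before the cooling schedule freezes it near the global optimum.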
Directory of Open Access Journals (Sweden)
Mousong Wu
2016-02-01
Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE) method was used. Simulations of water and heat processes in frozen soil in northern China during the 2012/2013 winter were conducted. Ensemble simulations generated through Monte Carlo sampling were used for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance criteria with respect to simulated soil water and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm). Posterior distributions for parameters related to soil hydraulics, radiation processes, and heat transport indicated that uncertainties in both the inputs and the model structure influence model performance in simulating water and heat processes in seasonally frozen soils. Seasonal courses of water and energy partitioning were evident during the winter. Within the daily cycle, soil evaporation/condensation and energy distribution were well captured and identified as important phenomena in the dynamics of the energy balance system. The combination of CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrologic and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing–thawing.
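The GLUE procedure sketched in the abstract — Monte Carlo sampling from a prior, a likelihood threshold separating behavioral from non-behavioral runs, and likelihood-weighted posterior statistics — can be illustrated on a toy recession model. The model, prior range and threshold are assumptions for illustration, not CoupModel settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observed" recession curve with observation noise;
# the true recession parameter is 0.3 (all values hypothetical).
t = np.arange(20)
obs = 10.0 * np.exp(-0.3 * t) + rng.normal(0, 0.1, t.size)

# 1. Monte Carlo sampling of the uncertain parameter from a uniform prior
ks = rng.uniform(0.05, 1.0, 2000)
sims = 10.0 * np.exp(-np.outer(ks, t))

# 2. Likelihood measure: Nash-Sutcliffe efficiency for each simulation
nse = 1.0 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)

# 3. Keep "behavioral" simulations above a threshold; weight by likelihood
behavioral = nse > 0.9
w = nse[behavioral] / nse[behavioral].sum()
post_mean = np.sum(w * ks[behavioral])
print(behavioral.sum(), round(post_mean, 3))
```

The behavioral subset defines the posterior parameter distribution; prediction bounds would be obtained as likelihood-weighted percentiles of the behavioral simulations.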
Phenomena-based Uncertainty Quantification in Predictive Coupled- Physics Reactor Simulations
Energy Technology Data Exchange (ETDEWEB)
Adams, Marvin [Texas A & M Univ., College Station, TX (United States)
2017-06-12
This project has sought to develop methodologies, tailored to phenomena that govern nuclear-reactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.
Bador, M.; Donat, M.; Geoffroy, O.; Alexander, L. V.
2017-12-01
Precipitation intensity during extreme events is expected to increase with climate change. Throughout the 21st century, CMIP5 climate models project a general increase in annual extreme precipitation in most regions. We investigate how robust this future increase is across different models, regions and seasons. We find strong similarity in extreme precipitation changes between models that share atmospheric physics, which reduces the ensemble of 27 models to 14 independent projections. Future simulated extreme precipitation increases in most models in the majority of land grid cells located in the dry, intermediate and wet regions defined by each model's precipitation climatology. These increases significantly exceed the range of natural variability estimated from long equilibrium control runs. The intensification of extreme precipitation across the entire spectrum of dry to wet regions is particularly robust in the extra-tropics in both the wet and dry seasons, whereas uncertainties are larger in the tropics. The CMIP5 ensemble therefore indicates robust future intensification of annual extreme rainfall, particularly in extra-tropical regions. Generally, the robustness across CMIP5 models is higher during the dry season than during the wet season or at the annual scale, but inter-model uncertainties in the tropics remain important.
Phenomena-based Uncertainty Quantification in Predictive Coupled- Physics Reactor Simulations
International Nuclear Information System (INIS)
Adams, Marvin
2017-01-01
This project has sought to develop methodologies, tailored to phenomena that govern nuclear-reactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.
Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher
2017-11-01
Uncertainty quantification (UQ) is becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, significantly increases the utility of model results. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, and the number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
International Nuclear Information System (INIS)
Berne, A.
2001-01-01
Quantitative determinations of many radioactive analytes in environmental samples are based on a process in which several independent measurements of different properties are taken. The final results calculated from these data have to be evaluated for accuracy and precision. The estimate of the standard deviation, s, also called the combined standard uncertainty (CSU), associated with the result of this combined measurement can be used to evaluate the precision of the result. The CSU can be calculated by applying the law of propagation of uncertainty, which is based on a Taylor series expansion of the equation used to calculate the analytical result. The estimate of s can also be obtained from a Monte Carlo simulation. The data used in this simulation include the values resulting from the individual measurements, the estimated variance of each value, including the type of distribution, and the equation used to calculate the analytical result. A comparison is made between these two methods of estimating the uncertainty of the calculated result. (author)
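The two approaches compared in the abstract can be illustrated side by side for a simple combined measurement, here a hypothetical activity A = N/(ε·m): the first-order Taylor (propagation-of-uncertainty) formula versus a Monte Carlo estimate of the same standard uncertainty. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical combined measurement: activity A = N / (eps * m),
# from net counts N, detector efficiency eps, and sample mass m
N, sN = 10000.0, 120.0
eps, seps = 0.25, 0.01
m, sm = 2.0, 0.02

A = N / (eps * m)

# Law of propagation of uncertainty (first-order Taylor expansion);
# for a pure product/quotient the relative variances add:
csu = A * np.sqrt((sN / N) ** 2 + (seps / eps) ** 2 + (sm / m) ** 2)

# Monte Carlo estimate of the same combined standard uncertainty,
# assuming normally distributed input quantities
draws = (rng.normal(N, sN, 100_000)
         / (rng.normal(eps, seps, 100_000) * rng.normal(m, sm, 100_000)))
mc = draws.std()

print(csu, mc)
```

For small relative uncertainties the two estimates agree closely; the Monte Carlo route becomes preferable when the measurement equation is strongly nonlinear or the input distributions are non-Gaussian.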
Effects of input data information content on the uncertainty of simulating water resources
Camargos, Carla; Julich, Stefan; Bach, Martin; Breuer, Lutz
2017-04-01
Hydrological models like the Soil and Water Assessment Tool (SWAT) demand a large variety of spatial input data. These are commonly available in different resolutions and result from different preprocessing methodologies. Effort is made to apply data as specific as possible to the study area, which features heterogeneous landscape elements. Most often, modelers prefer regional data, especially at fine resolution, which are not always available. Instead, global datasets that are more general are considered. This study investigates how the use of global and regional input datasets may affect the simulation performance and uncertainty of the model. We analyzed eight different setups for the SWAT model, combining two of each of the Digital Elevation Models (DEM), soil maps and land use maps of diverse spatial resolution and information content. The models were calibrated to discharge at two stations across the mesoscale Haute-Sûre catchment, which is located partly in the north of Luxembourg and partly in the southeast of Belgium. The region is a rural area of about 743 km2, mainly covered by forests, complex agricultural systems and arable land. As part of the catchment, the Upper-Sûre Lake is an important source of drinking water for the Luxembourgish population, satisfying 30% of the country's demand. The Metropolis Markov chain Monte Carlo algorithm implemented in the SPOTPY Python package was used to infer posterior parameter distributions and assess parameter uncertainty. We optimize the mean of the Nash-Sutcliffe efficiency (NSE) and the logarithm of the NSE. We focused on soil physical, groundwater, main channel, land cover management and basin physical process parameters. Preliminary results indicate that the model performs best when using the regional DEM and land use map and the global soil map, indicating that SWAT cannot necessarily make use of additional soil information if it does not substantially affect soil hydrological fluxes
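A bare-bones Metropolis sampler of the kind SPOTPY implements can be sketched on a one-parameter toy discharge model. The Gaussian likelihood, prior range and step size below are illustrative assumptions, not the SWAT/SPOTPY configuration used in the study.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy discharge "model" with one uncertain recession parameter
t = np.arange(30)
def model(k):
    return 5.0 * np.exp(-k * t)

obs = model(0.2) + rng.normal(0, 0.1, t.size)  # synthetic observations

def log_like(k):
    # Gaussian likelihood with known observation error sigma = 0.1
    resid = obs - model(k)
    return -0.5 * np.sum((resid / 0.1) ** 2)

# Metropolis random walk constrained to the prior range [0.01, 1.0]
chain = []
k, ll = 0.5, log_like(0.5)
for _ in range(5000):
    cand = k + rng.normal(0, 0.02)
    if 0.01 <= cand <= 1.0:
        llc = log_like(cand)
        if np.log(rng.uniform()) < llc - ll:  # accept/reject step
            k, ll = cand, llc
    chain.append(k)

posterior = np.array(chain[1000:])  # discard burn-in
print(posterior.mean(), posterior.std())
```

The retained chain approximates the posterior parameter distribution, from which parameter uncertainty and predictive bounds can be read off.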
On the predictivity of pore-scale simulations: estimating uncertainties with multilevel Monte Carlo
Icardi, Matteo
2016-02-08
A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can in fact be hindered by many factors, including sample heterogeneity, computational and imaging limitations, model inadequacy and imperfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another “equivalent” sample and setup). This stochastic nature can arise from multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost of computing accurate statistics of effective parameters and other quantities of interest under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A fully automatic workflow is developed in an open-source code [2015. https://bitbucket.org/micardi/porescalemc.] that includes rigid-body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers
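The multilevel Monte Carlo idea — many cheap coarse-level samples plus a few expensive fine-level corrections, combined through a telescoping sum — can be sketched on a toy problem where the "pore-scale solver" is an Euler integration whose step size defines the level. The levels and sample counts are illustrative, not those of the cited code.

```python
import numpy as np

rng = np.random.default_rng(11)

def euler(a, n):
    # Level-dependent "solver": explicit Euler for u' = -a*u, u(0) = 1,
    # returning u(1) with n steps (finer n = higher level = more cost).
    u, h = 1.0, 1.0 / n
    for _ in range(n):
        u += -a * u * h
    return u

def mlmc_estimate(levels, samples_per_level):
    total = 0.0
    for l, m in zip(levels, samples_per_level):
        a = rng.uniform(0.5, 1.5, m)  # random input parameter
        fine = np.array([euler(ai, 2 ** l) for ai in a])
        if l == levels[0]:
            total += fine.mean()      # plain estimate on the coarsest level
        else:
            coarse = np.array([euler(ai, 2 ** (l - 1)) for ai in a])
            total += (fine - coarse).mean()  # telescoping correction
    return total

# Many cheap coarse samples, few expensive fine samples
est = mlmc_estimate(levels=[2, 3, 4, 5], samples_per_level=[4000, 1000, 250, 60])
print(est)
```

Because the level-to-level corrections have small variance, far fewer samples are needed on the expensive fine levels than a single-level Monte Carlo estimator would require; the exact value here is E[e^-a] ≈ 0.383 for a ~ U(0.5, 1.5).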
Academic Training: Predicting Natural Catastrophes
Françoise Benz
2005-01-01
2005-2006 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 12, 13, 14, 15, 16 December from 11:00 to 12:00 - Main Auditorium, bldg. 500 Predicting Natural Catastrophes E. OKAL / Northwestern University, Evanston, USA
1. Tsunamis -- Introduction: Definition of phenomenon - basic properties of the waves; Propagation and dispersion; Interaction with coasts - geological and societal effects; Origin of tsunamis - natural sources; Scientific activities in connection with tsunamis; Ideas about simulations
2. Tsunami generation: The earthquake source - conventional theory; The earthquake source - normal mode theory; The landslide source; Near-field observation - the Plafker index; Far-field observation - directivity
3. Tsunami warning: General ideas - history of efforts; Mantle magnitudes and TREMOR algorithms; The challenge of 'tsunami earthquakes'; Energy-moment ratios and slow earthquakes; Implementation and the components of warning centers
4. Tsunami surveys: Principles and methodologies; Fifteen years of field surveys and re...
Directory of Open Access Journals (Sweden)
Ciumas Cristina
2013-07-01
This paper presents the emergence and evolution of catastrophe models (cat models). Starting from the present context of extreme weather events and the features of catastrophic risk (cat risk), we give a chronological, theoretical illustration of the main steps taken in building such models; in this way the importance of interdisciplinarity can be observed. The first cat model considered contains three modules. For each of these identified modules - hazard, vulnerability and financial losses - a detailed overview is provided, along with an exemplification for a potential earthquake measuring more than 7 on the Richter scale occurring today in Bucharest. The key areas exposed to earthquakes in Romania are identified. Then, based on past catastrophe data and taking into account the present condition of the housing stock, insurance coverage and the population of Bucharest, the impact is quantified by determining potential losses. To accomplish this, we consider a scenario with data representing average values for the dwelling's surface, location and finishing works. At each step we refer to the earthquake of March 4, 1977 to see what would happen today if a similar event occurred. The value of the Bucharest housing stock is determined taking first the market value, then the replacement value and ultimately the real value, in order to quantify potential damages. Through this approach we can find the insurance coverage of potential losses and also the uncovered gap. A solution that may be taken into account by public authorities, for example by Bucharest City Hall, is offered: should such an event occur, the impossibility of paying compensation to insured people, rebuilding infrastructure and public buildings, and helping the affected persons should be avoided. An active public-private partnership should be created between government authorities, the Natural Disaster Insurance Pool, private
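The three-module structure described (hazard, vulnerability, financial losses) reduces, in its simplest deterministic form, to a few multiplications. The numbers below are purely hypothetical placeholders, not the paper's Bucharest data.

```python
# Hypothetical, illustrative figures only -- not the paper's Bucharest data.
exposure_value = 50e9      # value of the housing stock (currency units)
event_rate = 1.0 / 100.0   # hazard module: annual frequency of a M7+ event

# Vulnerability module: mean damage ratio per building class, with shares
damage_ratio = {"pre-1977 masonry": 0.35, "panel block": 0.15, "post-2000": 0.05}
share = {"pre-1977 masonry": 0.3, "panel block": 0.5, "post-2000": 0.2}

# Financial module: ground-up loss, insured share, and the uncovered gap
mean_dr = sum(damage_ratio[c] * share[c] for c in share)
ground_up_loss = exposure_value * mean_dr        # loss if the event occurs
insurance_penetration = 0.2
insured_loss = ground_up_loss * insurance_penetration
uncovered_gap = ground_up_loss - insured_loss
annual_expected_loss = event_rate * ground_up_loss
print(ground_up_loss, uncovered_gap, annual_expected_loss)
```

Real cat models replace each scalar here with a probabilistic module (event catalogues, fragility curves, policy conditions), but the chain of hazard × vulnerability × exposure is the same.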
Liebermann, Ralf; Kraft, Philipp; Houska, Tobias; Breuer, Lutz; Müller, Christoph; Kraus, David; Haas, Edwin; Klatt, Steffen
2015-04-01
Among anthropogenic greenhouse gas emissions, CO2 is the dominant driver of global climate change. Next to its direct impact on the radiation budget, it also affects the climate system by triggering feedback mechanisms in terrestrial ecosystems. Such mechanisms - like stimulated photosynthesis, increased root exudation and reduced stomatal transpiration - influence both the input and the turnover of carbon and nitrogen compounds in the soil. The stabilization and decomposition of these compounds determine how increasing CO2 concentrations change terrestrial trace gas emissions, especially CO2, N2O and CH4. To assess the potential reaction of terrestrial greenhouse gas emissions to rising tropospheric CO2 concentrations, we make use of a comprehensive ecosystem model integrating known processes and fluxes of the carbon-nitrogen cycle in soil, vegetation and water. We apply a state-of-the-art ecosystem model to measurements from a long-term field experiment on CO2 enrichment. The model - a grassland realization of LandscapeDNDC - simulates soil chemistry coupled with plant physiology, microclimate and hydrology. The data - comprising biomass, greenhouse gas emissions, management practices and soil properties - have been obtained from a FACE (Free Air Carbon dioxide Enrichment) experiment running since 1997 on a temperate grassland in Giessen, Germany. Management and soil data, together with weather records, are used to drive the model, while cut biomass as well as CO2 and N2O emissions are used for calibration and validation. Starting with control data from installations without CO2 enrichment, we begin with a GLUE (generalized likelihood uncertainty estimation) assessment using Latin hypercube sampling to reduce the ranges of the model parameters. This is followed by a detailed sensitivity analysis, the application of DREAM-ZS for model calibration, and an estimation of the effect of input uncertainty on the simulation results. Since first results indicate problems with
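Latin hypercube sampling, used here to shrink the parameter ranges before calibration, stratifies each parameter's range into equal-probability bins and draws exactly one value per bin. A small sketch, with hypothetical parameter bounds:

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n_samples, bounds):
    """One stratified draw per equal-probability bin, per parameter."""
    d = len(bounds)
    # Row k holds a uniform draw inside bin [k/n, (k+1)/n) for every column
    u = (rng.uniform(size=(n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
    # De-correlate the columns by permuting each one independently
    for j in range(d):
        u[:, j] = u[rng.permutation(n_samples), j]
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical ranges for two model parameters (illustration only)
samples = latin_hypercube(100, [(0.1, 0.9), (10.0, 50.0)])
print(samples.shape)
```

Compared with plain random sampling, every marginal range is covered evenly even with few runs, which is why LHS is popular for screening expensive ecosystem models.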
DEFF Research Database (Denmark)
Prunescu, Remus Mihail; Sin, Gürkan
2014-01-01
This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...
Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang
2016-04-01
Because of the rapid economic growth in China, many regions are subject to severe particulate matter pollution. Thus, improving the methods for determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include the sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China were used as the experimental dataset. Based on the STSIS procedure, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.
Assimaki, D.; Li, W.; Steidl, J. M.; Schmedes, J.
2007-12-01
The assessment of strong motion site response is of great significance, both for mitigating seismic hazard and for performing detailed analyses of earthquake source characteristics. There currently exists, however, a large degree of uncertainty concerning the mathematical model to be employed for the computationally efficient evaluation of local site effects, and the site investigation program necessary to evaluate the nonlinear input model parameters and ensure cost-effective predictions; and while site response observations may provide critical constraints on interpretation methods, the lack of a statistically significant number of in-situ strong motion records prohibits statistical analyses from being conducted and uncertainties from being quantified based entirely on field data. In this paper, we combine downhole observations and broadband ground motion synthetics for characteristic site conditions in the Los Angeles Basin, and investigate the variability in ground motion estimation introduced by the site response assessment methodology. In particular, site-specific regional velocity and attenuation structures are initially compiled using near-surface geotechnical data collected at downhole geotechnical arrays, low-strain velocity and attenuation profiles at these sites obtained by inversion of weak motion records, and the crustal velocity structure at the corresponding locations obtained from the Southern California Earthquake Center Community Velocity Model. Subsequently, broadband ground motions are simulated by means of a hybrid low/high-frequency finite source model with correlated random parameters for rupture scenarios of weak, medium and large magnitude events (M = 3.5-7.5). Observed estimates of site response at the stations of interest are first compared to the ensemble of approximate and incremental nonlinear site response models. Parametric studies are next conducted for each fixed magnitude (fault geometry) scenario by varying the source-to-site distance and
A Study on the uncertainty and sensitivity in numerical simulation of parametric roll
DEFF Research Database (Denmark)
Choi, Ju-hyuck; Nielsen, Ulrik Dam; Jensen, Jørgen Juncher
2016-01-01
Uncertainties related to numerical modelling of parametric roll have been investigated using a 6-DOF model with nonlinear damping and roll restoring forces. First, the uncertainty in the damping coefficients and its effect on the roll response is evaluated. Secondly, uncertainty due to the “effect...
Estimation of the uncertainty of a climate model using an ensemble simulation
Barth, A.; Mathiot, P.; Goosse, H.
2012-04-01
The atmospheric forcings play an important role in the study of the ocean and sea-ice dynamics of the Southern Ocean. Errors in the atmospheric forcings will inevitably result in uncertain model results. The sensitivity of the model results to errors in the atmospheric forcings is studied with ensemble simulations using multivariate perturbations of the atmospheric forcing fields. The numerical ocean model used is NEMO-LIM in a global configuration with a horizontal resolution of 2°. NCEP reanalyses are used to provide air temperature and wind data to force the ocean model over the last 50 years. A climatological mean is used to prescribe relative humidity, cloud cover and precipitation. In a first step, the model results are compared with OSTIA SST and OSI SAF sea-ice concentration of the southern hemisphere. The seasonal behavior of the RMS difference and bias in SST and ice concentration is highlighted, as well as the regions with relatively high RMS errors and biases, such as the Antarctic Circumpolar Current and near the ice edge. Ensemble simulations are performed to statistically characterize the model error due to uncertainties in the atmospheric forcings. Such information is a crucial element for future data assimilation experiments. Ensemble simulations are performed with perturbed air temperature and wind forcings. A Fourier decomposition of the NCEP wind vectors and air temperature for 2007 is used to generate ensemble perturbations. The perturbations are scaled such that the resulting ensemble spread matches approximately the RMS differences between the model and the satellite SST and sea-ice concentration. The ensemble spread and covariance are analyzed for the minimum and maximum sea-ice extent. It is shown that the effect of errors in the atmospheric forcings can extend several hundred meters in depth near the Antarctic Circumpolar Current.
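The perturbation strategy this abstract describes (a Fourier decomposition of the forcing field, used to build random perturbations whose spread is scaled to a target RMS) can be sketched in a few lines. This is a toy one-dimensional illustration, not the NEMO-LIM setup: the series, the randomized-phase scheme and the 0.5-unit target spread are assumptions for demonstration only.

```python
import cmath
import math
import random

random.seed(42)

def dft(x):
    """Naive discrete Fourier transform (O(N^2), fine for a small sketch)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

# A stand-in forcing time series (e.g. air temperature anomalies).
N = 64
forcing = [10 * math.sin(2 * math.pi * n / N) + random.gauss(0, 1) for n in range(N)]
spectrum = dft(forcing)

def perturbation(spectrum, scale):
    """One ensemble perturbation: keep each Fourier amplitude, randomize its
    phase (with conjugate symmetry so the series stays real), then rescale so
    the perturbation RMS matches the target spread."""
    N = len(spectrum)
    pert_spec = [0j] * N
    for k in range(1, N // 2):
        phase = random.uniform(0, 2 * math.pi)
        pert_spec[k] = abs(spectrum[k]) * cmath.exp(1j * phase)
        pert_spec[N - k] = pert_spec[k].conjugate()
    pert = idft(pert_spec)
    rms = math.sqrt(sum(p * p for p in pert) / N)
    return [scale * p / rms for p in pert]

# A small ensemble of perturbed forcings with a 0.5-unit target RMS spread.
ensemble = [[f + p for f, p in zip(forcing, perturbation(spectrum, 0.5))]
            for _ in range(10)]
```

In the paper's setting the perturbations are multivariate (wind vectors and air temperature together) and the scaling target comes from model-versus-satellite RMS differences; the phase randomization above is just one common way to generate perturbations with a prescribed spectrum.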
The critical catastrophe revisited
International Nuclear Information System (INIS)
De Mulatier, Clélia; Rosso, Alberto; Dumonteil, Eric; Zoia, Andrea
2015-01-01
The neutron population in a prototype model of a nuclear reactor can be described in terms of a collection of particles confined in a box and undergoing three key random mechanisms: diffusion, reproduction due to fissions, and death due to absorption events. When the reactor is operated at the critical point, where fissions are exactly compensated by absorptions, the whole neutron population might in principle go to extinction because of the wild fluctuations induced by births and deaths. This phenomenon, which has been named the critical catastrophe, is nonetheless never observed in practice: feedback mechanisms acting on the total population, such as human intervention, have a stabilizing effect. In this work, we revisit the critical catastrophe by investigating the spatial behaviour of the fluctuations in a confined geometry. When the system is free to evolve, the neutrons may display a wild patchiness (clustering). On the contrary, imposing a control on the total population also acts against the local fluctuations, and may thus inhibit the spatial clustering. The effectiveness of population control in quenching spatial fluctuations will be shown to depend on the competition between the mixing time of the neutrons (i.e. the average time taken for a particle to explore the finite viable space) and the extinction time
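The population-level part of the critical catastrophe is easy to reproduce with a toy critical Galton-Watson process (this sketch ignores the spatial diffusion that is the paper's actual subject; the 50/50 absorb-or-double offspring rule, initial population and horizon are illustrative assumptions):

```python
import random

random.seed(7)

def critical_branching(n0, generations):
    """Critical Galton-Watson process: each neutron is absorbed (0 offspring)
    with probability 1/2 or induces a fission producing 2 neutrons with
    probability 1/2, so births exactly compensate deaths on average."""
    n = n0
    for _ in range(generations):
        if n == 0:
            return 0
        n = sum(2 for _ in range(n) if random.random() < 0.5)
    return n

trials, n0, gens = 2000, 10, 200
finals = [critical_branching(n0, gens) for _ in range(trials)]
extinct_fraction = sum(1 for f in finals if f == 0) / trials
mean_final = sum(finals) / trials
# Although the mean population is conserved (mean_final stays near n0),
# most individual realizations fluctuate to extinction -- the critical
# catastrophe. A population control renormalizing the total count each
# generation suppresses these global fluctuations (and, per the paper,
# can also inhibit spatial clustering).
```

Running this, the ensemble mean hovers near the initial population while the large majority of realizations die out, which is exactly the tension between mean criticality and fluctuation-driven extinction that the abstract describes.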
Thom, René
1983-01-01
René Thom, French mathematician and member of the Académie des Sciences, was awarded the Fields Medal, the equivalent of the Nobel Prize in mathematics, in 1958 for his intellectual creations, notably "catastrophe theory", a new way of looking at all the transformations that occur in an abrupt, unpredictable and dramatic manner. In these interviews, which range from mathematics to embryology, from linguistics to anthropology and history, René Thom outlines catastrophe theory and reviews, with a spirit both critical and passionate, the great scientific themes of our time, from atomic physics to molecular biology, from scientific and technological "progress" to the complex connections between society and science. "This little book is an extraordinary success in popularization." (Jean Largeault)
Quantum catastrophe of slow light
Leonhardt, Ulf
2001-01-01
Catastrophes are at the heart of many fascinating optical phenomena. The rainbow, for example, is a ray catastrophe where light rays become infinitely intense. The wave nature of light resolves the infinities of ray catastrophes while drawing delicate interference patterns such as the supernumerary arcs of the rainbow. Black holes cause wave singularities. Waves oscillate with infinitely small wavelengths at the event horizon where time stands still. The quantum nature of light avoids this h...
Torrealba, Victor A.
2017-10-02
Recovery mechanisms are more likely to be influenced by grid-block size and reservoir heterogeneity in Chemical EOR (CEOR) than in conventional Water Flood (WF) simulations. Grid upscaling based on single-phase flow is a common practice in WF simulation models, where simulation grids are coarsened to perform history matching and sensitivity analyses within affordable computational times. This coarse grid resolution (typically about 100 ft) may be sufficient in WF; however, it usually fails to capture key physical mechanisms in CEOR. In addition to increased numerical dispersion in coarse models, these models tend to artificially increase the level of mixing between the fluids and may not have enough resolution to capture the different length scales of geological features to which EOR processes can be highly sensitive. As a result, coarse models usually overestimate the sweep efficiency and underestimate the displacement efficiency. Grid refinement (simple downscaling) can resolve artificial mixing, but appropriately re-creating the fine-scale heterogeneity, without degrading the history match conducted on the coarse scale, remains a challenge. Because of the difference in recovery mechanisms involved in CEOR, such as miscibility and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process is necessary for CEOR simulations when the original (fine) earth model is not available or when major disconnects occur between the original earth model and the history-matched coarse WF model. The proposed downscaling method is a process of refining the coarse grid and populating the relevant properties in the newly created finer grid cells. The method considers the values of rock properties in the coarse grid as hard data, and the corresponding variograms and property
International Nuclear Information System (INIS)
Ánchel, F.; Barrachina, T.; Miró, R.; Verdú, G.; Juanas, J.; Macián-Juan, R.
2012-01-01
Highlights: ► Best-estimate codes are affected by uncertainty in the methods and the models. ► Influence of the uncertainty in the macroscopic cross sections on the analysis of BWR and PWR RIA accidents. ► The fast diffusion coefficient, the scattering cross section and both fission cross sections are the most influential factors. ► The absorption cross sections have very little influence. ► With a normal pdf, the results are more “conservative” in terms of the power peak reached than when the uncertainty is quantified with a uniform pdf. - Abstract: Best-estimate analysis consists of a coupled thermal-hydraulic and neutronic description of the nuclear system's behavior; uncertainties from both aspects should be included and jointly propagated. This paper presents a study of the influence of the uncertainty in the macroscopic neutronic information that describes a three-dimensional core model on the most relevant results of the simulation of a Reactivity Induced Accident (RIA). The analyses of a BWR RIA and a PWR RIA have been carried out with three-dimensional thermal-hydraulic and neutronic models for the coupled systems TRACE-PARCS and RELAP-PARCS. The cross-section information has been generated by the SIMTAB methodology based on the joint use of CASMO-SIMULATE. The statistically based methodology performs Monte Carlo sampling of the uncertainty in the macroscopic cross sections. The size of the sample is determined by the characteristics of the tolerance intervals, by applying the Noether–Wilks formulas. A number of simulations equal to the sample size have been carried out, in which the cross sections used by PARCS are directly modified with uncertainty, and non-parametric statistical methods are applied to the resulting sample of the values of the output variables to determine their tolerance intervals.
Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...
Struck, C.; Wilde, de P.J.C.J.; Hopfe, C.J.; Hensen, J.L.M.
2008-01-01
This paper describes research conducted to gather empirical evidence on extent, character and content of the option space in building design projects, from the perspective of a climate engineer using building performance simulation for concept evaluation. The goal is to support uncertainty analysis
Goderniaux, Pascal; Brouyère, Serge; Blenkinsop, Stephen; Burton, Aidan; Fowler, Hayley; Dassargues, Alain
2010-05-01
applied not only to the mean of climatic variables, but also across the statistical distributions of these variables. This is important as these distributions are expected to change in the future, with more extreme rainfall events separated by longer dry periods. (2) The novel approach used in this study can simulate transient climate change from 2010 to 2085, rather than time series representative of a stationary climate for the period 2071-2100. (3) The weather generator is used to generate a large number of equiprobable climate change scenarios for each RCM, representative of the natural variability of the weather. All of these scenarios are applied as input to the Geer basin model to assess the projected impact of climate change on groundwater levels, the uncertainty arising from different RCM projections and the uncertainty linked to natural climatic variability. Using the output results from all scenarios, 95% confidence intervals are calculated for each year and month between 2010 and 2085. The climate change scenarios for the Geer basin model predict hotter and drier summers and warmer and wetter winters. Considering the results of this study, it is very likely that groundwater levels and surface flow rates in the Geer basin will decrease by the end of the century. This is of concern because it also means that the groundwater quantities available for abstraction will decrease. However, this study also shows that the uncertainty of these projections is relatively large compared to the projected changes, so that it remains difficult to confidently determine the magnitude of the decrease. The combination of an integrated surface-subsurface model with stochastic climate change scenarios has not been used in previous climate change impact studies on groundwater resources. It constitutes an innovation and an important tool for helping water managers to take decisions.
International Nuclear Information System (INIS)
Gonçalves, L D; Rocco, E M; De Moraes, R V; Kuga, H K
2015-01-01
This paper aims to simulate part of the orbital trajectory of the Lunar Prospector mission in order to analyze the relevance of using a Kalman filter to estimate the trajectory. The study considers the disturbance due to the lunar gravitational potential, using one of the most recent models, the LP100K model, which is based on spherical harmonics and considers degree and order up to 100. In order to simplify the expression of the gravitational potential and, consequently, reduce the computational effort required in the simulation, lower values of degree and order are used in some cases. An analysis is then made of the error introduced in the simulations when such values of degree and order are used to propagate the spacecraft trajectory and control. This analysis uses the standard deviation that characterizes the uncertainty for each of the values of degree and order used in the LP100K model for the satellite orbit. With the uncertainty of the adopted gravity model known, lunar orbital trajectory simulations can be performed taking these uncertainty values into account. Furthermore, a Kalman filter is used, which considers the sensor uncertainty that defines the satellite position at each step of the simulation as well as the model uncertainty, by means of the characteristic variance of the truncated gravity model. Thus, this procedure represents an effort to bring the results obtained using lower values of degree and order of the spherical harmonics closer to the results that would be attained if the maximum accuracy of the LP100K model were adopted. A comparison is also made between the error in the satellite position with and without the Kalman filter. The data for the comparison were obtained from the standard deviation in the velocity increment of the space vehicle. (paper)
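The core idea (a Kalman filter blending a sensor of known variance with an imperfect dynamics model of known variance) can be illustrated with a minimal one-dimensional constant-velocity tracker. This is not the orbital-dynamics filter of the paper: the state, the process-noise value standing in for the truncated-gravity-model variance, and the sensor noise are all illustrative assumptions.

```python
import random

random.seed(0)

# 1-D constant-velocity target, a stand-in for one component of the state.
# Q plays the role of the model uncertainty (e.g. the variance of the
# truncated gravity model); R is the position-sensor variance.
dt, Q, R = 1.0, 1e-4, 4.0
true_pos, true_vel = 0.0, 1.0

x = [0.0, 0.0]                       # state estimate [position, velocity]
P = [[10.0, 0.0], [0.0, 10.0]]       # state covariance

meas_err = filt_err = 0.0
for _ in range(200):
    true_pos += true_vel * dt
    z = true_pos + random.gauss(0.0, R ** 0.5)   # noisy position measurement

    # Predict step (F = [[1, dt], [0, 1]], process noise Q on the diagonal).
    x = [x[0] + x[1] * dt, x[1]]
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + Q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1],
          P[1][1] + Q]]

    # Update step with measurement z (H = [1, 0]).
    S = P[0][0] + R
    K = [P[0][0] / S, P[1][0] / S]
    y = z - x[0]
    x = [x[0] + K[0] * y, x[1] + K[1] * y]
    P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
         [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]

    meas_err += abs(z - true_pos)    # error of the raw sensor
    filt_err += abs(x[0] - true_pos) # error of the filtered estimate
```

After the filter converges, the filtered position error is well below the raw sensor error, which is the comparison (with versus without the filter) that the abstract's velocity-increment analysis formalizes for the lunar orbit case.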
DEFF Research Database (Denmark)
Diky, Vladimir; Chirico, Robert D.; Muzny, Chris
ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of Dynamic Data Evaluation in the fields of thermophysics and thermochemistry, which includes maintaining the comprehensive and up-to-date database of experimentally measured ...... uncertainties, curve deviations, and inadequacies of the models. Uncertainty analysis shows relative contributions to the total uncertainty from each component and pair of components....
International Nuclear Information System (INIS)
Baum, C; Alber, M; Birkner, M; Nuesslin, F
2004-01-01
Geometric uncertainties arise during treatment planning and treatment delivery, and mean that dose-dependent parameters such as EUD are random variables with a patient-specific probability distribution. Treatment planning with highly conformal treatment techniques such as intensity modulated radiation therapy requires new evaluation tools which allow us to estimate the influence of geometric uncertainties on the probable treatment dose for a planned dose distribution. Monte Carlo simulations of treatment courses with recalculation of the dose according to the daily geometric errors are a gold standard for such an evaluation. Distribution histograms which show the relative frequency of a treatment quality parameter in the treatment simulations can be used to evaluate the potential risks and chances of a planned dose distribution. As treatment simulations with dose recalculation are very time consuming for sufficient statistical accuracy, it is proposed to do treatment simulations in the dose parameter space, where the result is mainly determined by the systematic and random components of the geometric uncertainties. Comparison of the parameter-space simulation method with the gold standard for prostate cases and a head and neck case shows good agreement as long as the number of fractions is high enough and the influence of tissue inhomogeneities and surface curvature on the dose is small.
Directory of Open Access Journals (Sweden)
C. Mesado
2012-01-01
In nuclear safety analysis, it is very important to be able to simulate the different transients that can occur in a nuclear power plant with very high accuracy. Although best-estimate codes can simulate the transients and provide realistic system responses, the use of non-exact models, together with assumptions and estimations, is a source of uncertainties which must be properly evaluated. This paper describes a Rod Ejection Accident (REA) simulated using the coupled code RELAP5/PARCSv2.7 with a perturbation on the cross-sectional sets in order to determine the uncertainties in the macroscopic neutronic information. The procedure to perform the uncertainty and sensitivity (U&S) analysis is a sampling-based method which is easy to implement and allows different procedures for the sensitivity analyses despite its high computational time. The DAKOTA-Jaguar software package is the selected toolkit for the U&S analysis presented in this paper. The size of the sample is determined by applying Wilks’ formula for double tolerance limits with 95% coverage and 95% statistical confidence for the output variables. Each sample has a corresponding set of perturbations that will modify the cross-sectional sets used by PARCS. Finally, the tolerance intervals of the output variables are obtained by the use of non-parametric statistical methods.
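The sample size that Wilks' formula prescribes can be computed directly from the first-order tolerance-limit conditions. The sketch below is a standalone illustration (not code from the paper); it reproduces the standard results that 95%/95% coverage/confidence requires 59 runs for a one-sided limit and 93 runs for double (two-sided) tolerance limits.

```python
import math

def wilks_two_sided(beta: float, gamma: float) -> int:
    """Smallest n such that the sample min and max bound a fraction beta of
    the population with confidence gamma (first-order two-sided Wilks):
        1 - beta**n - n*(1 - beta)*beta**(n - 1) >= gamma
    """
    n = 2
    while 1 - beta ** n - n * (1 - beta) * beta ** (n - 1) < gamma:
        n += 1
    return n

def wilks_one_sided(beta: float, gamma: float) -> int:
    """Smallest n such that the sample max bounds a fraction beta of the
    population with confidence gamma: 1 - beta**n >= gamma."""
    return math.ceil(math.log(1 - gamma) / math.log(beta))

# 95% coverage with 95% confidence, as used for the U&S analysis above.
n_two = wilks_two_sided(0.95, 0.95)   # runs needed for double tolerance limits
n_one = wilks_one_sided(0.95, 0.95)   # runs needed for a one-sided limit
```

With these sample sizes, the smallest and largest observed values of each output variable serve directly as its non-parametric tolerance interval, which is why the abstract's sampling-based procedure needs no distributional assumption on the outputs.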
Directory of Open Access Journals (Sweden)
Ali Mohtashami
2013-01-01
Decision making on the make-or-buy problem has always been a challenge for decision makers. In this paper a methodology is proposed to resolve this challenge. The methodology is capable of evaluating make-or-buy decisions under uncertainty. To handle uncertainty, fuzzy logic and simulation approaches have been used. The proposed methodology can be applied to parts with multi-stage manufacturing processes and different suppliers, and therefore provides a scale for decision making that ranges from full outsourcing to full manufacturing, with selection of the appropriate supplier.
Chen, X.; Huang, G.
2017-12-01
In recent years, distributed hydrological models have been widely used in storm water management, water resources protection and related fields. How to evaluate the uncertainty of such models reasonably and efficiently has therefore become a topic of current interest. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for the study area of China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range of the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results showed that under the same number of simulations (10,000), the overall uncertainty obtained with range 2 is less than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In the calibration and the validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results of range 2 are better, with NS and R2 slightly higher than for range 1. Therefore, it can be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is a way to effectively capture and evaluate the simulation uncertainty.
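The GLUE procedure the abstract relies on (Monte Carlo sampling over an initial parameter range, a likelihood threshold that splits samples into "behavioral" and "non-behavioral" sets, and prediction bounds from the behavioral ensemble) can be sketched on a toy model. The linear stand-in model, the prior ranges and the NS > 0.8 threshold are assumptions for illustration, not the SWAT configuration of the study.

```python
import random

random.seed(1)

def toy_model(a, b, x):
    # Stand-in for the hydrological model (SWAT in the abstract).
    return a * x + b

# Synthetic "observations" from a known truth plus noise.
truth_a, truth_b = 2.0, 1.0
xs = [float(i) for i in range(10)]
obs = [toy_model(truth_a, truth_b, x) + random.gauss(0, 0.5) for x in xs]

def nash_sutcliffe(sim, ob):
    mean_ob = sum(ob) / len(ob)
    sse = sum((s - o) ** 2 for s, o in zip(sim, ob))
    var = sum((o - mean_ob) ** 2 for o in ob)
    return 1.0 - sse / var

# GLUE: Monte Carlo sampling over the initial parameter range, keeping the
# "behavioral" sets whose likelihood (NS here) exceeds a chosen threshold.
behavioral = []
for _ in range(5000):
    a = random.uniform(0.0, 4.0)    # initial range for a
    b = random.uniform(-2.0, 4.0)   # initial range for b
    ns = nash_sutcliffe([toy_model(a, b, x) for x in xs], obs)
    if ns > 0.8:                    # behavioral threshold (analyst's choice)
        behavioral.append((ns, a, b))

# 95% prediction bounds at one point from the behavioral ensemble.
preds = sorted(toy_model(a, b, 5.0) for _, a, b in behavioral)
lower = preds[int(0.025 * len(preds))]
upper = preds[int(0.975 * len(preds))]
```

Narrowing the initial sampling ranges (as the study does by seeding GLUE with SUFI-2-calibrated ranges) raises the fraction of behavioral samples and tightens the resulting bounds, which is the effect the P-factor/R-factor comparison quantifies.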
DEFF Research Database (Denmark)
Breinholt, Anders; Grum, Morten; Madsen, Henrik
2013-01-01
to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction...... rain inputs and more accurate flow observations to reduce parameter and model simulation uncertainty. © Author(s) 2013....
When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems
Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz
2015-03-01
Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise the limitations and uncertainties associated with components, and how these uncertainties might propagate throughout modelling frameworks, can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole-of-ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge of the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts and (iii) an inability to predict with confidence natural ecosystem variability and longer term changes as a result of external drivers (e.g. greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges associated with making appropriate choices on which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions.
International Nuclear Information System (INIS)
Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh
2011-01-01
The paper describes the propagation of void fraction uncertainty, as quantified by employing a novel methodology developed at the Paul Scherrer Institut, in the RETRAN-3D simulation of the Peach Bottom turbine trip test. Since the transient considered is characterized by a strong coupling between thermal-hydraulics and neutronics, the accuracy of the void fraction model has a very important influence on the prediction of the power history and, in particular, of the maximum power reached. It has been shown that the objective measures used for the void fraction uncertainty, based on the direct comparison between experimental and predicted values extracted from a database of appropriate separate-effect tests, provide power uncertainty bands that are narrower and more realistic than those based, for example, on expert opinion. The applicability of such an approach to best-estimate nuclear power plant transient analysis has thus been demonstrated.
Hu, Ming-Che
Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and the cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty? How do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; the value of information and the cost of ignoring uncertainty are then estimated for three uncertainties: carbon cap policy, load growth and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated, and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received
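The "expected value of information" and "expected cost of ignoring uncertainty" quantities in the first question have a standard definition that a toy two-stage problem makes concrete. The capacity-versus-shortage numbers below are invented for illustration; the MARKAL model is, of course, vastly larger.

```python
# Toy two-stage problem: choose capacity x at cost 1 per unit before demand
# d is known; unmet demand afterwards costs 3 per unit. (Illustrative only.)
scenarios = [(0.5, 50.0), (0.3, 100.0), (0.2, 200.0)]  # (probability, demand)

def expected_cost(x):
    return x + sum(p * 3.0 * max(d - x, 0.0) for p, d in scenarios)

# Stochastic (here-and-now) solution: search candidate capacities.
candidates = [float(v) for v in range(0, 201)]
x_sp = min(candidates, key=expected_cost)
cost_sp = expected_cost(x_sp)

# Expected cost with perfect information: match capacity to each scenario.
cost_wpi = sum(p * d for p, d in scenarios)
evpi = cost_sp - cost_wpi            # expected value of (perfect) information

# Expected cost of ignoring uncertainty: plan for the mean demand only,
# then pay the consequences in expectation.
x_mean = sum(p * d for p, d in scenarios)
eciu = expected_cost(x_mean) - cost_sp
```

Both quantities come out positive here (as the dissertation finds for carbon policy, load growth and gas prices): hedging against the scenarios beats planning for the mean, and a perfect forecast would be worth still more.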
Energy Technology Data Exchange (ETDEWEB)
Wu, Xu, E-mail: xuwu2@illinois.edu; Kozlowski, Tomasz
2017-03-15
Modeling and simulation are naturally augmented by extensive Uncertainty Quantification (UQ) and sensitivity analysis requirements in nuclear reactor system design, in which uncertainties must be quantified in order to prove that the investigated design stays within acceptance criteria. Historically, expert judgment has been used to specify the nominal values, probability density functions and upper and lower bounds of the simulation code random input parameters for the forward UQ process. The purpose of this paper is to replace such ad hoc expert judgment of the statistical properties of input model parameters with an inverse UQ process. Inverse UQ seeks statistical descriptions of the model random input parameters that are consistent with the experimental data. Bayesian analysis is used to establish the inverse UQ problems based on experimental data, with systematic and rigorously derived surrogate models based on Polynomial Chaos Expansion (PCE). The methods developed here are demonstrated with the Point Reactor Kinetics Equation (PRKE) coupled with a lumped-parameter thermal-hydraulics feedback model. Three input parameters, external reactivity, Doppler reactivity coefficient and coolant temperature coefficient, are modeled as uncertain input parameters. Their uncertainties are inversely quantified based on synthetic experimental data. Compared with direct numerical simulation, the surrogate model based on PCE shows high efficiency and accuracy. In addition, inverse UQ with Bayesian analysis can calibrate the random input parameters such that the simulation results are in better agreement with the experimental data.
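The Bayesian inverse-UQ step, stripped to its essentials, is posterior sampling of an input parameter conditioned on experimental data through a forward model. The sketch below uses a toy exponential forward model and a plain Metropolis sampler in place of the paper's PRKE surrogate and PCE machinery; the "true" parameter, prior range and noise level are synthetic assumptions.

```python
import math
import random

random.seed(3)

# Toy forward model with one uncertain input theta (standing in for, e.g.,
# a reactivity coefficient fed to the PRKE surrogate).
def forward(theta, t):
    return math.exp(-theta * t)

# Synthetic "experimental" data from a true theta plus measurement noise.
theta_true, sigma = 0.5, 0.02
ts = [0.5 * i for i in range(1, 11)]
data = [forward(theta_true, t) + random.gauss(0, sigma) for t in ts]

def log_posterior(theta):
    if not 0.0 < theta < 2.0:          # uniform prior on (0, 2)
        return -math.inf
    sse = sum((forward(theta, t) - d) ** 2 for t, d in zip(ts, data))
    return -sse / (2 * sigma ** 2)     # Gaussian likelihood (up to a constant)

# Metropolis sampler: inverse UQ yields a posterior for theta consistent
# with the data, replacing an ad hoc expert-judgment distribution.
theta, lp = 1.0, log_posterior(1.0)
samples = []
for i in range(20000):
    prop = theta + random.gauss(0, 0.05)
    lp_prop = log_posterior(prop)
    if random.random() < math.exp(min(0.0, lp_prop - lp)):
        theta, lp = prop, lp_prop
    if i >= 5000:                      # discard burn-in
        samples.append(theta)

posterior_mean = sum(samples) / len(samples)
```

In the paper the expensive forward model is replaced by a PCE surrogate precisely so that this kind of repeated posterior evaluation stays cheap; the sampler itself is unchanged by that substitution.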
Directory of Open Access Journals (Sweden)
Magali Troin
2015-11-01
An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SMs; degree-day and mixed degree-day/energy balance models) coupled with three hydrological models (HMs) is presented for a snowmelt-dominated basin in Canada. The present study aims to compare the performance and the reliability of different types of SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluation of the uncertainties associated with the structure of the SM-HM ensemble to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a 20-year period. Performances are similar to the calibration in simulating the daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results of specific hydrologic indicators show that the uncertainty related to the choice of the HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a low degree of uncertainty to the total uncertainty in hydrological modeling for snow hydrology studies.
Catastrophic antiphospholipid syndrome
International Nuclear Information System (INIS)
Medina Velasquez, Yimy; Felix Restrepo Suarez, Jose; Iglesias Gamarra, Antonio
2001-01-01
The antiphospholipid syndrome (APS) is characterized by venous and arterial thrombosis and miscarriages, along with lupus anticoagulant and anticardiolipin antibodies. The catastrophic antiphospholipid syndrome (CAPS) has been described since 1992 as a multiple organ dysfunction caused by multiple vascular thromboses in three or more organs. Patients who suffer from this syndrome may or may not have a history of APS. Two or three mechanisms may cause CAPS, alone or in combination: 1. multisystemic thrombotic disease, with emphasis on occlusion of the microvasculature of the organs and occlusion of large arteries or veins; 2. disseminated intravascular coagulation (DIC), superimposed in 15% to 50% of the patients, which leads to an occlusive disease of arterioles, venules or capillaries; 3. a systemic inflammatory response syndrome (SIRS) induced by cytokines. In this review, the clinical and laboratory features, pathogenesis and treatment of CAPS are described. For this purpose, Medline was searched from 1993 to 2000 and the most significant articles obtained by this means were reviewed.
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in
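The coupling this abstract describes (a quadratic response surface with two-factor interactions, fed by a Monte Carlo simulation over uncertain inputs to produce a dispersed response population) can be sketched compactly. The surface coefficients, input nominals and uncertainty levels below are invented for illustration and are not the regression-rate data of the cited experiments.

```python
import random

random.seed(11)

# Hypothetical quadratic response surface for hybrid fuel regression rate as
# a function of two coded inputs (say, oxidizer flux x1 and a mixture
# fraction x2), with one two-factor interaction term. Coefficients are
# illustrative only.
def regression_rate(x1, x2):
    return (1.0 + 0.30 * x1 + 0.15 * x2
            - 0.05 * x1 * x1 - 0.02 * x2 * x2
            + 0.04 * x1 * x2)

# Monte Carlo simulation: sample the inputs with their assumed uncertainty
# (normal here) and propagate through the surface to get the dispersed
# regression-rate population.
n = 20000
rates = []
for _ in range(n):
    x1 = random.gauss(0.5, 0.1)   # nominal 0.5, assumed std 0.1
    x2 = random.gauss(0.2, 0.1)   # nominal 0.2, assumed std 0.1
    rates.append(regression_rate(x1, x2))

rates.sort()
mean_rate = sum(rates) / n
p05, p95 = rates[int(0.05 * n)], rates[int(0.95 * n)]   # 90% dispersion band
```

Optimization under uncertainty then amounts to searching the input space for the operating/mixture point whose dispersed response distribution is best by the chosen criterion (e.g. highest mean rate subject to an acceptable spread).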
Energy Technology Data Exchange (ETDEWEB)
Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine
2009-12-01
The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative TC locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.
Salmi, Tiina; Marchevsky, Maxim; Bajas, Hugo; Felice, Helene; Stenvall, Antti
2015-01-01
The quench protection of superconducting high-field accelerator magnets is presently based on protection heaters, which are activated upon quench detection to accelerate the quench propagation within the winding. Estimations of the heater delay to initiate a normal zone in the coil are essential for the protection design. During the development of Nb3Sn magnets for the LHC luminosity upgrade, protection heater delays have been measured in several experiments, and a new computational tool CoHDA (Code for Heater Delay Analysis) has been developed for heater design. Several computational quench analyses suggest that the efficiency of the present heater technology is on the borderline of protecting the magnets. Quantifying the inevitable uncertainties related to the measured and simulated delays is therefore of pivotal importance. In this paper, we analyze the uncertainties in the heater delay measurements and simulations using data from five impregnated high-field Nb3Sn magnets with different heater geometries. ...
DEFF Research Database (Denmark)
Hiller, Jochen; Reindl, Leonard M
2012-01-01
The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described, and methods taking into account the main error sources for the measurement are presented. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical...
Prudencio, Ernesto E.; Schulz, Karl W.
2012-01-01
QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently
Tactical Decision Making under Categorical Uncertainty with Applications to Modeling and Simulation
National Research Council Canada - National Science Library
Kemmerer, Kacey E
2008-01-01
...) and individual differences affect response time in decision-making tasks. The researchers elicited real-world tactical scenarios from veterans of Operation Enduring Freedom and Operation Iraqi Freedom in which uncertainty was present...
International Nuclear Information System (INIS)
Gomes, Daniel S.; Teixeira, Antonio S.
2017-01-01
Although regulatory agencies have shown a special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits such as those in 10 CFR 50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in successfully analyzing the uncertainties involves performing a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time to cladding failure. The uncertainty analysis was performed using an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden Project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, mechanical model, and fuel swelling formulations. (author)
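The sampling-based sensitivity screening described above (correlating dispersed inputs against a response such as peak cladding temperature) can be sketched without Dakota or FRAPTRAN; the input multipliers, coefficients, and the toy response below are hypothetical stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical dispersed inputs (stand-ins for manufacturing tolerance,
# boundary conditions, and a physical-model multiplier).
gap = rng.normal(1.0, 0.05, n)       # fuel-clad gap multiplier
power = rng.normal(1.0, 0.03, n)     # rod power multiplier
conduct = rng.normal(1.0, 0.10, n)   # thermal-conductivity multiplier

# Toy linear response standing in for peak cladding temperature (K).
pct = 1100 + 150 * gap + 300 * power - 250 * conduct + rng.normal(0, 5, n)

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

for name, x in [("gap", gap), ("power", power), ("conductivity", conduct)]:
    print(f"{name:12s} rho = {spearman(x, pct):+.2f}")
```

Ranking inputs by the magnitude of such correlation indices is what identifies the dominant contributors to the propagated uncertainty.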
Energy Technology Data Exchange (ETDEWEB)
Gomes, Daniel S.; Teixeira, Antonio S., E-mail: dsgomes@ipen.br, E-mail: teixeira@ipen [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)
2017-07-01
Catastrophic Failure and Critical Scaling Laws of Fiber Bundle Material
Directory of Open Access Journals (Sweden)
Shengwang Hao
2017-05-01
This paper presents a spring-fiber bundle model used to describe the failure process induced by energy release in heterogeneous materials. The conditions that induce catastrophic failure are determined by geometric conditions and energy equilibrium. It is revealed that the relative rates of deformation of, and damage to, the fiber bundle with respect to the boundary controlling displacement ε0 exhibit universal power-law behavior near the catastrophic point, with a critical exponent of −1/2. The ratio of the response rate to its acceleration varies linearly with increasing displacement in the vicinity of the catastrophic point. This allows catastrophic failure to be predicted immediately prior to failure by extrapolating the trajectory of this relationship as it asymptotes to zero. Monte Carlo simulations are completed, and these two critical scaling laws are confirmed.
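The extrapolation idea (the ratio of the response rate to its acceleration approaches zero linearly at the catastrophic point) can be illustrated on synthetic data carrying the stated −1/2 exponent; the value of ε_c and the sampling range are arbitrary choices, not the paper's data:

```python
import numpy as np

eps_c = 1.0                       # true catastrophic displacement (synthetic)
eps = np.linspace(0.80, 0.95, 60)

# Synthetic response rate with the -1/2 power-law singularity at eps_c.
rate = (eps_c - eps) ** -0.5
accel = np.gradient(rate, eps)    # numerical "acceleration" of the response
ratio = rate / accel              # tends to 2*(eps_c - eps): linear in eps

# Fit the linear trend and extrapolate to ratio = 0 to predict failure.
slope, intercept = np.polyfit(eps, ratio, 1)
eps_pred = -intercept / slope
print(f"predicted failure displacement: {eps_pred:.3f} (true {eps_c})")
```

The prediction uses only data collected before failure, which is what makes the linear asymptote practically useful as a precursor signal.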
Hébrard, Eric; Carrasco, Nathalie; Dobrijevic, Michel; Pernot, Pascal
Ion Neutral Mass Spectrometer (INMS) aboard Cassini revealed a rich coupled ion-neutral chemistry in the ionosphere, producing heavy hydrocarbon and nitrile ions. The modeling of such a complex environment is challenging, as it requires a detailed and accurate description of the different relevant processes, such as photodissociation cross sections and neutral-neutral reaction rates on the one hand, and ionisation cross sections, ion-molecule and recombination reaction rates on the other. Underpinning model calculations, each of these processes is parameterized by kinetic constants which, when known, have been studied experimentally and/or theoretically over a range of temperatures and pressures that are most often not representative of Titan's atmosphere. The sizeable experimental and theoretical uncertainties reported in the literature therefore merge with the uncertainties resulting from the unavoidable estimations or extrapolations to Titan's atmospheric conditions. Such large overall uncertainties have to be accounted for in all resulting inferences, above all to evaluate the quality of the model definition. We have undertaken a systematic study of the uncertainty sources in the simulation of ion mass spectra as recorded by Cassini/INMS in Titan's ionosphere during the T5 flyby at 1200 km. Our simulated spectra seem much less affected by the uncertainties on ion-molecule reactions than by those on neutral-neutral reactions. Photochemical models of Titan's atmosphere are indeed so poorly predictive at high altitudes, in the sense that their computed predictions display such large uncertainties, that we found them to give rise to bimodal and hypersensitive abundance distributions for some major compounds like acetylene (C2H2) and ethylene (C2H4). We will show to what extent global uncertainty and sensitivity analysis enabled us to identify the causes of this bimodality and to pinpoint the key processes that mostly contribute to limit the accuracy of the
"But it might be a heart attack" : intolerance of uncertainty and panic disorder symptoms
Carleton, R Nicholas; Duranceau, Sophie; Freeston, Mark H; Boelen, Paul A|info:eu-repo/dai/nl/174011954; McCabe, Randi E; Antony, Martin M
Panic disorder models describe interactions between feared anxiety-related physical sensations (i.e., anxiety sensitivity; AS) and catastrophic interpretations therein. Intolerance of uncertainty (IU) has been implicated as necessary for catastrophic interpretations in community samples. The current
Gething, Peter W; Patil, Anand P; Hay, Simon I
2010-04-01
Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
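The contrast between joint and per-pixel simulation when aggregating over large areas can be sketched as follows; the flat prevalence surface, exponential covariance model, and all parameter values are invented for illustration and are far simpler than the MBG model of the study:

```python
import numpy as np

rng = np.random.default_rng(2)
npix, nsim = 400, 2000

# Hypothetical per-pixel prevalence means and a spatial correlation
# built from an exponential covariance on a 1-D transect of pixels.
mu = np.full(npix, 0.2)
d = np.abs(np.subtract.outer(np.arange(npix), np.arange(npix)))
cov = 0.01 * np.exp(-d / 50.0)          # sill 0.01, range 50 pixels

# Joint realizations: correlated draws via a Cholesky factor.
L = np.linalg.cholesky(cov + 1e-10 * np.eye(npix))
sims = mu + rng.standard_normal((nsim, npix)) @ L.T

regional_mean = sims.mean(axis=1)
print(f"regional mean: {regional_mean.mean():.3f} +/- {regional_mean.std():.3f}")

# Treating pixels as independent (local uncertainty only) understates
# the uncertainty of the aggregate:
indep = mu + rng.standard_normal((nsim, npix)) * np.sqrt(np.diag(cov))
print(f"independent-pixel aggregate std: {indep.mean(axis=1).std():.4f}")
```

Because spatially correlated errors do not average out across pixels, the joint simulation yields a much wider (and more honest) interval for the regional mean, which is the quantity decision-makers actually use.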
International Nuclear Information System (INIS)
Schaaf, K.; Trambauer, K.
1999-01-01
The findings of the TMI-2 post-accident analyses indicated that internal cooling mechanisms may have a considerable potential to sustain the vessel integrity after a relocation of core material to the lower plenum, provided that water is continuously available in the RPV. Numerous analytical and experimental research activities are currently underway in this respect. This paper illustrates some major findings of the experimental work on internal cooling mechanisms and describes the limitations and the uncertainties in the simulation of the heat transfer processes. Reference is made especially to the joint German DEBRIS/RPV research program, which encompasses the experimental investigation of the thermal-hydraulics in gaps, of the heat transfer within a particulate debris bed, and of the high temperature performance of vessel steel, as well as the development of simulation models for the heat transfer in the lower head and the structural response of the RPV. In particular, the results of uncertainty and sensitivity analyses are presented, which have been carried out at GRS using an integral model that describes the major phenomena governing the long-term integrity of the reactor vessel. The investigation of a large-scale relocation indicated that the verification of a gap cooling mechanism as an inherent mechanism is questionable in terms of a stringent probabilistic uncertainty criterion, as long as the formation of a large molten pool cannot be excluded. (author)
Energy Technology Data Exchange (ETDEWEB)
Smith, P.J.; Eddings, E.G.; Ring, T.; Thornock, J.; Draper, T.; Isaac, B.; Rezeai, D.; Toth, P.; Wu, Y.; Kelly, K.
2014-08-01
The objective of this task is to produce predictive capability with quantified uncertainty bounds for the heat flux in commercial-scale, tangentially fired, oxy-coal boilers. Validation data came from the Alstom Boiler Simulation Facility (BSF) for tangentially fired, oxy-coal operation. This task brings together experimental data collected under Alstom’s DOE project for measuring oxy-firing performance parameters in the BSF with this University of Utah project for large eddy simulation (LES) and validation/uncertainty quantification (V/UQ). The Utah work includes V/UQ with measurements in the single-burner facility, where advanced strategies for O2 injection can be more easily controlled and data more easily obtained. Highlights of the work include: • Simulations of Alstom’s 15 megawatt (MW) BSF, exploring the uncertainty in thermal boundary conditions. A V/UQ analysis showed consistency between experimental results and simulation results, identifying uncertainty bounds on the quantities of interest for this system (Subtask 9.1). • A simulation study of the University of Utah’s oxy-fuel combustor (OFC) focused on heat flux (Subtask 9.2). A V/UQ analysis was used to show consistency between experimental and simulation results. • Measurement of heat flux and temperature with new optical diagnostic techniques and comparison with conventional measurements (Subtask 9.3). Various optical diagnostic systems were created to provide experimental data to the simulation team. The final configuration utilized a mid-wave infrared (MWIR) camera to measure heat flux and temperature, which was synchronized with a high-speed visible camera to utilize two-color pyrometry to measure temperature and soot concentration. • Collection of heat flux and temperature measurements in the University of Utah’s OFC for use in Subtasks 9.2 and 9.3 (Subtask 9.4). Several replicates were carried out to better assess the experimental error. Experiments were specifically designed for the
Grum, M.; Aalderink, R.H.
1999-01-01
The return periods of detrimental effects are often used as design criteria in urban storm water management. Considerable uncertainty is associated with the models used. This is either ignored or pooled with the inherent event-to-event variation, such as rainfall depth. It is here argued that
Directory of Open Access Journals (Sweden)
Yongnan Zhu
2017-06-01
The performances of hydrological simulations for the Pearl River Basin in China were analysed using the Coupled Land Surface and Hydrological Model System (CLHMS). Three datasets, including East Asia (EA), the high-resolution gauge-satellite-merged China Merged Precipitation Analysis (CMPA-Daily), and the Asian Precipitation Highly-Resolved Observational Data Integration Towards Evaluation (APHRODITE) daily precipitation, were used to drive the CLHMS model to simulate daily hydrological processes from 1998 to 2006. The results indicate that the precipitation data was the most important source of uncertainty in the hydrological simulation. The simulated streamflow driven by the CMPA-Daily agreed well with observations, with a Pearson correlation coefficient (PMC) greater than 0.70 and an index of agreement (IOA) similarity coefficient greater than 0.82 at the Liuzhou, Shijiao, and Wuzhou stations. Comparison of the Nash-Sutcliffe efficiency coefficient (NSE) shows that the peak-flow simulation ability of CLHMS driven with the CMPA-Daily rainfall is superior to that with the EA and APHRODITE datasets. The simulation results for the high-flow periods in 1998 and 2005 indicate that the CLHMS is promising for future application in flood simulation and prediction.
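The skill scores quoted above (NSE and the index of agreement, alongside correlation) have standard definitions that can be sketched directly; the observed and simulated flow values below are made-up numbers for illustration, not Pearl River data:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 matches the obs mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def ioa(obs, sim):
    """Willmott index of agreement, bounded in [0, 1]."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    num = np.sum((obs - sim) ** 2)
    den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

obs = np.array([120.0, 340.0, 560.0, 410.0, 230.0])  # observed flow, m3/s
sim = np.array([100.0, 360.0, 530.0, 430.0, 260.0])  # simulated flow, m3/s
print(f"NSE = {nse(obs, sim):.3f}, IOA = {ioa(obs, sim):.3f}")
```

Because the NSE squares the residuals, it weights high-flow errors heavily, which is why it is the natural metric for comparing peak-flow skill across forcing datasets.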
Creating catastrophes in the classroom
Andersson, Thommy
2013-04-01
Buildings, infrastructure and human life are being destroyed by wind and landslides. To interest and motivate pupils and to help them understand abstract knowledge, practical experiments can be useful. These experiments will show why strong winds circulate around tropical cyclones and how fluvial geological processes affect nature and communities. The experiments are easy to set up and the equipment is not expensive. Experiment 1: Exogenic processes of water are often slow. This experiment simulates water processes that can take thousands of years, in less than 40 minutes, and can be presented to and understood by pupils at all levels. Letting the pupils build up the scenery will make them more curious about the course of events. During that time they will see the geomorphological genesis of landforms such as landslides, sandurs, deltas, canyons, sedimentation and selective erosion. Placing small houses, bridges, etc. can lead to discussions about natural catastrophes and community planning. Material needed for the experiment is a water bucket, an erosion gutter, clay (simulating rock), sand and smaller pebbles (simulating the soil), houses of "Monopoly" size, and tubes. Using a table with wheels makes it easy to reuse the result for other lessons. Installation of a pump can turn the experiment into a closed-loop system, which can be used for presentations outside the classroom. Experiment 2: The Coriolis effect explains why the wind (moving objects) deflects when moving: clockwise in the northern hemisphere and anti-clockwise in the southern hemisphere. This abstract effect is often hard for upper secondary pupils to understand; this experiment makes the theory real and visible. Material needed for this experiment is a bucket, pipes and a string. At my school we had cooperation with pupils from the Industrial Technology programme who made a copper pipe construction. During the
Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H
2016-01-01
The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
Gnoffo, Peter A.; Berry, Scott A.; VanNorman, John W.
2011-01-01
This paper is one of a series of five papers in a special session organized by the NASA Fundamental Aeronautics Program that addresses uncertainty assessments for CFD simulations in hypersonic flow. Simulations of a shock emanating from a compression corner and interacting with a fully developed turbulent boundary layer are evaluated herein. Mission-relevant conditions at Mach 7 and Mach 14 are defined for a pre-compression ramp of a scramjet-powered vehicle. Three compression angles are defined, the smallest to avoid separation losses and the largest to force a separated flow engaging more complicated flow physics. The Baldwin-Lomax and Cebeci-Smith algebraic models, the one-equation Spalart-Allmaras model with the Catris-Aupoix compressibility modification, and two-equation models including Menter SST, Wilcox k-omega 98, and Wilcox k-omega 06 are evaluated. Each model is fully defined herein to preclude any ambiguity regarding model implementation. Comparisons are made to existing experimental data and Van Driest theory to provide a preliminary assessment of model-form uncertainty. A set of coarse-grained uncertainty metrics is defined to capture essential differences among turbulence models. Except for the inability of algebraic models to converge for some separated flows, there is no clearly superior model as judged by these metrics. A preliminary metric for the numerical component of uncertainty in shock-turbulent-boundary-layer interactions at compression corners sufficiently steep to cause separation is defined as 55%. This value is a median of differences with experimental data averaged for peak pressure and heating and for extent of separation captured in new, grid-converged solutions presented here. This value is consistent with existing results in a literature review of hypersonic shock-turbulent-boundary-layer interactions by Roy and Blottner and with more recent computations of MacLean.
Hosseinzadehtalaei, Parisa; Tabari, Hossein; Willems, Patrick
2018-02-01
An ensemble of 88 regional climate model (RCM) simulations at 0.11° and 0.44° spatial resolutions from the EURO-CORDEX project is analyzed for central Belgium to investigate the projected impact of climate change on precipitation intensity-duration-frequency (IDF) relationships and extreme precipitation quantiles typically used in water engineering designs. The rate of uncertainty arising from the choice of RCM, driving GCM, and radiative concentration pathway (RCP4.5 & RCP8.5) is quantified using a variance decomposition technique after reconstruction of missing data in GCM × RCM combinations. A comparative analysis between the historical simulations of the EURO-CORDEX 0.11° and 0.44° RCMs shows higher precipitation intensities by the finer resolution runs, leading to a larger overestimation of the observations-based IDFs by the 0.11° runs. The results reveal that making a temporal stationarity assumption for the climate system may lead to underestimation of precipitation quantiles up to 70% by the end of this century. This projected increase is generally larger for the 0.11° RCMs compared with the 0.44° RCMs. The relative changes in extreme precipitation do depend on return period and duration, indicating an amplification for larger return periods and for smaller durations. The variance decomposition approach generally identifies RCM as the most dominant component of uncertainty in changes of more extreme precipitation (return period of 10 years) for both 0.11° and 0.44° resolutions, followed by GCM and RCP scenario. The uncertainties associated with cross-contributions of RCMs, GCMs, and RCPs play a non-negligible role in the associated uncertainties of the changes.
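A toy version of the variance decomposition across GCM, RCM, and RCP factors might look as follows; the additive effects are synthetic stand-ins constructed so that the RCM spread dominates, mirroring the paper's qualitative finding rather than reproducing its data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical relative changes (%) of an extreme precipitation quantile
# for every GCM x RCM x RCP combination (5 x 6 x 2 synthetic ensemble).
gcm_eff = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])[:, None, None]
rcm_eff = np.array([-10.0, -6.0, -2.0, 2.0, 6.0, 10.0])[None, :, None]
rcp_eff = np.array([-2.0, 2.0])[None, None, :]
change = 15 + gcm_eff + rcm_eff + rcp_eff + rng.normal(0, 0.5, (5, 6, 2))

def main_effect_var(x, axis):
    """Variance of the factor-level means along one axis (ANOVA main effect)."""
    other = tuple(a for a in range(x.ndim) if a != axis)
    return np.var(x.mean(axis=other))

total = np.var(change)
for name, ax in [("GCM", 0), ("RCM", 1), ("RCP", 2)]:
    share = 100 * main_effect_var(change, ax) / total
    print(f"{name}: {share:.0f}% of ensemble variance")
```

The residual variance not captured by the three main effects corresponds to the cross-contributions (interactions) that the abstract notes are non-negligible.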
Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; Benki, Aalae
2017-04-01
Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and damages to structures. Advances in numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be performed by taking advantage of observations either of tsunami waves (using networks of water level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like 1755 Lisbon. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea bottom pressure gauges, GPS-mounted buoys), tide gauge records can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., maximum reached wave height at the coast) instead of the full observed waveforms. 2) Tsunami phenomena involve a large span of spatial scales (from ocean basin scales to local coastal wave interactions), which can make the modelling very demanding: the computation time of a tsunami simulation can be prohibitive, often reaching several hours. This often limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both afore-described difficulties with a view to combining historical observations on past tsunami-induced waves
Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia
2017-10-01
The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.
A simulation study of organizational decision making under conditions of uncertainty and ambiguity.
Athens, Arthur J.
1983-01-01
Approved for public release; distribution is unlimited. The usual frameworks applied to the analysis of military decision making describe the decision process according to the rational model. The assumptions inherent in this model, however, are not consistent with the reality of warfare's inherent uncertainty and complexity. A better model is needed to address the ambiguity actually confronting the combat commander. The garbage can model of organizational choice, a nonrational approach to...
Energy catastrophes and energy consumption
International Nuclear Information System (INIS)
Davis, G.
1991-01-01
The possibility of energy catastrophes in the production of energy makes estimation of the true social costs of energy production difficult. As a result, there is a distinct possibility that the private marginal cost curve of energy producers lies to the left or right of the true cost curve. If so, social welfare will not be maximized, and underconsumption or overconsumption of fuels will exist. The occurrence of energy catastrophes and observation of the market reaction to these occurrences indicate that overconsumption of energy has been the case in the past. Postulations as to market reactions to further energy catastrophes lead to the presumption that energy consumption levels remain above those that are socially optimal.
Fu, A.; Xue, Y.
2017-12-01
Corn is one of the most important agricultural products in China. Research on the simulation of corn yields and the impacts of climate change and agricultural management practices on those yields is important for maintaining stable corn production. After climatic data (daily temperature, precipitation, solar radiation, relative humidity, and wind speed from 1948 to 2010), soil properties, observed corn yields, and farmland management information were collected, corn yields grown in a humid and hot environment (Sichuan Province) and a cold and dry environment (Hebei Province) in China over the past 63 years were simulated with Daycent, and the results were evaluated against published yield records. The relationship between regional climate change, global warming, and corn yield was analyzed, and the uncertainties of the simulation arising from agricultural management practices were assessed by changing fertilization levels, land fertilizer maintenance, and tillage methods. The results showed that: (1) the Daycent model is capable of simulating corn yields under different climatic backgrounds in China; (2) observed and simulated corn yields increased along with overall regional climate change; and (3) newly simulated corn yields, after removing the global warming trend from the original temperature data, were lower than before.
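Removing the global warming trend from a temperature series, as in result (3), can be sketched with a linear detrend that preserves the baseline mean; the series below is synthetic, with an assumed 0.02 C/yr trend, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(4)
years = np.arange(1948, 2011)

# Synthetic annual-mean temperature: interannual noise plus a global
# warming trend of 0.02 C/yr (illustrative values only).
temp = 12.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.4, years.size)

# Remove the fitted linear warming trend but keep the baseline mean, so
# the detrended series isolates interannual variability for yield runs.
slope, intercept = np.polyfit(years, temp, 1)
detrended = temp - (slope * years + intercept) + temp.mean()

print(f"fitted trend: {slope:.4f} C/yr")
print(f"residual trend after detrending: {np.polyfit(years, detrended, 1)[0]:.1e} C/yr")
```

Driving the crop model with the detrended series and comparing yields against the original run isolates the contribution of the warming trend itself.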
Energy Technology Data Exchange (ETDEWEB)
Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Witkowski, Walter R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-04-13
The importance of credible, trustworthy numerical simulations is obvious, especially when the results are used for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, the associated uncertainties, and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use, commonly done through a Phenomena Identification and Ranking Table (PIRT). An assessment of the evidence basis supporting the ability to computationally simulate these physics can then be performed using frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification; these are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification and validation procedure is a credibility assessment, to fully understand the requirements and limitations of the current computational simulation capability for the specific intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent way to perform such an assessment. Ideally, all stakeholders should be represented and contribute to an accurate credibility assessment. PIRTs and PCMMs are both described briefly below, and the resulting assessments for an example project are given.
Energy Technology Data Exchange (ETDEWEB)
Zwermann, Winfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany). Forschungszentrum
2017-12-15
The supplementation of reactor simulations by uncertainty analyses is becoming increasingly important internationally, because the reliability of simulation calculations can be significantly increased by the quantification of uncertainties, in comparison with the use of so-called conservative methods (BEPU, "Best Estimate Plus Uncertainties"). While systematic uncertainty analyses for thermal-hydraulic calculations have been performed routinely for a long time, methods for taking into account uncertainties in nuclear data, which are the basis for neutron transport calculations, are under development. The Focus Session "Uncertainty Analyses in Reactor Core Simulations" was intended to provide an overview of international research and development on supplementing reactor core simulations with uncertainty and sensitivity analyses, in research institutes as well as within the nuclear industry. The presented analyses focused not only on light water reactors but also on advanced reactor systems. Particular emphasis was put on international benchmarks in the field. The session was chaired by Winfried Zwermann (Gesellschaft fuer Anlagen- und Reaktorsicherheit).
Hopfe, C.J.
2009-01-01
Building performance simulation (BPS) uses computer-based models that cover performance aspects such as energy consumption and thermal comfort in buildings. The uptake of BPS in current building design projects is limited. Although there is a large number of building simulation tools available, the
Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
Energy Technology Data Exchange (ETDEWEB)
Biros, George [Univ. of Texas, Austin, TX (United States)
2018-01-12
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a
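The randomized maximum likelihood (RML) idea mentioned in item 1 can be sketched for a toy scalar linear-Gaussian problem, where each posterior sample is the closed-form solution of a perturbed, regularized least-squares problem. All values below (the parameter-to-observable map g, noise levels, observation) are invented for illustration; at extreme scale the "argmin" becomes a PDE-constrained optimization solve.

```python
import random

def rml_sample(g, y, sigma, m_pr, s, rng):
    """One randomized-maximum-likelihood (RML) posterior sample for the
    scalar linear model y = g*m + noise: perturb the data and the prior
    mean, then solve the regularized least-squares problem in closed
    form.  For linear-Gaussian problems RML samples the exact posterior."""
    y_pert = y + rng.gauss(0.0, sigma)        # perturbed observation
    m_pert = m_pr + rng.gauss(0.0, s)         # perturbed prior mean
    # argmin over m of (g*m - y_pert)^2/sigma^2 + (m - m_pert)^2/s^2
    return (g * y_pert / sigma ** 2 + m_pert / s ** 2) / (g ** 2 / sigma ** 2 + 1.0 / s ** 2)

rng = random.Random(0)
g, sigma, m_pr, s = 2.0, 0.5, 0.0, 1.0        # invented toy problem
y = 1.8                                       # one noisy observation
samples = [rml_sample(g, y, sigma, m_pr, s, rng) for _ in range(20000)]

post_var = 1.0 / (g ** 2 / sigma ** 2 + 1.0 / s ** 2)   # analytic posterior
post_mean = post_var * g * y / sigma ** 2
```

Unlike a Markov chain, the RML samples are independent, which is why the approach parallelizes so naturally.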
The impact of possible climate catastrophes on global warming policy
International Nuclear Information System (INIS)
Baranzini, Andrea; Chesney, Marc; Morisset, Jacques
2003-01-01
Recent studies on global warming have introduced the inherent uncertainties associated with the costs and benefits of climate policies and have often shown that abatement policies are likely to be less aggressive or postponed in comparison with those resulting from traditional cost-benefit analyses (CBA). Yet those studies have failed to include the possibility of sudden climate catastrophes. The aim of this paper is to account simultaneously for possible continuous and discrete damages resulting from global warming, and to analyse their implications for the optimal path of abatement policies. Our approach is related to the new literature on investment under uncertainty, and relies on recent developments in real options theory, into which we incorporate negative jumps (climate catastrophes) in the stochastic process for the net benefits associated with the abatement policies. The impacts of continuous and discrete climatic risks can therefore be considered separately. Our numerical applications lead to two main conclusions: (i) gradual, continuous uncertainty in the global warming process is likely to delay the adoption of abatement policies, as found in previous studies, with respect to the standard CBA; however, (ii) the possibility of climate catastrophes accelerates the implementation of these policies, as their net discounted benefits increase significantly.
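The stochastic process described here, continuous (Brownian) uncertainty plus Poisson-driven negative jumps, can be sketched as an Euler-discretized jump diffusion. All parameter values below (drift, volatility, catastrophe intensity, jump size) are invented for illustration and are not taken from the paper:

```python
import math, random

def jump_diffusion_path(b0, mu, sigma, lam, jump_frac, T, n, rng):
    """Euler-discretized geometric Brownian motion with Poisson-driven
    downward jumps, a sketch of the 'net benefits with climate
    catastrophes' process described above."""
    dt = T / n
    b, path = b0, [b0]
    for _ in range(n):
        # continuous (Brownian) part
        b *= math.exp((mu - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        # discrete part: a catastrophe arrives with probability lam*dt
        if rng.random() < lam * dt:
            b *= 1.0 - jump_frac
        path.append(b)
    return path

rng = random.Random(42)
path = jump_diffusion_path(b0=100.0, mu=0.03, sigma=0.2,
                           lam=0.1, jump_frac=0.4, T=30.0, n=360, rng=rng)
```

Separating the two terms is what lets the continuous and discrete climatic risks be varied independently in the numerical experiments.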
Prudencio, Ernesto E.
2012-01-01
QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently abstract in order to handle a large spectrum of models, (b) be algorithmically extensible, allowing an easy insertion of new and improved algorithms, and (c) take advantage of parallel computing, in order to handle realistic models. Such objectives demand a combination of an object-oriented design with robust software engineering practices. QUESO is written in C++, uses MPI, and leverages libraries already available to the scientific community. We describe some UQ concepts, present QUESO, and list planned enhancements.
Predicting natural catastrophes tsunamis
CERN. Geneva
2005-01-01
1. Tsunamis: introduction - definition of the phenomenon - basic properties of the waves - propagation and dispersion - interaction with coasts - geological and societal effects - origin of tsunamis: natural sources - scientific activities in connection with tsunamis - ideas about simulations
2. Tsunami generation: the earthquake source (conventional theory) - the earthquake source (normal mode theory) - the landslide source - near-field observation (the Plafker index) - far-field observation (directivity)
3. Tsunami warning: general ideas - history of efforts - mantle magnitudes and TREMOR algorithms - the challenge of "tsunami earthquakes" - energy-moment ratios and slow earthquakes - implementation and the components of warning centers
4. Tsunami surveys: principles and methodologies - fifteen years of field surveys and related milestones - reconstructing historical tsunamis: eyewitnesses and geological evidence
5. Lessons from the 2004 Indonesian tsunami: lessons in seismology - lessons in geology - the new technologies - lessons in civ...
Financing Losses from Catastrophic Risks
2008-11-01
often held in the form of bonds, the interest on which is subject to corporate income tax, which reduces the net earnings to each insurer's shareholders ... course; it is a basic feature of the corporate income tax. But, as explained above, catastrophe insurance is distinguished from other types of
Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.
2017-12-01
Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is `actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the
Uncertainty versus variability in Monte Carlo simulations of human exposure through food pathways
International Nuclear Information System (INIS)
McKone, T.E.
1994-01-01
An important issue in both the risk characterization and the subsequent risk management of contaminated soil is how precisely we can characterize the distribution among individuals of potential doses associated with chemical contaminants in soil, and whether this level of precision favors the use of population distributions of exposure over single-scenario representations. For lipophilic contaminants, such as dioxins, furans, polychlorinated biphenyls, and pesticides, and for metals such as lead and mercury, exposures through food have been demonstrated to be dominant contributors to total dose within non-occupationally exposed populations. However, overall uncertainties in estimating potential doses through food chains are much larger than the uncertainties associated with other exposure pathways. A general model is described here for estimating the ratio of potential dose to contaminant concentration in soil for homegrown foods contaminated by lipophilic, nonionic organic chemicals. This model includes parameters describing homegrown food consumption rates, exposure duration, biotransfer factors, and partition factors. For the parameters needed in this model, the mean and variance are often the only moments of the parameter distribution available. Parameters are divided into three categories: uncertain parameters, variable parameters, and mixed uncertain/variable parameters. Using soils contaminated by hexachlorobenzene (HCB) and benzo(a)pyrene (BaP) as case studies, a stepwise Monte Carlo analysis is used to develop a histogram that apportions variance in the outcome (the ratio of potential dose by food pathways to soil concentration) to variance in each of the three input categories. The results represent potential doses in households consuming homegrown foods.
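A minimal sketch of the variance-apportionment idea, not the authors' actual exposure model: draw all inputs and record the output variance, then freeze one input category at its median and attribute the drop in variance to that category. The multiplicative toy model and the lognormal spreads below are assumptions for illustration only:

```python
import random

def output(u, v, m):
    """Toy dose-to-concentration ratio built from three input categories:
    uncertain (u), variable (v), and mixed uncertain/variable (m).  The
    multiplicative form is an illustrative assumption."""
    return u * v * m

def variance(xs):
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

rng = random.Random(7)
N = 50000

def draw():
    return (rng.lognormvariate(0, 0.5),   # uncertain parameters
            rng.lognormvariate(0, 0.3),   # variable parameters
            rng.lognormvariate(0, 0.2))   # mixed parameters

total = variance([output(*draw()) for _ in range(N)])
# Freeze the 'uncertain' category at its median (1.0) to see how much
# of the output variance it carried.
fixed_u = variance([output(1.0, v, m) for _, v, m in (draw() for _ in range(N))])
share_u = 1.0 - fixed_u / total
```

Repeating the freeze for each category yields the histogram of variance shares described in the abstract.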
Stochastic reservoir simulation for the modeling of uncertainty in coal seam degasification
Karacan, C. Özgen; Olea, Ricardo A.
2015-01-01
Coal seam degasification improves coal mine safety by reducing the gas content of coal seams and also by generating added value as an energy source. Coal seam reservoir simulation is one of the most effective ways to help with these two main objectives. As in all modeling and simulation studies, how the reservoir is defined and whether observed productions can be predicted are important considerations.
Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System
2017-08-01
related to the numerical structuring of a problem, such as cell size, domain extent, and system orientation. Depth of penetration of a threat into a ... system in the simulation codes is tied to the domain structure, with coordinate axes aligned with cell edges. However, the position of the coordinate ... physical systems are generally described by sets of equations involving continuous variables, such as time and position. Computational simulations
DEFF Research Database (Denmark)
Wang, Weizhi; Wu, Minghao; Palm, Johannes
2018-01-01
The wave loads and the resulting motions of floating wave energy converters are traditionally computed using linear radiation–diffraction methods. Yet for certain cases such as survival conditions, phase control and wave energy converters operating in the resonance region, more complete ... dynamics simulations have largely been overlooked in the wave energy sector. In this article, we apply formal verification and validation techniques to computational fluid dynamics simulations of a passively controlled point absorber. The phase control causes the motion response to be highly nonlinear even for almost linear incident waves. First, we show that the computational fluid dynamics simulations have acceptable agreement to experimental data. We then present a verification and validation study focusing on the solution verification covering spatial and temporal discretization, iterative and domain ...
Adam Duarte; Jeffrey Hatfield; Todd M. Swannack; Michael R. J. Forstner; M. Clay Green; Floyd W. Weckerly
2015-01-01
Population viability analyses provide a quantitative approach that seeks to predict the possible future status of a species of interest under different scenarios and, therefore, can be important components of large-scale species’ conservation programs. We created a model and simulated range-wide population and breeding habitat dynamics for an endangered woodland warbler, the golden-cheeked warbler (Setophaga chrysoparia). Habitat-transition probabilities were estimated across the warbler's breeding range by combining National Land Cover Database imagery with multistate modeling. Using these estimates, along with recently published demographic estimates, we examined if the species can remain viable into the future given the current conditions. Lastly, we evaluated if protecting a greater amount of habitat would increase the number of warblers that can be supported in the future by systematically increasing the amount of protected habitat and comparing the estimated terminal carrying capacity at the end of 50 years of simulated habitat change. The estimated habitat-transition probabilities supported the hypothesis that habitat transitions are unidirectional, whereby habitat is more likely to diminish than regenerate. The model results indicated population viability could be achieved under current conditions, depending on dispersal. However, there is considerable uncertainty associated with the population projections due to parametric uncertainty. Model results suggested that increasing the amount of protected lands would have a substantial impact on terminal carrying capacities at the end of a 50-year simulation. Notably, this study identifies the need for collecting the data required to estimate demographic parameters in relation to changes in habitat metrics and population density in multiple regions, and highlights the importance of establishing a common definition of what constitutes protected habitat, what management goals are suitable within those protected
International Nuclear Information System (INIS)
Boutahar, Jaouad
2004-01-01
In an integrated impact assessment, one has to test several scenarios of the model inputs and/or to identify the effects of model input uncertainties on the model outputs. In both cases, a large number of simulations of the model is necessary. That is of course not feasible with a comprehensive Chemistry-Transport Model, owing to the huge CPU time required. Two approaches may be used to circumvent these difficulties. The first approach consists in reducing the computational cost of the original model by building a reduced model. Two reduction techniques are used: the first method, POD, is related to the statistical behaviour of the system and is based on a proper orthogonal decomposition of the solutions; the second method is an efficient representation of the input/output behaviour through look-up tables, describing the model output as an expansion of finite hierarchical correlated functions of the input variables. The second approach is based on reducing the number of model runs required by the standard Monte Carlo methods. It characterizes the probabilistic response of the uncertain model output as an expansion of orthogonal polynomials in the uncertain model inputs; classical Monte Carlo simulation can then easily be used to compute the probability density of the uncertain output. Another key point in an integrated impact assessment is to develop strategies for the reduction of emissions by computing source/receptor matrices for several years of simulations. We propose an efficient method to calculate these matrices by using the adjoint model, and in particular by defining the 'representative chemical day'. All of these methods are applied to POLAIR3D, a Chemistry-Transport model developed in this thesis. (author)
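The second approach, expanding the uncertain output in orthogonal polynomials of the inputs, can be sketched for a single standard-normal input using probabilists' Hermite polynomials: the coefficients are estimated by Monte Carlo projection, and the output mean and variance follow directly from them. The toy model f below is an assumption for illustration:

```python
import math, random

def hermite(k, x):
    """Probabilists' Hermite polynomial He_k(x) via the recurrence
    He_{n+1}(x) = x*He_n(x) - n*He_{n-1}(x)."""
    if k == 0:
        return 1.0
    h_prev, h = 1.0, x
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

def pce_coeffs(f, order, n_samples, rng):
    """Estimate the chaos coefficients c_k = E[f(X) He_k(X)] / k!
    for X ~ N(0, 1) by Monte Carlo projection."""
    xis = [rng.gauss(0.0, 1.0) for _ in range(n_samples)]
    return [sum(f(x) * hermite(k, x) for x in xis) / n_samples / math.factorial(k)
            for k in range(order + 1)]

rng = random.Random(3)
f = lambda xi: math.exp(0.3 * xi)        # toy uncertain model output
c = pce_coeffs(f, order=4, n_samples=100000, rng=rng)
pce_mean = c[0]                          # E[f] from the expansion
pce_var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, 5))
```

Once the coefficients are known, sampling the cheap polynomial surrogate (rather than the full model) gives the probability density of the output, which is the computational saving the abstract describes.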
GLOBAL RANDOM WALK SIMULATIONS FOR SENSITIVITY AND UNCERTAINTY ANALYSIS OF PASSIVE TRANSPORT MODELS
Directory of Open Access Journals (Sweden)
Nicolae Suciu
2011-07-01
The Global Random Walk algorithm (GRW) performs a simultaneous tracking, on a fixed grid, of huge numbers of particles at costs comparable to those of a single-trajectory simulation by the traditional Particle Tracking (PT) approach. Statistical ensembles of GRW simulations of a typical advection-dispersion process in groundwater systems with randomly distributed spatial parameters are used to obtain reliable estimates of the input parameters of the upscaled transport model and of their correlations, of input-output correlations, and of the full probability distributions of the input and output parameters.
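A minimal one-dimensional GRW sketch, not the authors' implementation: the simulation state is the per-cell particle count, and each step splits every count at random among stay/left/right moves, so the cost scales with the grid rather than with individual trajectories. The naive binomial sampler keeps the example self-contained:

```python
import random

def binom(n, p, rng):
    # Naive O(n) binomial sampler; a production GRW uses an O(1) sampler.
    return sum(1 for _ in range(n) if rng.random() < p)

def grw_step(counts, p_move, rng):
    """One step of a 1-D Global Random Walk on a fixed grid: each cell's
    particle count is split at random between staying and moving to the
    left/right neighbour (reflecting boundaries), so the state is the
    vector of per-cell counts, never individual trajectories."""
    n = len(counts)
    new = [0] * n
    for i, c in enumerate(counts):
        movers = binom(c, p_move, rng)        # particles leaving cell i
        to_left = binom(movers, 0.5, rng)     # split movers left/right
        new[i] += c - movers
        new[max(i - 1, 0)] += to_left
        new[min(i + 1, n - 1)] += movers - to_left
    return new

rng = random.Random(11)
counts = [0] * 101
counts[50] = 10000                            # all particles start mid-domain
for _ in range(100):
    counts = grw_step(counts, p_move=0.5, rng=rng)
```

Running an ensemble of such simulations over random realizations of the spatial parameters yields the input-output statistics the abstract refers to.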
Ye, L.; Wu, J.; Wang, L.; Song, T.; Ji, R.
2017-12-01
Flooding in small-scale watersheds in hilly areas is characterized by short duration and rapid rise and recession, due to complex underlying surfaces, varied climate types, and the strong effect of human activities. It is almost impossible for a single hydrological model to describe the variation of flooding in both time and space accurately for all catchments in hilly areas, because hydrological characteristics can vary significantly among catchments. In this study, we compare the performance of five hydrological models of varying complexity in simulating flash floods for 14 small-scale watersheds in China, in order to relate the applicability of the hydrological models to catchment characteristics. Meanwhile, given that hydrological data are sparse in hilly areas, the effects of precipitation data, DEM resolution, and their interaction on the uncertainty of flood simulation are also illustrated. In general, the results showed that the distributed hydrological model (HEC-HMS in this study) performed better than the lumped hydrological models. The Xinanjiang and API models simulated the humid catchments well when long-term, continuous rainfall data were provided. The Dahuofang model simulated the flood peak well, while its runoff generation module was relatively poor. In addition, the effects of the various modelling data on the simulations are not simply additive; there is a complex interaction among the different modelling data. Overall, both the catchment hydrological characteristics and the available modelling data should be taken into consideration when choosing a suitable hydrological model for flood simulation in small-scale catchments in hilly areas.
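Model-performance comparisons like these are commonly scored with the Nash-Sutcliffe efficiency (NSE); the abstract does not name its skill metric, so the NSE below is an assumed, standard choice, applied to an invented hydrograph:

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency, a standard skill score for hydrological
    models: 1 is a perfect fit, 0 means the model is no better than
    predicting the observed mean, and negative values are worse."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [12.0, 35.0, 80.0, 140.0, 95.0, 40.0, 18.0]   # invented hydrograph (m^3/s)
sim = [10.0, 30.0, 85.0, 130.0, 100.0, 45.0, 20.0]  # invented simulation
score = nse(obs, sim)
```

Computing NSE per catchment and per model makes statements like "the distributed model performed better than the lumped models" directly comparable across the 14 watersheds.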
Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation
Minasny, B.; Vrugt, J.A.; McBratney, A.B.
2011-01-01
This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior
Torrealba, Victor A.; Hoteit, Hussein; Chawathe, Adwait
2017-01-01
and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process
Vrugt, J.A.; Braak, ter C.J.F.; Clark, M.P.; Hyman, J.M.; Robinson, B.A.
2008-01-01
There is increasing consensus in the hydrologic literature that an appropriate framework for streamflow forecasting and simulation should include explicit recognition of forcing and parameter and model structural error. This paper presents a novel Markov chain Monte Carlo (MCMC) sampler, entitled
Structural Uncertainty in Model-Simulated Trends of Global Gross Primary Production
Directory of Open Access Journals (Sweden)
Zaichun Zhu
2013-03-01
Projected changes in the frequency and severity of droughts as a result of increases in greenhouse gases have a significant impact on the role of vegetation in regulating the global carbon cycle. The drought effect on vegetation Gross Primary Production (GPP) is usually modeled as a function of Vapor Pressure Deficit (VPD) and/or soil moisture. Climate projections suggest a strong likelihood of an increasing trend in VPD, while regional changes in precipitation are less certain. This difference in projections between VPD and precipitation can cause considerable discrepancies in predictions of vegetation behavior, depending on how ecosystem models represent the drought effect. In this study, we scrutinized the model responses to drought using the 30-year record of the Global Inventory Modeling and Mapping Studies (GIMMS) 3g Normalized Difference Vegetation Index (NDVI) dataset. A diagnostic ecosystem model, the Terrestrial Observation and Prediction System (TOPS), was used to estimate global GPP from 1982 to 2009 under nine different experimental simulations. The control run of global GPP increased until 2000, but stayed constant after 2000. Among the simulations with a single climate constraint (temperature, VPD, rainfall, or solar radiation), only the VPD-driven simulation showed a decrease in the 2000s, while the other scenarios simulated an increase in GPP. The diverging responses in the 2000s can be attributed to the difference in the representation of the impact of water stress on vegetation in the models, i.e., using VPD and/or precipitation. The spatial map of the trend in simulated GPP using GIMMS 3g data is more consistent with the GPP driven by soil moisture than with the GPP driven by VPD, confirming the need for a soil moisture constraint in modeling global GPP.
Energy Technology Data Exchange (ETDEWEB)
Shahnam, Mehrdad [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate; Gel, Aytekin [ALPEMI Consulting, LLC, Phoeniz, AZ (United States); Subramaniyan, Arun K. [GE Global Research Center, Niskayuna, NY (United States); Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate; Dietiker, Jean-Francois [West Virginia Univ. Research Corporation, Morgantown, WV (United States)
2017-10-02
Adequate assessment of the uncertainties in modeling and simulation is becoming an integral part of simulation-based engineering design. The goal of this study is to demonstrate the application of a non-intrusive Bayesian uncertainty quantification (UQ) methodology in multiphase (gas-solid) flows with experimental and simulation data, as part of our research efforts to determine the approach best suited for UQ of a bench-scale fluidized bed gasifier. UQ analysis was first performed on the available experimental data. Global sensitivity analysis performed as part of the UQ analysis shows that, among the three operating factors, the steam-to-oxygen ratio has the most influence on syngas composition in the bench-scale gasifier experiments. An analysis of the forward propagation of uncertainties was performed, and the results show that an increase in the steam-to-oxygen ratio leads to an increase in H2 mole fraction and a decrease in CO mole fraction. These findings are in agreement with the ANOVA analysis performed in the reference experimental study. A further contribution, in addition to the UQ analysis, is an optimization-based approach to identifying the next best set of experimental samples, should additional experiments become possible: the surrogate models constructed as part of the UQ analysis are employed to improve the information gain and make incremental recommendations. In the second step, a series of simulations was carried out with the open-source computational fluid dynamics software MFiX to reproduce the experimental conditions, in which three operating factors, i.e., coal flow rate, coal particle diameter, and steam-to-oxygen ratio, were systematically varied to understand their effect on the syngas composition. Bayesian UQ analysis was performed on the numerical results. As part of the Bayesian UQ analysis, a global sensitivity analysis was performed based on the simulation results, which shows
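The first-order (Sobol) sensitivity index behind statements like "the steam-to-oxygen ratio has the most influence" can be sketched as a double-loop Monte Carlo estimate of Var(E[Y|X_i])/Var(Y). The linear stand-in for the gasifier response below is invented so that the steam-to-oxygen ratio dominates, mirroring the reported finding; the real study evaluated MFiX simulations instead:

```python
import random

def model(coal_rate, particle_d, steam_o2):
    """Toy stand-in for the gasifier response (e.g. H2 mole fraction);
    coefficients are invented so steam-to-oxygen dominates."""
    return 0.05 * coal_rate + 0.02 * particle_d + 0.30 * steam_o2

def variance(v):
    m = sum(v) / len(v)
    return sum((u - m) ** 2 for u in v) / len(v)

def first_order_sobol(f, which, n_outer, n_inner, rng):
    """Double-loop Monte Carlo estimate of S_i = Var(E[Y|X_i]) / Var(Y),
    with all three inputs independent U(0, 1)."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xi = rng.random()                      # fix factor `which` at xi
        ys = []
        for _ in range(n_inner):
            x = [rng.random() for _ in range(3)]
            x[which] = xi
            ys.append(f(*x))
        cond_means.append(sum(ys) / n_inner)   # inner loop: E[Y | X_i = xi]
        all_y.extend(ys)
    return variance(cond_means) / variance(all_y)

rng = random.Random(5)
s_steam = first_order_sobol(model, which=2, n_outer=300, n_inner=300, rng=rng)
s_coal = first_order_sobol(model, which=0, n_outer=300, n_inner=300, rng=rng)
```

In practice the double loop is run on a cheap surrogate of the simulator, which is exactly the role the surrogate models play in the study.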
Laboratory tests of catastrophic disruption of rotating bodies
Morris, A. J. W.; Burchell, M. J.
2017-11-01
The results of catastrophic disruption experiments on static and rotating targets are reported. The experiments used cement spheres of diameter 10 cm as the targets. Impacts were by mm sized stainless steel spheres at speeds of between 1 and 7.75 km s-1. Energy densities (Q) in the targets ranged from 7 to 2613 J kg-1. The experiments covered both the cratering and catastrophic disruption regimes. For static, i.e. non-rotating targets the critical energy density for disruption (Q*, the value of Q when the largest surviving target fragment has a mass equal to one half of the pre-impact target mass) was Q* = 1447 ± 90 J kg-1. For rotating targets (median rotation frequency of 3.44 Hz) we found Q* = 987 ± 349 J kg-1, a reduction of 32% in the mean value. This lower value of Q* for rotating targets was also accompanied by a larger scatter on the data, hence the greater uncertainty. We suggest that in some cases the rotating targets behaved as static targets, i.e. broke up with the same catastrophic disruption threshold, but in other cases the rotation helped the break up causing a lower catastrophic disruption threshold, hence both the lower value of Q* and the larger scatter on the data. The fragment mass distributions after impact were similar in both the static and rotating target experiments with similar slopes.
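Q* as defined here, the energy density at which the largest surviving fragment holds half the target mass, can be recovered from cratering/disruption data by simple interpolation. The (Q, fragment-ratio) pairs below are invented, chosen only to land near the static-target value reported above:

```python
def critical_q(qs, frag_ratios):
    """Interpolate the catastrophic disruption threshold Q*: the energy
    density at which the largest surviving fragment's mass ratio falls
    through one half of the pre-impact target mass."""
    pts = sorted(zip(qs, frag_ratios))
    for (q0, r0), (q1, r1) in zip(pts, pts[1:]):
        if r0 >= 0.5 >= r1:                   # ratio crosses 0.5 here
            return q0 + (r0 - 0.5) * (q1 - q0) / (r0 - r1)
    raise ValueError("0.5 not bracketed by the data")

# Invented (Q in J/kg, largest-fragment mass ratio) pairs spanning the
# cratering and disruption regimes.
qs = [200, 600, 1000, 1400, 1800, 2600]
ratios = [0.98, 0.92, 0.75, 0.52, 0.30, 0.08]
q_star = critical_q(qs, ratios)
```

With scattered data such as the rotating-target runs, a fit (e.g. a power law in Q) would replace the piecewise interpolation, which is where the quoted uncertainties on Q* come from.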
Strategic reasoning and bargaining in catastrophic climate change games
Verendel, Vilhelm; Johansson, Daniel J. A.; Lindgren, Kristian
2016-03-01
Two decades of international negotiations show that agreeing on emission levels for climate change mitigation is a hard challenge. However, if early warning signals were to show an upcoming tipping point with catastrophic damage, theory and experiments suggest this could simplify collective action to reduce greenhouse gas emissions. At the actual threshold, no country would have a free-ride incentive to increase emissions over the tipping point, but it remains for countries to negotiate their emission levels to reach these agreements. We model agents bargaining over emission levels using strategic reasoning to predict the emission bids of others, and ask how this affects the possibility of reaching agreements that avoid catastrophic damage. It is known that policy elites often use a higher degree of strategic reasoning, and in our model this increases the risk of climate catastrophe. Moreover, some forms of higher strategic reasoning make agreements to reduce greenhouse gases unstable. We use empirically informed levels of strategic reasoning when simulating the model.
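A deliberately minimal level-k sketch of the instability described here, not the paper's actual game: level-0 agents bid their desired emissions, and a level-k agent best-responds to a world of level-(k-1) agents. With these invented numbers, uniform level-1 reasoning stays under the threshold while uniform level-2 reasoning overshoots it, illustrating how higher-order reasoning can destabilize agreements:

```python
def level_k_bid(k, desired, threshold, n):
    """Emission bid of a level-k reasoner.  Level 0 bids its desired
    (business-as-usual) emissions; level k best-responds assuming every
    other agent reasons at level k-1, claiming whatever room it believes
    remains under the catastrophe threshold."""
    if k == 0:
        return desired
    others = level_k_bid(k - 1, desired, threshold, n)
    room = threshold - (n - 1) * others
    return max(0.0, min(desired, room))

# Invented numbers: 5 countries, threshold 100, each wanting to emit 30.
n, threshold, desired = 5, 100.0, 30.0
total_level1 = n * level_k_bid(1, desired, threshold, n)   # everyone level-1
total_level2 = n * level_k_bid(2, desired, threshold, n)   # everyone level-2
catastrophe_level1 = total_level1 > threshold
catastrophe_level2 = total_level2 > threshold
```

Level-1 agents each expect the others to exhaust the budget and so bid nothing; level-2 agents expect that restraint and all claim the full residual, pushing total emissions past the tipping point — the miscoordination effect the abstract points to.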
Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J
2017-06-01
In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation; stochastic characteristic of nutrient loading can be investigated which provides the inputs for the decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries and the associated system risk through incorporating the concept of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results can not only facilitate identification of optimal effluent-trading schemes, but also gain insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that decision maker's preference towards risk would affect decision alternatives on trading scheme as well as system benefit. Compared with the conventional optimization methods, it is proved that BESMA is advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties existing in nutrient transport behaviors to improve the accuracy in water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision
Directory of Open Access Journals (Sweden)
О. A. Tereshchenko
2017-06-01
Full Text Available Purpose. The article highlights the development of the methodological basis for simulating the processes of cars accumulation in solving operational planning problems under conditions of initial information uncertainty, for assessing the sustainability of the adopted planning scenario and calculating the associated technological risks. Methodology. The solution of the problem under investigation is based on the use of general scientific approaches, the apparatus of probability theory and the theory of fuzzy sets. To achieve this purpose, the factors influencing the entropy of operational plans are systematized. It is established that when planning the operational work of railway stations, sections and nodes, the most significant factors that cause uncertainty in the initial information are: (a) conditions external to the railway ground in question, expressed by the uncertainty of the timing of cars arrivals; (b) external, hard-to-identify goals of the functioning of other participants in the logistics chain (primarily customers), expressed by the uncertainty of the completion time of operations with the freight cars. These factors are suggested to be taken into account in automated planning through statistical analysis – the establishment and study of the remaining time (prediction errors). As a result, analytical dependencies are proposed for rational representation of the probability density functions of the time residual distribution in the form of point, piecewise-defined and continuous analytic models. The developed models of cars accumulation, the application of which depends on the identified states of the predicted incoming car flow to the accumulation system, are presented below. In addition, the last proposed model is a general case of models of accumulation processes with an arbitrary level of reliability of the initial information for any structure of the incoming flow of cars. In conclusion, a technique for estimating the results of
International Nuclear Information System (INIS)
Chungcharoen, E.
1997-01-01
A model was developed to help determine the future development of hydrocarbon reserves. The uncertainties of geological parameters were incorporated into the model in an effort to provide an assessment of the distribution of total hydrocarbon discoveries that are expected to be recovered as a result of exploration activity. The economic parameters were also incorporated into the model in an effort to determine the economic worth of multiple-wells exploration activity. The first part of this study included the geological parameters in the initial field size distribution and the number of fields distribution. Dry hole data was also considered to reflect the exploration risk. The distribution of total hydrocarbon discoveries for a selected number of exploratory wells was determined. The second part of the study included the economic parameters such as the price of oil and gas and the cost of exploration, development and production. The distribution of the number of discoveries and the distribution of total hydrocarbon discoveries was compared to produce a probability distribution of the net present value of a proposed exploration program. The offshore Nova Scotia Shelf basin was chosen for testing the methodology. Several scenarios involving changes in economic parameters were shown. This methodology could help in determining future development programs for hydrocarbon reserves. The methodology can also help governments in policy making decisions regarding taxes and royalty regimes for exploration programs
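The two-stage structure described above (geological Monte Carlo plus economics) can be illustrated with a toy simulation; every parameter below is a hypothetical stand-in, not a value from the Nova Scotia Shelf study:

```python
import numpy as np

rng = np.random.default_rng(42)

def npv_distribution(n_wells=20, p_success=0.25, n_sims=10000):
    """Monte Carlo NPV of an exploration program (illustrative parameters)."""
    price = 60.0        # $/bbl, assumed oil price
    well_cost = 30e6    # $ per exploratory well (dry or not)
    dev_cost = 8.0      # $/bbl development + production cost
    discount = 0.9      # single lumped discount factor (toy simplification)
    # number of discoveries among n_wells, reflecting dry-hole risk
    hits = rng.binomial(n_wells, p_success, size=n_sims)
    npvs = np.empty(n_sims)
    for i, k in enumerate(hits):
        # field sizes drawn from an assumed lognormal field-size distribution
        sizes = rng.lognormal(mean=15.0, sigma=1.2, size=k)  # bbl per field
        revenue = sizes.sum() * (price - dev_cost) * discount
        npvs[i] = revenue - n_wells * well_cost
    return npvs

npvs = npv_distribution()
print((npvs > 0).mean())   # probability that the program is economic
```

Re-running with different price or cost assumptions reproduces the kind of scenario comparison the abstract describes.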
Reactor accidents and nuclear catastrophes
International Nuclear Information System (INIS)
Kirchhoff, R.; Linde, H.J.
1979-01-01
Assuming some preliminary knowledge of the fundamentals of atomic physics, the book describes the effects of ionizing radiation on the human organism. In order to assess the potential hazards of reactor accidents and the extent of a nuclear catastrophe, the technology of power generation in nuclear power stations is presented together with its potential dangers, as well as the physical and medical processes occurring during a nuclear weapons explosion. The special medical aspects presented range from first aid in the case of a catastrophe and the acute radiation syndrome to the treatment of burns and the therapy of late radiolesions. Finally, it is confirmed that the treatment of radiation-injured persons does not give rise to basically new medical problems. (orig./HP) [de
Takagawa, T.
2016-12-01
An ensemble forecasting scheme for tsunami inundation is presented. The scheme consists of three elemental methods. The first is a hierarchical Bayesian inversion using Akaike's Bayesian Information Criterion (ABIC). The second is Monte Carlo sampling from a probability density function of a multidimensional normal distribution. The third is ensemble analysis of tsunami inundation simulations with multiple tsunami sources. Simulation-based validation of the model was conducted. A tsunami scenario of an M9.1 Nankai earthquake was chosen as the target of validation. Tsunami inundation around Nagoya Port was estimated by using synthetic tsunami waveforms at offshore GPS buoys. The error of estimation of the tsunami inundation area was about 10% even when we used only ten minutes of observation data. The estimation accuracy of waveforms on/off land and the spatial distribution of maximum tsunami inundation depth are demonstrated.
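The second element, Monte Carlo sampling from a multidimensional normal density, might look as follows; the source-parameter mean vector and covariance matrix are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# posterior mean and covariance of (toy) fault-slip parameters, of the
# kind a Bayesian waveform inversion would return
mean = np.array([2.0, 3.5, 1.2])
cov = np.array([[0.10, 0.02, 0.00],
                [0.02, 0.20, 0.01],
                [0.00, 0.01, 0.05]])

# draw an ensemble of source models; each member would drive one
# tsunami inundation simulation in the ensemble analysis
ensemble = rng.multivariate_normal(mean, cov, size=500)

print(ensemble.mean(axis=0))   # close to the posterior mean
```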
International Nuclear Information System (INIS)
Mavrotas, George; Florios, Kostas; Vlachou, Dimitra
2010-01-01
For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level, aiming at cost minimization subject to specific technological, political and demand satisfaction constraints. The liberalization of the energy market, along with ongoing technical progress, increased the level of competition and forced energy consumers, even at the unit level, to make their choices among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector, giving special emphasis to model reduction and to the uncertainty of the economic parameters. In the given case study, the energy rehabilitation of a hospital in Athens is examined, and the installation of a cogeneration, absorption and compression unit is considered for the supply of the electricity, heating and cooling load. The basic innovation of the given energy model lies in the uncertainty modelling through the combined use of Mathematical Programming (namely, Mixed Integer Linear Programming, MILP) and Monte Carlo simulation, which permits risk management for the most volatile parameters of the objective function, such as the fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate data compression of the load data is also addressed.
Izadi, Arman; Kimiagari, Ali Mohammad
2014-05-01
Distribution network design as a strategic decision has a long-term effect on tactical and operational supply chain management. In this research, the location-allocation problem is studied under demand uncertainty. The purposes of this study were to specify the optimal number and location of distribution centers and to determine the allocation of customer demands to distribution centers. The main feature of this research is solving the model with an unknown demand function, which is suitable for real-world problems. To consider the uncertainty, a set of possible scenarios for customer demands is created based on Monte Carlo simulation. The coefficient of variation of costs is used as a measure of risk, and the most stable structure for the firm's distribution network is defined based on the concept of robust optimization. The best structure is identified using genetic algorithms, with a 14 % reduction in total supply chain costs as the outcome. Moreover, it imposes the least cost variation created by fluctuation in customer demands (such as epidemic disease outbreaks in some areas of the country) on the logistical system. It is noteworthy that this research was done in one of the largest pharmaceutical distribution firms in Iran.
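The robustness criterion, choosing the structure whose costs have the smallest coefficient of variation across demand scenarios, can be sketched as follows; the cost figures for the two candidate structures are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

def coefficient_of_variation(costs):
    costs = np.asarray(costs, dtype=float)
    return costs.std() / costs.mean()

# total costs of two candidate network structures evaluated over the
# same 1000 Monte Carlo demand scenarios (illustrative numbers)
costs_a = rng.normal(100.0, 20.0, 1000)   # cheaper on average, volatile
costs_b = rng.normal(110.0, 5.0, 1000)    # dearer on average, stable

cv_a = coefficient_of_variation(costs_a)
cv_b = coefficient_of_variation(costs_b)
robust_choice = "B" if cv_b < cv_a else "A"
print(robust_choice)   # the structure with the most stable costs
```

Under this criterion the dearer but stable structure B wins, which mirrors the trade-off the abstract describes between expected cost and cost variation.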
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward-facing step.
Simulating future uncertainty to guide the selection of survey designs for long-term monitoring
Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.
2012-01-01
A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (
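The power requirement described above can also be checked by simulation before a design is adopted: generate many synthetic monitoring records with a known trend and count how often a simple trend test detects it. This is only a sketch with a linear-regression trend test and invented effect sizes, not the NPS procedure itself:

```python
import numpy as np

rng = np.random.default_rng(1)

def trend_power(slope, sigma, n_sims=2000):
    """Fraction of simulated 10-year monitoring records in which a linear
    trend of the given size is detected by least-squares regression."""
    t = np.arange(10.0)
    detections = 0
    for _ in range(n_sims):
        y = slope * t + rng.normal(0.0, sigma, 10)
        b, a = np.polyfit(t, y, 1)          # fitted slope and intercept
        resid = y - (b * t + a)
        se = np.sqrt(resid.var(ddof=2) / ((t - t.mean()) ** 2).sum())
        if abs(b / se) > 1.86:              # two-sided t critical value, df = 8, alpha = 0.10
            detections += 1
    return detections / n_sims

print(trend_power(slope=1.0, sigma=1.0))    # high power for a strong trend
print(trend_power(slope=0.0, sigma=1.0))    # ~alpha under no trend
```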
Uncertainty estimates of a GRACE inversion modelling technique over Greenland using a simulation
Bonin, Jennifer; Chambers, Don
2013-07-01
The low spatial resolution of GRACE causes leakage, where signals in one location spread out into nearby regions. Because of this leakage, using simple techniques such as basin averages may result in an incorrect estimate of the true mass change in a region. A fairly simple least squares inversion technique can be used to more specifically localize mass changes into a pre-determined set of basins of uniform internal mass distribution. However, the accuracy of these higher resolution basin mass amplitudes has not been determined, nor is it known how the distribution of the chosen basins affects the results. We use a simple `truth' model over Greenland as an example case, to estimate the uncertainties of this inversion method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We determine that an appropriate level of smoothing (300-400 km) and process noise (0.30 cm2 of water) gets the best results. The trends of the Greenland internal basins and Iceland can be reasonably estimated with this method, with average systematic errors of 3.5 cm yr-1 per basin. The largest mass losses found from GRACE RL04 occur in the coastal northwest (-19.9 and -33.0 cm yr-1) and southeast (-24.2 and -27.9 cm yr-1), with small mass gains (+1.4 to +7.7 cm yr-1) found across the northern interior. Acceleration of mass change is measurable at the 95 per cent confidence level in four northwestern basins, but not elsewhere in Greenland. Due to an insufficiently detailed distribution of basins across internal Canada, the trend estimates of Baffin and Ellesmere Islands are expected to be incorrect due to systematic errors caused by the inversion technique.
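A minimal sketch of such a least squares inversion, with a random matrix standing in for the real GRACE smoothing/leakage operator and invented basin trends:

```python
import numpy as np

rng = np.random.default_rng(3)

n_obs, n_basins = 60, 4
# G maps per-basin mass amplitudes to leakage-smoothed observations;
# a random matrix is used here purely as a stand-in operator
G = rng.normal(size=(n_obs, n_basins))
truth = np.array([-20.0, -25.0, 5.0, 3.0])   # cm/yr per basin (illustrative)
obs = G @ truth + rng.normal(0.0, 0.5, n_obs)

# least squares inversion localizes the smoothed signal back into basins
est, *_ = np.linalg.lstsq(G, obs, rcond=None)
print(est)   # close to `truth`
```

As the abstract notes, the quality of `est` in practice depends on how the basins are drawn and on smoothing and noise parameters, which a synthetic "truth" run like this is designed to expose.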
Reyes, J. J.; Adam, J. C.; Tague, C.
2016-12-01
Grasslands play an important role in agricultural production as forage for livestock; they also provide a diverse set of ecosystem services including soil carbon (C) storage. The partitioning of C between above and belowground plant compartments (i.e. allocation) is influenced by both plant characteristics and environmental conditions. The objectives of this study are to 1) develop and evaluate a hybrid C allocation strategy suitable for grasslands, and 2) apply this strategy to examine the importance of various parameters related to biogeochemical cycling, photosynthesis, allocation, and soil water drainage on above and belowground biomass. We include allocation as an important process in quantifying the model parameter uncertainty, which identifies the most influential parameters and what processes may require further refinement. For this, we use the Regional Hydro-ecologic Simulation System, a mechanistic model that simulates coupled water and biogeochemical processes. A Latin hypercube sampling scheme was used to develop parameter sets for calibration and evaluation of allocation strategies, as well as parameter uncertainty analysis. We developed the hybrid allocation strategy to integrate both growth-based and resource-limited allocation mechanisms. When evaluating the new strategy simultaneously for above and belowground biomass, it produced a larger number of less biased parameter sets: 16% more compared to resource-limited and 9% more compared to growth-based. This also demonstrates its flexible application across diverse plant types and environmental conditions. We found that higher parameter importance corresponded to sub- or supra-optimal resource availability (i.e. water, nutrients) and temperature ranges (i.e. too hot or cold). For example, photosynthesis-related parameters were more important at sites warmer than the theoretical optimal growth temperature. Therefore, larger values of parameter importance indicate greater relative sensitivity in
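The Latin hypercube scheme mentioned above can be sketched in a few lines; the parameter ranges are hypothetical placeholders, not the RHESSys calibration ranges:

```python
import numpy as np

rng = np.random.default_rng(11)

def latin_hypercube(n_samples, bounds):
    """One sample per equal-probability stratum in every dimension."""
    d = len(bounds)
    # independent random permutation of the strata for each dimension
    strata = np.column_stack([rng.permutation(n_samples) for _ in range(d)])
    u = (strata + rng.random((n_samples, d))) / n_samples   # jitter within strata
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# e.g. three model parameters with assumed ranges
samples = latin_hypercube(100, [(0.0, 1.0), (200.0, 400.0), (0.01, 0.1)])
```

Each of the 100 equal-width strata of every parameter range receives exactly one sample, which is what gives Latin hypercube designs better space coverage than plain random sampling at the same cost.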
Sreekanth, J.; Moore, Catherine
2018-04-01
The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as from parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that, while the salient small-scale features influencing larger-scale predictions are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.
Catastrophic events and older adults.
Cloyd, Elizabeth; Dyer, Carmel B
2010-12-01
The plight of older adults during catastrophic events is a societal concern. Older persons have an increased prevalence of cognitive disorders, chronic illnesses, and mobility problems that limit their ability to cope. These disorders may result in a lack of mental capacity and the ability to discern when they should evacuate or resolve problems encountered during a catastrophe. Some older persons may have limited transportation options, and many of the elderly survivors are at increased risk for abuse, neglect, and exploitation. Recommendations for future catastrophic events include the development of a federal tracking system for elders and other vulnerable adults, the designation of separate shelter areas for elders and other vulnerable adults, and involvement of gerontological professionals in all aspects of emergency preparedness and care delivery, including training of frontline workers. Preparation through preevent planning that includes region-specific social services, medical and public health resources, volunteers, and facilities for elders and vulnerable adults is critical. Elders need to be protected from abuse and fraud during catastrophic events. A public health triage system for elders and other vulnerable populations in pre- and postdisaster situations is useful, and disaster preparedness is paramount. Communities and members of safety and rescue teams must address ethical issues before an event. When older adults are involved, consideration needs to be given to triage decision making, transporting those who are immobile, the care of older adults who receive palliative care, and the equitable distribution of resources. Nurses are perfectly equipped with the skills, knowledge, and training needed to plan and implement disaster preparedness programs. In keeping with the tradition of Florence Nightingale, nurses can assume several crucial roles in disaster preparedness for older adults. Nurses possess the ability to participate and lead community
Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation
Directory of Open Access Journals (Sweden)
Kamran Shahanaghi
2012-01-01
Full Text Available This article, by leaving aside the pre-determined membership function of a fuzzy set, which is a basic assumption for such a subject, proposes a hybrid technique to select the most economic project among alternative projects under fuzzy interest rate conditions. Net present worth (NPW) is used as the economic indicator. The article challenges the assumption that large sample sizes are available for membership function determination, and shows that some other techniques may have less accuracy. To give a robust solution, bootstrapping and fuzzy simulation are suggested, and a numerical example is given and analyzed.
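The bootstrap idea, resampling a scarce sample of observed rates instead of positing a membership function, can be sketched as follows; the cash flows and observed rates are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

observed_rates = np.array([0.08, 0.10, 0.09, 0.12, 0.11])  # small sample
cash_flows = [-1000.0, 400.0, 400.0, 400.0]                # years 0..3

def npw(rate, flows):
    """Net present worth of a cash-flow stream at a fixed interest rate."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(flows))

# bootstrap the scarce rate data: resample with replacement, average,
# and evaluate NPW for each resampled rate estimate
boot = [npw(rng.choice(observed_rates, 5).mean(), cash_flows)
        for _ in range(2000)]
lo, hi = np.percentile(boot, [5, 95])
print(lo, hi)   # empirical 90% interval for the project's NPW
```

Comparing such empirical intervals across alternative projects gives a ranking that does not depend on an assumed membership function.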
Coping with ecological catastrophe: crossing major thresholds
Directory of Open Access Journals (Sweden)
John Cairns, Jr.
2004-08-01
Full Text Available The combination of human population growth and resource depletion makes catastrophes highly probable. No long-term solutions to the problems of humankind will be discovered unless sustainable use of the planet is achieved. The essential first step toward this goal is avoiding or coping with global catastrophes that result from crossing major ecological thresholds. Decreasing the number of global catastrophes will reduce the risks associated with destabilizing ecological systems, which could, in turn, destabilize societal systems. Many catastrophes will be local, regional, or national, but even these upheavals will have global consequences. Catastrophes will be the result of unsustainable practices and the misuse of technology. However, avoiding ecological catastrophes will depend on the development of eco-ethics, which is subject to progressive maturation, comments, and criticism. Some illustrative catastrophes have been selected to display some preliminary issues of eco-ethics.
Golias, Mihalis M.
2011-01-01
Berth scheduling is a critical function at marine container terminals and determining the best berth schedule depends on several factors including the type and function of the port, size of the port, location, nearby competition, and type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objective to maximize customer satisfaction and reliability of the berth schedule under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithms based heuristic and a simulation post-Pareto analysis is proposed as the solution approach to the resulting problem. Based on a number of experiments it is concluded that the proposed berth scheduling policy outperforms the berth scheduling policy where reliability is not considered.
Process simulation and uncertainty analysis of plasma arc mixed waste treatment
International Nuclear Information System (INIS)
Ferrada, J.J.; Welch, T.D.
1994-01-01
Innovative mixed waste treatment subsystems have been analyzed for performance, risk, and life-cycle cost as part of the U.S. Department of Energy's (DOE) Mixed Waste Integrated Program (MWIP) treatment alternatives development and evaluation process. This paper concerns the analysis of mixed waste treatment system performance. Performance systems analysis includes approximate material and energy balances and assessments of operability, effectiveness, and reliability. Preliminary material and energy balances of innovative processes have been analyzed using FLOW, an object-oriented process simulator for waste management systems under development at Oak Ridge National Laboratory. The preliminary models developed for FLOW provide rough order-of-magnitude calculations useful for sensitivity analysis. The insight gained from early, approximate modeling of these technologies will ease the transition to more sophisticated simulators as adequate performance and property data become available. Such models are being developed in ASPEN by DOE's Mixed Waste Treatment Project (MWTP) for baseline and alternative flow sheets based on commercial technologies. One alternative to the baseline developed by the MWIP support groups is plasma arc treatment. This process offers a noticeable reduction in the number of process operations as compared to the baseline process, because a plasma arc melter is capable of accepting a wide variety of waste streams as direct inputs (without sorting or preprocessing). This innovative process for treating mixed waste replaces several units from the baseline process and, thus, promises an economic advantage. The performance of the plasma arc furnace will directly affect the quality of the waste form and the requirements of the off-gas treatment units. The ultimate objective of MWIP is to reduce the amount of final waste produced, the cost, and the environmental impact
Propagation of uncertainty by Monte Carlo simulations in case of basic geodetic computations
Directory of Open Access Journals (Sweden)
Wyszkowska Patrycja
2017-12-01
Full Text Available The determination of the accuracy of functions of measured or adjusted values may be a problem in geodetic computations. The general law of covariance propagation or in case of the uncorrelated observations the propagation of variance (or the Gaussian formula) are commonly used for that purpose. That approach is theoretically justified for the linear functions. In case of the non-linear functions, the first-order Taylor series expansion is usually used but that solution is affected by the expansion error. The aim of the study is to determine the applicability of the general variance propagation law in case of the non-linear functions used in basic geodetic computations. The paper presents errors which are a result of negligence of the higher-order expressions and it determines the range of such simplification. The basis of that analysis is the comparison of the results obtained by the law of propagation of variance and the probabilistic approach, namely Monte Carlo simulations. Both methods are used to determine the accuracy of the following geodetic computations: the Cartesian coordinates of unknown point in the three-point resection problem, azimuths and distances of the Cartesian coordinates, height differences in the trigonometric and the geometric levelling. These simulations and the analysis of the results confirm the possibility of applying the general law of variance propagation in basic geodetic computations even if the functions are non-linear. The only condition is the accuracy of observations, which cannot be too low. Generally, this is not a problem with using present geodetic instruments.
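The comparison between first-order (Gaussian) propagation and Monte Carlo simulation can be reproduced for a simple non-linear function, here a distance computed from two measured coordinates (the coordinate values and their standard deviation are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

# two measured coordinates (x, y), each with sigma = 0.01 m
x, y, sigma = 3.0, 4.0, 0.01

# first-order propagation for d = sqrt(x^2 + y^2):
# the partials x/d and y/d are direction cosines, so sigma_lin = sigma here
d = np.hypot(x, y)
sigma_lin = np.sqrt((x / d) ** 2 + (y / d) ** 2) * sigma

# Monte Carlo propagation of the same uncertainty
xs = rng.normal(x, sigma, 200000)
ys = rng.normal(y, sigma, 200000)
sigma_mc = np.hypot(xs, ys).std()

print(sigma_lin, sigma_mc)   # nearly identical for small sigma
```

The two estimates agree closely because at this observation accuracy the neglected higher-order Taylor terms are negligible, which is exactly the paper's conclusion.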
3-D simulations of M9 earthquakes on the Cascadia Megathrust: Key parameters and uncertainty
Wirth, Erin; Frankel, Arthur; Vidale, John; Marafi, Nasser A.; Stephenson, William J.
2017-01-01
Geologic and historical records indicate that the Cascadia subduction zone is capable of generating large, megathrust earthquakes up to magnitude 9. The last great Cascadia earthquake occurred in 1700, and thus there is no direct measure on the intensity of ground shaking or specific rupture parameters from seismic recordings. We use 3-D numerical simulations to generate broadband (0-10 Hz) synthetic seismograms for 50 M9 rupture scenarios on the Cascadia megathrust. Slip consists of multiple high-stress drop subevents (~M8) with short rise times on the deeper portion of the fault, superimposed on a background slip distribution with longer rise times. We find a >4x variation in the intensity of ground shaking depending upon several key parameters, including the down-dip limit of rupture, the slip distribution and location of strong-motion-generating subevents, and the hypocenter location. We find that extending the down-dip limit of rupture to the top of the non-volcanic tremor zone results in a ~2-3x increase in peak ground acceleration for the inland city of Seattle, Washington, compared to a completely offshore rupture. However, our simulations show that allowing the rupture to extend to the up-dip limit of tremor (i.e., the deepest rupture extent in the National Seismic Hazard Maps), even when tapering the slip to zero at the down-dip edge, results in multiple areas of coseismic coastal uplift. This is inconsistent with coastal geologic evidence (e.g., buried soils, submerged forests), which suggests predominantly coastal subsidence for the 1700 earthquake and previous events. Defining the down-dip limit of rupture as the 1 cm/yr locking contour (i.e., mostly offshore) results in primarily coseismic subsidence at coastal sites. We also find that the presence of deep subevents can produce along-strike variations in subsidence and ground shaking along the coast. Our results demonstrate the wide range of possible ground motions from an M9 megathrust earthquake in
Soleymani Shishvan, M.; Benndorf, J.
Continuous mining systems containing multiple excavators producing multiple products of raw materials are highly complex, exhibiting strong interdependency between constituents. Furthermore, random variables govern the system, which causes uncertainty in the supply of raw materials: uncertainty in
Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.
2017-12-01
Atmospheric rivers (ARs) can be the difference between CA facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture which transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have been proven to be effective in identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare three different CNN architectures, tuned with different hyperparameters and training schemes. We compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with regular images (i.e. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital in future scientific CNNs, which likely will not have access to a large labelled training dataset. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.
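The core operation all the compared architectures share, a learned filter convolved over a gridded field, can be sketched in plain NumPy; the hand-crafted "line detector" below merely stands in for a filter a trained CNN would learn:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation, the core operation of a CNN layer."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

# synthetic field with a long, narrow "moisture band" along one row
field = np.zeros((16, 16))
field[7, :] = 1.0

# a horizontal-line filter of the kind a trained CNN might learn
kernel = np.array([[-1.0, -1.0, -1.0],
                   [ 2.0,  2.0,  2.0],
                   [-1.0, -1.0, -1.0]])

response = np.maximum(conv2d(field, kernel), 0.0)   # ReLU activation
print(response.max())   # strongest activation lies along the band
```

A real AR classifier stacks many such filters, learned from labelled CAM5 output rather than hand-crafted, with pooling and fully connected layers on top.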
Energy Technology Data Exchange (ETDEWEB)
Lawson, Matthew; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik; Frank, Jonathan H.
2010-09-01
Recent advances in high frame rate complementary metal-oxide-semiconductor (CMOS) cameras coupled with high repetition rate lasers have enabled laser-based imaging measurements of the temporal evolution of turbulent reacting flows. This measurement capability provides new opportunities for understanding the dynamics of turbulence-chemistry interactions, which is necessary for developing predictive simulations of turbulent combustion. However, quantitative imaging measurements using high frame rate CMOS cameras require careful characterization of their noise, non-linear response, and variations in this response from pixel to pixel. We develop a noise model and calibration tools to mitigate these problems and to enable quantitative use of CMOS cameras. We have demonstrated proof of principle for image de-noising using both wavelet methods and Bayesian inference. The results offer new approaches for quantitative interpretation of imaging measurements from noisy data acquired with non-linear detectors. These approaches are potentially useful in many areas of scientific research that rely on quantitative imaging measurements.
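As a rough illustration of the wavelet route to de-noising (not the authors' Bayesian pipeline, and with invented signal and noise levels), a one-level Haar transform with soft thresholding of the detail coefficients looks like this:

```python
import numpy as np

rng = np.random.default_rng(5)

def haar_denoise(signal, thresh):
    """One-level Haar wavelet de-noising by soft thresholding of the detail
    coefficients -- a minimal sketch, not the paper's calibration pipeline."""
    a = (signal[0::2] + signal[1::2]) / np.sqrt(2)       # approximation coefficients
    d = (signal[0::2] - signal[1::2]) / np.sqrt(2)       # detail coefficients (mostly noise)
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0)   # soft threshold
    out = np.empty_like(signal)
    out[0::2] = (a + d) / np.sqrt(2)                     # inverse transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

clean = np.repeat([1.0, 4.0, 2.0, 5.0], 16)              # piecewise-flat "image row"
noisy = clean + rng.normal(0, 0.2, clean.size)           # assumed Gaussian sensor noise
denoised = haar_denoise(noisy, thresh=0.5)
print(np.abs(denoised - clean).mean() < np.abs(noisy - clean).mean())
```

Multi-level transforms and noise models tuned to the measured per-pixel response would replace the fixed threshold in practice.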
Guo, Changning; Doub, William H; Kauffman, John F
2010-08-01
Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard that initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
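The Monte Carlo approach can be sketched with an assumed one-factor linear model and illustrative noise levels (not the nasal-spray DOE models of the study): perturb both the input settings and the responses, refit the model many times, and read the coefficient uncertainty off the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-factor DOE: response y = b0 + b1*x, with noise in both the
# input settings and the response measurements (sigmas assumed for illustration).
b0_true, b1_true = 2.0, 0.5
x_nom = np.array([-1., -0.5, 0., 0.5, 1.])    # nominal design points
sx, sy = 0.02, 0.05                            # assumed input / response std devs

coefs = []
for _ in range(2000):                          # Monte Carlo replicates of the experiment
    x = x_nom + rng.normal(0, sx, x_nom.size)  # realized (uncertain) input settings
    y = b0_true + b1_true * x_nom + rng.normal(0, sy, x_nom.size)
    coefs.append(np.polyfit(x, y, 1))          # refit the DOE model each replicate
coefs = np.array(coefs)

b1_std = coefs[:, 0].std()                     # MC estimate of slope uncertainty
print(round(coefs[:, 0].mean(), 2), round(b1_std, 3))
```

Comparing `b1_std` against the standard error reported by an ordinary regression fit is exactly the kind of check the article describes.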
Directory of Open Access Journals (Sweden)
F. Hossain
2004-01-01
This study presents a simple and efficient scheme for Bayesian estimation of uncertainty in soil moisture simulation by a Land Surface Model (LSM). The scheme is assessed within a Monte Carlo (MC) simulation framework based on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. A primary limitation of the GLUE method is the prohibitive computational burden imposed by uniform random sampling of the model's parameter distributions. Sampling is improved in the proposed scheme by stochastic modeling of the parameters' response surface that recognizes the non-linear deterministic behavior between soil moisture and land surface parameters. Uncertainty in soil moisture simulation (model output) is approximated through a Hermite polynomial chaos expansion of normal random variables that represent the model's parameter (model input) uncertainty. The unknown coefficients of the polynomial are calculated using a limited number of model simulation runs. The calibrated polynomial is then used as a fast-running proxy for the slower-running LSM to predict the degree of representativeness of a randomly sampled model parameter set. An evaluation of the scheme's sampling efficiency is made through comparison with fully random MC sampling (the norm for GLUE) and the nearest-neighborhood sampling technique. The scheme reduced the computational burden of random MC sampling for GLUE by 10%-70%, and was about 10% more efficient than the nearest-neighborhood sampling method in predicting a sampled parameter set's degree of representativeness. GLUE based on the proposed sampling scheme did not alter the essential features of the uncertainty structure in soil moisture simulation. The scheme can potentially make GLUE uncertainty estimation for any LSM more efficient, as it does not impose any additional structural or distributional assumptions.
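The surrogate idea can be sketched as follows. The "model" below is a stand-in quadratic, not an LSM, and the sample sizes are illustrative; the point is that a handful of model runs fixes the Hermite polynomial chaos coefficients, after which the polynomial replaces the slow model:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(xi):
    # Stand-in for the slow-running LSM: a nonlinear map from a standard-normal
    # parameter to "soil moisture" (purely illustrative, not a real LSM).
    return 0.3 + 0.1 * xi + 0.05 * xi**2

def hermite_design(xi):
    # Probabilists' Hermite polynomials He_0..He_3 evaluated at the samples.
    return np.column_stack([np.ones_like(xi), xi, xi**2 - 1, xi**3 - 3 * xi])

# A limited number of model runs fixes the unknown PCE coefficients ...
xi_train = rng.standard_normal(20)
A = hermite_design(xi_train)
coef, *_ = np.linalg.lstsq(A, model(xi_train), rcond=None)

# ... after which the calibrated polynomial is a fast-running proxy.
xi_test = rng.standard_normal(1000)
proxy = hermite_design(xi_test) @ coef
err = np.abs(proxy - model(xi_test)).max()
print(err < 1e-8)   # the quadratic stand-in is captured exactly by a 3rd-order PCE
```

In the GLUE setting, the cheap `proxy` screens candidate parameter sets before any expensive LSM run is committed.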
The Climate Catastrophe as Blockbuster
DEFF Research Database (Denmark)
Eskjær, Mikkel Fugl
2013-01-01
Modern disaster films constitute a specific cultural form that speaks to the anxieties of the “risk society.” This essay looks at how risks such as climate change are presented and constructed in popular culture. It regards blockbuster representations as part of a wider discourse of “catastrophism......” within the realm of public climate change communication. For that reason, the essay centers on the interplay between news media and entertainment. It argues that blockbuster disaster films represent an inversion of traditional risk and disaster news....
A catastrophe in quantum mechanics
International Nuclear Information System (INIS)
Ignatovich, V.K.
2004-01-01
The standard scattering theory (SST) in nonrelativistic quantum mechanics (QM) is analyzed. Self-contradictions of SST are deconstructed. A direct way to calculate scattering probability without introduction of a finite volume is discussed. The substantiation of SST in textbooks with the help of wave packets is shown to be incomplete. A complete theory of wave packet scattering on a fixed center is presented, and its similarity to plane wave scattering is demonstrated. Neutron scattering on a monatomic gas is investigated, and several problems are pointed out. A catastrophic ambiguity of the cross section is revealed, and a way to resolve this ambiguity is discussed.
Energy Technology Data Exchange (ETDEWEB)
Deng, Hailin [Los Alamos National Laboratory; Dai, Zhenxue [Los Alamos National Laboratory; Jiao, Zunsheng [Wyoming State Geological Survey; Stauffer, Philip H. [Los Alamos National Laboratory; Surdam, Ronald C. [Wyoming State Geological Survey
2011-01-01
Many geological, geochemical, geomechanical and hydrogeological factors control CO2 storage in the subsurface. Among them, heterogeneity in the saline aquifer can seriously influence the design of injection wells, the CO2 injection rate, CO2 plume migration, storage capacity, and potential leakage and risk assessment. This study applies indicator geostatistics, transition probability and a Markov chain model at the Rock Springs Uplift, Wyoming, generating facies-based heterogeneous fields for porosity and permeability in the target saline aquifer (Pennsylvanian Weber sandstone) and surrounding rocks (Phosphoria, Madison and the cap-rock Chugwater). The multiphase flow simulator FEHM is then used to model injection of CO2 into the target saline aquifer with field-scale heterogeneity. The results reveal that (1) CO2 injection rates in different injection wells change significantly with local permeability distributions; (2) brine production rates in different pumping wells are also significantly impacted by the spatial heterogeneity in permeability; (3) liquid pressure evolution during and after CO2 injection in the saline aquifer varies greatly between realizations of the random permeability fields, with potentially important effects on hydraulic fracturing of the reservoir rock, reactivation of pre-existing faults and the integrity of the cap-rock; (4) the CO2 storage capacity estimate for the Rock Springs Uplift is 6614 ± 256 Mt at the 95% confidence interval, about 36% of a previous estimate based on a homogeneous and isotropic storage formation; (5) density profiles show that the density of injected CO2 below 3 km is close to that of the ambient brine for the given geothermal gradient and brine concentration, which indicates the CO2 plume can sink to depth before reaching thermal equilibrium with the brine. Finally, we present uncertainty analysis of CO2 leakage into overlying formations due to heterogeneity in both the target saline
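The transition-probability Markov chain idea behind such facies-based fields can be sketched in one dimension. The facies names and transition matrix below are illustrative, not the Rock Springs Uplift statistics:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical vertical facies sequence governed by a transition probability
# matrix (values invented for illustration).
facies = ["sandstone", "shale"]
P = np.array([[0.8, 0.2],      # P(next facies | current = sandstone)
              [0.3, 0.7]])     # P(next facies | current = shale)

def simulate_chain(n, start=0):
    """Draw an n-cell facies column by stepping the Markov chain."""
    states = [start]
    for _ in range(n - 1):
        states.append(rng.choice(2, p=P[states[-1]]))
    return np.array(states)

chain = simulate_chain(5000)
# Empirical transition frequencies should approach the prescribed matrix.
n00 = np.sum((chain[:-1] == 0) & (chain[1:] == 0))
n0 = np.sum(chain[:-1] == 0)
print(round(n00 / n0, 2))
```

Porosity and permeability values would then be drawn per facies, giving the heterogeneous fields the flow simulator consumes.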
Hogrefe, Christian; Isukapalli, Sastry S; Tang, Xiaogang; Georgopoulos, Panos G; He, Shan; Zalewsky, Eric E; Hao, Winston; Ku, Jia-Yeong; Key, Tonalee; Sistla, Gopal
2011-01-01
The role of emissions of volatile organic compounds and nitric oxide from biogenic sources is becoming increasingly important in regulatory air quality modeling as levels of anthropogenic emissions continue to decrease and stricter health-based air quality standards are being adopted. However, considerable uncertainties still exist in the current estimation methodologies for biogenic emissions. The impact of these uncertainties on ozone and fine particulate matter (PM2.5) levels for the eastern United States was studied, focusing on biogenic emissions estimates from two commonly used biogenic emission models, the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and the Biogenic Emissions Inventory System (BEIS). Photochemical grid modeling simulations were performed for two scenarios: one reflecting present day conditions and the other reflecting a hypothetical future year with reductions in emissions of anthropogenic oxides of nitrogen (NOx). For ozone, the use of MEGAN emissions resulted in a higher ozone response to hypothetical anthropogenic NOx emission reductions compared with BEIS. Applying the current U.S. Environmental Protection Agency guidance on regulatory air quality modeling in conjunction with typical maximum ozone concentrations, the differences in estimated future year ozone design values (DVF) stemming from differences in biogenic emissions estimates were on the order of 4 parts per billion (ppb), corresponding to approximately 5% of the daily maximum 8-hr ozone National Ambient Air Quality Standard (NAAQS) of 75 ppb. For PM2.5, the differences were 0.1-0.25 µg/m³ in the summer total organic mass component of DVFs, corresponding to approximately 1-2% of the value of the annual PM2.5 NAAQS of 15 µg/m³. Spatial variations in the ozone and PM2.5 differences also reveal that the impacts of different biogenic emission estimates on ozone and PM2.5 levels are dependent on ambient levels of anthropogenic emissions.
DEFF Research Database (Denmark)
Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik
2012-01-01
While there seems to be consensus that hydrological model outputs should be accompanied by an uncertainty estimate, the appropriate method for uncertainty estimation is not agreed upon, and a debate is ongoing between advocates of formal statistical methods, who consider errors as stochastic...... and GLUE advocates, who consider errors as epistemic, arguing that the basis of formal statistical approaches, which requires the residuals to be stationary and conform to a statistical distribution, is unrealistic. In this paper we take a formal frequentist approach to parameter estimation and uncertainty...... necessary but the statistical assumptions were nevertheless not 100% justified. The residual analysis showed that significant autocorrelation was present for all simulation models. We believe users of formal approaches to uncertainty evaluation within hydrology and within environmental modelling in general
Gravothermal catastrophe of finite amplitude
Energy Technology Data Exchange (ETDEWEB)
Hachisu, I; Sugimoto, D [Tokyo Univ. (Japan). Coll. of General Education; Nakada, Y; Nomoto, K
1978-08-01
Development of the gravothermal catastrophe is followed numerically for a self-gravitating gas system enclosed by an adiabatic wall, which is isothermal in the initial state. It is found that the final fate of the catastrophe unfolds in one of two ways, depending on the initial perturbation. When the initial perturbation produces a temperature distribution decreasing outward, contraction proceeds in the central region and the central density increases without limit as heat flows outward. When the initial temperature distribution is increasing outward, on the other hand, the central region expands as heat flows into it. The density contrast is then reduced and finally the system reaches another isothermal configuration with the same energy but with a lower density contrast and a higher entropy. This final configuration is gravothermally stable and may be called a thermal system. In the former case of unlimited contraction, the final density profile is determined essentially by the density and temperature dependence of the heat conductivity. For a system under an inverse-square-law force, the final density distribution is well approximated by a power law, so that the mass contained in the condensed core is relatively small. The possibility of black hole formation in stellar systems is also discussed.
Gravothermal catastrophe of finite amplitude
International Nuclear Information System (INIS)
Hachisu, Izumi; Sugimoto, Daiichiro; Nakada, Yoshikazu; Nomoto, Ken-ichi.
1978-01-01
Development of the gravothermal catastrophe is followed numerically for a self-gravitating gas system enclosed by an adiabatic wall, which is isothermal in the initial state. It is found that the final fate of the catastrophe unfolds in one of two ways, depending on the initial perturbation. When the initial perturbation produces a temperature distribution decreasing outward, contraction proceeds in the central region and the central density increases without limit as heat flows outward. When the initial temperature distribution is increasing outward, on the other hand, the central region expands as heat flows into it. The density contrast is then reduced and finally the system reaches another isothermal configuration with the same energy but with a lower density contrast and a higher entropy. This final configuration is gravothermally stable and may be called a thermal system. In the former case of unlimited contraction, the final density profile is determined essentially by the density and temperature dependence of the heat conductivity. For a system under an inverse-square-law force, the final density distribution is well approximated by a power law, so that the mass contained in the condensed core is relatively small. The possibility of black hole formation in stellar systems is also discussed. (author)
How are the catastrophical risks quantifiable
International Nuclear Information System (INIS)
Chakraborty, S.
1985-01-01
For the assessment and evaluation of industrial risks, the question must be asked: how are catastrophic risks quantifiable? Typical real catastrophic risks and risk assessments based on modelling assumptions have been set against each other in order to put the risks into proper perspective. However, society is risk-averse when a large-scale industrial facility carries the catastrophic potential for severe accidents, even though the probability of occurrence is extremely low. (orig.) [de]
Theory of a slow-light catastrophe
International Nuclear Information System (INIS)
Leonhardt, Ulf
2002-01-01
In diffraction catastrophes such as the rainbow, the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole, the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. This paper describes the theory behind a recent proposal [U. Leonhardt, Nature (London) 415, 406 (2002)] to generate a quantum catastrophe of slow light.
Theory of a slow-light catastrophe
Leonhardt, Ulf
2002-04-01
In diffraction catastrophes such as the rainbow, the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole, the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. This paper describes the theory behind a recent proposal [U. Leonhardt, Nature (London) 415, 406 (2002)] to generate a quantum catastrophe of slow light.
Theory of a Slow-Light Catastrophe
Leonhardt, Ulf
2001-01-01
In diffraction catastrophes such as the rainbow, the wave nature of light resolves ray singularities and draws delicate interference patterns. In quantum catastrophes such as the black hole, the quantum nature of light resolves wave singularities and creates characteristic quantum effects related to Hawking radiation. The paper describes the theory behind a recent proposal [U. Leonhardt, arXiv:physics/0111058, Nature (in press)] to generate a quantum catastrophe of slow light.
Extensional rheometer based on viscoelastic catastrophes outline
DEFF Research Database (Denmark)
2014-01-01
The present invention relates to a method and a device for determining viscoelastic properties of a fluid. The invention resides inter alia in the generation of viscoelastic catastrophes in confined systems for use in the context of extensional rheology. The viscoelastic catastrophe is according ...... to the invention generated in a bistable fluid system, and the flow conditions for which the catastrophe occurs can be used as a fingerprint of the fluid's viscoelastic properties in extensional flow....
Catastrophizing in Patients with Burning Mouth Syndrome
Directory of Open Access Journals (Sweden)
Ana ANDABAK ROGULJ
2014-01-01
Background: Burning mouth syndrome (BMS) is an idiopathic painful condition which manifests as burning sensations in the oral cavity in patients with clinically normal oral mucosa and without any local and/or systemic causative factor. Catastrophizing is defined as an exaggerated negative orientation toward pain stimuli and pain experience. The aim of this study was to examine the association between catastrophizing and clinical parameters of BMS, and between catastrophizing and the quality of life in patients with BMS. Materials and methods: An anonymous questionnaire consisting of three parts (demographic and clinical data with a 100 mm visual analogue scale (VAS), the Croatian version of the Oral Health Impact Profile (OHIP-14) scale, and the Croatian version of the Pain Catastrophizing scale (PC)) was distributed to 30 patients diagnosed with BMS. Results: A clinically significant higher level of catastrophizing was found in 30% of the patients. The total catastrophizing score and all three subcomponents of catastrophizing correlated significantly with the intensity of symptoms, but not with their duration. Gender and previous treatment did not affect catastrophizing. Conclusion: Information about catastrophizing could help clinicians identify patients with negative behavioural patterns. Additional psychological intervention in these individuals could reduce or eliminate negative cognitive factors and improve coping with a chronic painful condition such as BMS.
Schiemann, Reinhard; Roberts, Charles J.; Bush, Stephanie; Demory, Marie-Estelle; Strachan, Jane; Vidale, Pier Luigi; Mizielinski, Matthew S.; Roberts, Malcolm J.
2015-04-01
Precipitation over land exhibits a high degree of variability due to the complex interaction of the precipitation generating atmospheric processes with coastlines, the heterogeneous land surface, and orography. Global general circulation models (GCMs) have traditionally had very limited ability to capture this variability on the mesoscale (here ~50-500 km) due to their low resolution. This has changed with recent investments in resolution, and ensembles of multidecadal climate simulations of atmospheric GCMs (AGCMs) with ~25 km grid spacing are becoming increasingly available. Here, we evaluate the mesoscale precipitation distribution in one such set of simulations obtained in the UPSCALE (UK on PrACE - weather-resolving Simulations of Climate for globAL Environmental risk) modelling campaign with the HadGEM-GA3 AGCM. Increased model resolution also poses new challenges to the observational datasets used to evaluate models. Global gridded data products such as those provided by the Global Precipitation Climatology Project (GPCP) are invaluable for assessing large-scale features of the precipitation distribution but may not sufficiently resolve mesoscale structures. In the absence of independent estimates, the intercomparison of different observational datasets may be the only way to get some insight into the uncertainties associated with these observations. Here, we focus on mid-latitude continental regions where observations based on higher-density gauge networks are available in addition to the global data sets: Europe/the Alps, South and East Asia, and the continental US. The ability of GCMs to represent mesoscale variability is of interest in its own right, as climate information on this scale is required by impact studies. An additional motivation for the research proposed here arises from continuing efforts to quantify the components of the global radiation budget and water cycle. Recent estimates based on radiation measurements suggest that the global mean
Energy Technology Data Exchange (ETDEWEB)
Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Ikonen, Timo
2016-08-15
Highlights: • A sensitivity analysis using the data from EPR LB-LOCA simulations is performed. • A procedure to analyze such complex data is outlined. • Both visual and quantitative methods are used. • Input factors related to core design are identified as most significant. - Abstract: In this paper, a sensitivity analysis for the data originating from a large break loss-of-coolant accident (LB-LOCA) analysis of an EPR-type nuclear power plant is presented. In the preceding LOCA analysis, the number of failing fuel rods in the accident was established (Arkoma et al., 2015). However, the underlying causes for rod failures were not addressed. It is essential to bring out which input parameters and boundary conditions are significant for the outcome of the analysis, i.e. the ballooning and burst of the rods. Due to the complexity of the existing data, the first part of the analysis consists of defining the relevant input parameters for the sensitivity analysis. Then, selected sensitivity measures are calculated between the chosen input and output parameters. The ultimate goal is to develop a systematic procedure for the sensitivity analysis of statistical LOCA simulation that takes into account the various sources of uncertainties in the calculation chain. In the current analysis, the most relevant parameters with respect to the cladding integrity are the decay heat power during the transient, the thermal hydraulic conditions at the rod's location in the reactor, and the steady-state irradiation history of the rod. Meanwhile, the tolerances in fuel manufacturing parameters were found to have a negligible effect on cladding deformation.
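One common choice of sensitivity measure between sampled inputs and a code output is the Spearman rank correlation. The sketch below uses invented input-output relationships (the abstract does not specify which measures were selected):

```python
import numpy as np

rng = np.random.default_rng(3)

def spearman(x, y):
    """Spearman rank correlation: a sensitivity measure between an input factor
    and an output, robust to monotone nonlinearity (no ties assumed here)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical samples: decay heat dominates the burst-relevant strain, while a
# manufacturing tolerance barely matters (relationship assumed for illustration).
n = 500
decay_heat = rng.uniform(0.9, 1.1, n)
tolerance = rng.uniform(-1, 1, n)
strain = decay_heat**3 + 0.01 * tolerance + rng.normal(0, 0.01, n)

print(round(spearman(decay_heat, strain), 2), round(spearman(tolerance, strain), 2))
```

Ranking the factors by the magnitude of such measures is what singles out decay heat, local thermal hydraulics, and irradiation history as the dominant inputs.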
CATASTROPHIC DISRUPTION OF COMET ISON
Energy Technology Data Exchange (ETDEWEB)
Keane, Jacqueline V.; Kleyna, Jan T.; Riesen, Timm-Emmanuel; Meech, Karen J. [Institute for Astronomy, University of Hawaii, 2680 Woodlawn Drive, Honolulu, HI 96822 (United States); Milam, Stefanie N.; Charnley, Steven B. [Astrochemistry Laboratory, NASA GSFC, MS 690, Greenbelt, MD 20771 (United States); Coulson, Iain M. [Joint Astronomy Center, 660 North Aohoku Place, Hilo, HI 96720 (United States); Sekanina, Zdenek [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States); Kracht, Rainer, E-mail: keane@ifa.hawaii.edu [Ostlandring 53, D-25335 Elmshorn, Schleswig-Holstein (Germany)
2016-11-10
We report submillimeter 450 and 850 μm dust continuum observations for comet C/2012 S1 (ISON) obtained at heliocentric distances 0.31-0.08 au prior to perihelion on 2013 November 28 (r_h = 0.0125 au). These observations reveal a rapidly varying dust environment in which the dust emission was initially point-like. As ISON approached perihelion, the continuum emission became an elongated dust column spread out over as much as 60″ (>10^5 km) in the anti-solar direction. Deconvolution of the November 28.04 850 μm image reveals numerous distinct clumps consistent with the catastrophic disruption of comet ISON, producing ~5.2 × 10^10 kg of submillimeter-sized dust. Orbital computations suggest that the SCUBA-2 emission peak coincides with the comet's residual nucleus.
Catastrophe Finance: An Emerging Discipline
Elsner, James B.; Burch, R. King; Jagger, Thomas H.
2009-08-01
While the recent disasters in the world's financial markets demonstrate that finance theory remains far from perfected, science also faces steep challenges in the quest to predict and manage the effects of natural disasters. Worldwide, as many as half a million people have died in disasters such as earthquakes, tsunamis, and tropical cyclones since the turn of the 21st century [Wirtz, 2008]. Further, natural disasters can lead to extreme financial losses, and independent financial collapses can be exacerbated by natural disasters. In financial cost, 2008 was the second most expensive year on record for such catastrophes and for financial market declines. These extreme events in the natural and financial realms push the issue of risk management to the fore, expose the deficiencies of existing knowledge and practice, and suggest that progress requires further research and training at the graduate level.
Catastrophic Disruption of Comet ISON
Keane, Jacqueline V.; Milam, Stefanie N.; Coulson, Iain M.; Kleyna, Jan T.; Sekanina, Zdenek; Kracht, Rainer; Riesen, Timm-Emmanuel; Meech, Karen J.; Charnley, Steven B.
2016-01-01
We report submillimeter 450 and 850 μm dust continuum observations for comet C/2012 S1 (ISON) obtained at heliocentric distances 0.31-0.08 au prior to perihelion on 2013 November 28 (r_h = 0.0125 au). These observations reveal a rapidly varying dust environment in which the dust emission was initially point-like. As ISON approached perihelion, the continuum emission became an elongated dust column spread out over as much as 60″ (>10^5 km) in the anti-solar direction. Deconvolution of the November 28.04 850 μm image reveals numerous distinct clumps consistent with the catastrophic disruption of comet ISON, producing approximately 5.2 × 10^10 kg of submillimeter-sized dust. Orbital computations suggest that the SCUBA-2 emission peak coincides with the comet's residual nucleus.
Climate Catastrophe - The Giant Swindle
International Nuclear Information System (INIS)
Doerell, P. E.
1998-01-01
Energy is the life-blood of civilization. More than 80% of global energy is supplied by fossil fuels. And this will continue for the foreseeable future - if an implementation of the Kyoto Protocol does not lead to a dramatic decrease of these fuels, causing worldwide turmoil of unprecedented dimensions. However, the scaremongering with a 'climate catastrophe' allegedly caused by 'greenhouse gas' emissions from the burning of fossil fuels is a huge hoax. Its only 'scientific' base is the IPCC management's enigmatic assessment: 'The balance of evidence suggests a discernable human influence on climate'. But even the IPCC had to admit at the World Energy Conference in Tokyo in 1996: 'We have no evidence'. And all the scaremongering assertions of the protagonists of 'global warming' have been convincingly refuted by the world elite of scientists. This paper will: - show how the whole anti-CO2 campaign has been manipulated from the very beginning until today; - give a great many scientific and logical reasons why the arguments of the scaremongers are incorrect; - outline the catastrophic economic and social consequences of the proposed anti-CO2 measures - without any benefit for the environment or climate; - name the driving forces behind this campaign and their interests. The witch-hunt against CO2 is an incredible scientific and political scandal; CO2 does not damage the environment at all, and labelling it a 'climate killer' is absurd. On the contrary, this gas is vital for life on our planet, and a stronger concentration of CO2 would be beneficial, doubling plant growth and thereby combatting global famine. And to pretend that we could influence the climate with a CO2 tax is insane arrogance. Man is absolutely helpless when confronted with the forces of nature. The squandering of multimillions of USD of taxpayers' money on the travelling circus of 'climate summits' and the stultification of the population must stop. The 'global warming' lie is the biggest
Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.
2017-12-01
Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible heat fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties in the surface fluxes have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCMs), cloud-resolving models (CRMs) and large-eddy simulation models (LES), especially for shallow cumulus clouds, which are mainly driven by surface forcing. This study aims to quantify the uncertainties in the large-scale forcing caused by surface turbulence flux measurements and to investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.
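For reference, the EBBR measurement principle reduces to a simple partition of the available energy; the numbers below are illustrative mid-day values, not SGP data:

```python
def ebbr_partition(rnet, g, bowen):
    """Partition available energy (Rn - G) into sensible (H) and latent (LE)
    heat fluxes from the Bowen ratio beta = H / LE, as an EBBR system does."""
    avail = rnet - g              # available energy at the surface
    le = avail / (1.0 + bowen)    # latent heat flux
    h = bowen * le                # sensible heat flux
    return h, le

# Illustrative mid-day values in W m^-2 (assumed for the example, not SGP data).
h, le = ebbr_partition(rnet=500.0, g=50.0, bowen=0.5)
print(h, le)   # 150.0 300.0
```

Because an ECOR system instead measures H and LE directly from turbulent covariances, the two instruments can disagree when their footprints sample different land surface types, which is exactly the uncertainty this study quantifies.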
Energy Technology Data Exchange (ETDEWEB)
Morales Prieto, M.; Ortega Saiz, P.
2011-07-01
Analysis of the analytical uncertainties of the process-simulation methodology used to obtain the end-of-irradiation isotopic inventory of spent fuel; the burnup-simulation part of the ARIANE experiment is explored.
International Nuclear Information System (INIS)
Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.
2016-01-01
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
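The ensemble Kalman idea can be sketched with a textbook stochastic-EnKF analysis step. The two-parameter "state" and the observation below are invented, and the authors' iterative scheme adds machinery not shown here:

```python
import numpy as np

rng = np.random.default_rng(4)

def enkf_analysis(ensemble, obs, obs_err, H):
    """One stochastic-EnKF analysis step: nudge each member toward perturbed
    observations using ensemble-estimated covariances (textbook form, not the
    authors' exact iterative scheme)."""
    n_ens = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)      # state anomalies
    HA = H @ ensemble                                        # predicted observations
    HAp = HA - HA.mean(axis=1, keepdims=True)
    Pxy = A @ HAp.T / (n_ens - 1)                            # state-obs covariance
    Pyy = HAp @ HAp.T / (n_ens - 1) + obs_err**2 * np.eye(len(obs))
    K = Pxy @ np.linalg.inv(Pyy)                             # Kalman gain
    perturbed = obs[:, None] + rng.normal(0, obs_err, (len(obs), n_ens))
    return ensemble + K @ (perturbed - HA)

# Toy state: two "Reynolds stress" parameters; only the first is observed.
prior = rng.normal([1.0, 2.0], 0.5, size=(200, 2)).T         # shape (2, n_ens)
H = np.array([[1.0, 0.0]])
posterior = enkf_analysis(prior, obs=np.array([0.2]), obs_err=0.05, H=H)
print(abs(posterior[0].mean() - 0.2) < 0.1)                  # pulled toward the data
```

In the paper's setting the state holds the parameterized Reynolds-stress discrepancy fields and the observation operator maps them through the RANS solver to the sparse velocity data.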
Energy Technology Data Exchange (ETDEWEB)
Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.
2016-11-01
Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
Adaptation to and Recovery from Global Catastrophe
Directory of Open Access Journals (Sweden)
Seth D. Baum
2013-03-01
Full Text Available Global catastrophes, such as nuclear war, pandemics and ecological collapse threaten the sustainability of human civilization. To date, most work on global catastrophes has focused on preventing the catastrophes, neglecting what happens to any catastrophe survivors. To address this gap in the literature, this paper discusses adaptation to and recovery from global catastrophe. The paper begins by discussing the importance of global catastrophe adaptation and recovery, noting that successful adaptation/recovery could have value on even astronomical scales. The paper then discusses how the adaptation/recovery could proceed and makes connections to several lines of research. Research on resilience theory is considered in detail and used to develop a new method for analyzing the environmental and social stressors that global catastrophe survivors would face. This method can help identify options for increasing survivor resilience and promoting successful adaptation and recovery. A key point is that survivors may exist in small isolated communities disconnected from global trade and, thus, must be able to survive and rebuild on their own. Understanding the conditions facing isolated survivors can help promote successful adaptation and recovery. That said, the processes of global catastrophe adaptation and recovery are highly complex and uncertain; further research would be of great value.
Catastrophe theory with application in nuclear technology
International Nuclear Information System (INIS)
Valeca, Serban Constantin
2002-01-01
The monograph is structured in the following seven chapters: 1. Correlation of risk, catastrophe and chaos at the level of polyfunctional systems with nuclear injection; 1.1 Approaching the risk at the level of power systems; 1.2 Modelling the chaos-catastrophe-risk correlation in the structure of integrated classical and nuclear processes; 2. Catastrophe theory applied in ecosystems: models and applications; 2.1 Posing the problems in catastrophe theory; 2.2 Application of catastrophe theory in the engineering of the power ecosystems with nuclear injection; 4. Decision of abatement of the catastrophic risk based on minimal costs; 4.1 The nuclear power systems sensitive to risk-catastrophe-chaos in the structure of minimal costs; 4.2 Evaluating the market structure on the basis of power minimal costs; 4.3 Decisions in power systems built on minimal costs; 5. Models of computing the minimal costs in classical and nuclear power systems; 5.1 Calculation methodologies of power minimal cost; 5.2 Calculation methods of minimal costs in the nuclear power sector; 6. Expert and neuro-expert systems for supervising the risk-catastrophe-chaos correlation; 6.1 The structure of expert systems; 6.2 Application of the neuro-expert program; 7. Conclusions and operational proposals; 7.1 A synthesis of the problems presented in this work; 7.2 Highlighting the novel aspects applicable in the power systems with nuclear injection
Does catastrophic thinking enhance oesophageal pain sensitivity?
DEFF Research Database (Denmark)
Martel, M O; Olesen, A E; Jørgensen, D
2016-01-01
that catastrophic thinking exerts an influence on oesophageal pain sensitivity, but not necessarily on the magnitude of acid-induced oesophageal sensitization. WHAT DOES THIS STUDY ADD?: Catastrophizing is associated with heightened pain sensitivity in the oesophagus. This was substantiated by assessing responses...
Karakoylu, E.; Franz, B.
2016-01-01
A first attempt at quantifying uncertainties in ocean remote sensing reflectance satellite measurements, based on 1000 Monte Carlo iterations. The data source is a SeaWiFS 4-day composite from 2003; the uncertainty is reported for remote sensing reflectance (Rrs) at 443 nm.
Physical Uncertainty Bounds (PUB)
Energy Technology Data Exchange (ETDEWEB)
Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2015-03-19
This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign, together with satellite and reanalysis data, are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and first-order effects dominate over interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for
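The variance decomposition described above attributes output variance to scheme choices. A minimal sketch (not from the paper) of a first-order, one-way ANOVA-style variance fraction is shown below; the factor names and ensemble sizes are illustrative, not those of the actual WRF experiments.

```python
import numpy as np

def first_order_variance_fraction(factor_levels, output):
    """Fraction of output variance explained by one factor,
    one-way ANOVA style: Var over levels of E[Y | level] / Var(Y)."""
    y = np.asarray(output, dtype=float)
    grand = y.mean()
    # variance of the conditional means, weighted by group size
    between = sum(
        (factor_levels == lv).sum() * (y[factor_levels == lv].mean() - grand) ** 2
        for lv in np.unique(factor_levels)
    ) / y.size
    return between / y.var()

# toy ensemble: output driven mostly by scheme choice A, little by B
rng = np.random.default_rng(0)
a = rng.integers(0, 3, 600)   # e.g. 3 hypothetical convection schemes
b = rng.integers(0, 2, 600)   # e.g. 2 hypothetical land-surface schemes
y = 2.0 * a + 0.1 * b + rng.normal(0, 0.2, 600)
print(first_order_variance_fraction(a, y))  # close to 1: factor A dominates
print(first_order_variance_fraction(b, y))  # near 0
```

Summing such fractions over all factors (and comparing to 1) indicates how much of the total variance is first-order versus interaction effects, which is the kind of statement the abstract makes for its 20 parameters.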
Treatment of catastrophic antiphospholipid syndrome
Kazzaz, Nayef M.; McCune, W. Joseph; Knight, Jason S.
2016-01-01
Purpose of review Catastrophic antiphospholipid syndrome (CAPS) is a severe manifestation of APS. While affecting only 1% of patients with APS, the condition is frequently fatal if not recognized and treated early. Here, we will review the current approach to diagnosis and treatment of CAPS. Recent findings Data from the international “CAPS registry,” spearheaded by the European Forum on Antiphospholipid Antibodies, have improved our understanding of at-risk patients, typical clinical features, and associated/precipitating diagnoses. Current guidelines also continue to support a role for anticoagulants and glucocorticoids as foundation therapy in all patients. Finally, new basic science and case series suggest that novel therapies, such as rituximab and eculizumab, warrant further study. Summary Attention to associated diagnoses such as infection and systemic lupus erythematosus (SLE) is critical at the time of diagnosis. All patients should be treated with anticoagulation, corticosteroids, and possibly plasma exchange. In patients with SLE, cyclophosphamide should also be considered. In refractory or relapsing cases, new therapies such as rituximab and possibly eculizumab may be options, but need further study. PMID:26927441
A probabilistic strategy for parametric catastrophe insurance
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
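The probabilistic trigger described above issues a probability of a loss event given a loss index. A minimal stand-in (assumed, not the paper's actual model) is a two-parameter logistic regression fitted by gradient descent; the index values and event data below are synthetic.

```python
import numpy as np

def fit_logistic(x, y, lr=0.1, steps=5000):
    """Fit P(loss event | index) = sigmoid(w0 + w1 * x) by gradient descent.
    A minimal stand-in for a logistic regression on a loss index."""
    w = np.zeros(2)
    X = np.column_stack([np.ones_like(x), x])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def event_probability(w, index_value):
    """Probability of a loss event for a given (standardized) index value."""
    return 1.0 / (1.0 + np.exp(-(w[0] + w[1] * index_value)))

# synthetic daily loss index and observed loss events:
# the higher the index, the likelier a loss occurred
rng = np.random.default_rng(1)
idx = rng.uniform(0, 100, 400)
events = (rng.uniform(0, 100, 400) < idx).astype(float)
w = fit_logistic((idx - 50) / 25, events)   # standardize the index
print(event_probability(w, (90 - 50) / 25))  # high index -> high probability
print(event_probability(w, (10 - 50) / 25))  # low index -> low probability
```

Issuing a probability rather than a binary trigger is what lets the payout structure acknowledge model uncertainty and reduce basis risk.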
Directory of Open Access Journals (Sweden)
A. Breinholt
2013-10-01
Full Text Available Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, most often models are calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction limits without statistical coherence and consistency and for the subjectivity in the choice of a threshold value to distinguish "behavioural" from "non-behavioural" parameter sets. In this paper we examine how well the GLUE methodology performs when the behavioural parameter sets deduced from a calibration period are applied to generate prediction bounds in validation periods. By retaining an increasing number of parameter sets we aim at obtaining consistency between the GLUE-generated 90% prediction limits and the actual containment ratio (CR) in calibration. Due to the large uncertainties related to spatio-temporal rain variability during heavy convective rain events, flow measurement errors, possible model deficiencies as well as epistemic uncertainties, it was not possible to obtain an overall CR of more than 80%. However, the GLUE-generated prediction limits still proved rather consistent, since the overall CRs obtained in calibration corresponded well with the overall CRs obtained in validation periods for all proportions of retained parameter sets evaluated. When focusing on wet and dry weather periods separately, some inconsistencies were however found between calibration and validation and we address here some of the reasons why we should not expect the coverage of the prediction limits to be identical in calibration and validation periods in real
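The two quantities at the heart of the abstract, GLUE prediction limits from the retained behavioural simulations and the containment ratio of observations inside them, can be sketched as follows (a toy setup with synthetic flows, not the paper's sewer model):

```python
import numpy as np

def prediction_limits(ensemble_flows, lower=0.05, upper=0.95):
    """Pointwise GLUE-style 90% prediction limits from the flows simulated
    by the retained ('behavioural') parameter sets (rows = parameter sets)."""
    return (np.quantile(ensemble_flows, lower, axis=0),
            np.quantile(ensemble_flows, upper, axis=0))

def containment_ratio(observed, lo, hi):
    """Fraction of observations falling inside the prediction limits."""
    return ((observed >= lo) & (observed <= hi)).mean()

# toy example: 200 behavioural simulations of a 500-step flow series
rng = np.random.default_rng(2)
truth = 10 + np.sin(np.linspace(0, 20, 500))
sims = truth + rng.normal(0, 1.0, (200, 500))  # spread across parameter sets
obs = truth + rng.normal(0, 1.0, 500)          # observed flow
lo, hi = prediction_limits(sims)
print(round(containment_ratio(obs, lo, hi), 2))  # near the nominal 0.90 here
```

In real applications the CR falls short of the nominal coverage (the 80% reported above) precisely because the ensemble spread does not capture all error sources; in this toy case the simulation and observation errors share the same distribution, so coverage is close to nominal.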
Pakyuz-Charrier, Evren; Lindsay, Mark; Ogarko, Vitaliy; Giraud, Jeremie; Jessell, Mark
2018-04-01
Three-dimensional (3-D) geological structural modeling aims to determine geological information in a 3-D space using structural data (foliations and interfaces) and topological rules as inputs. This is necessary in any project in which the properties of the subsurface matter; they express our understanding of geometries in depth. For that reason, 3-D geological models have a wide range of practical applications including but not restricted to civil engineering, the oil and gas industry, the mining industry, and water management. These models, however, are fraught with uncertainties originating from the inherent flaws of the modeling engines (working hypotheses, interpolator's parameterization) and the inherent lack of knowledge in areas where there are no observations combined with input uncertainty (observational, conceptual and technical errors). Because 3-D geological models are often used for impactful decision-making, it is critical that all 3-D geological models provide accurate estimates of uncertainty. This paper focuses on the effect of structural input data measurement uncertainty propagation in implicit 3-D geological modeling. This aim is achieved using Monte Carlo simulation for uncertainty estimation (MCUE), a stochastic method which samples from predefined disturbance probability distributions that represent the uncertainty of the original input data set. MCUE is used to produce hundreds to thousands of altered unique data sets. The altered data sets are used as inputs to produce a range of plausible 3-D models. The plausible models are then combined into a single probabilistic model as a means to propagate uncertainty from the input data to the final model. In this paper, several improved methods for MCUE are proposed. The methods pertain to distribution selection for input uncertainty, sample analysis and statistical consistency of the sampled distribution. Pole vector sampling is proposed as a more rigorous alternative than dip vector
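The MCUE sampling step, drawing many altered copies of the structural data set from disturbance distributions, can be sketched as below. Note this uses an isotropic normal in degrees purely for illustration; the paper argues for proper spherical distributions and pole-vector sampling, which are not implemented here.

```python
import numpy as np

def perturb_orientations(dip, azimuth, sigma_deg=5.0, n_sets=1000, seed=0):
    """Generate MCUE-style altered data sets by sampling a disturbance
    distribution around each measured dip/azimuth pair.
    Simplification: independent normals in degrees, not the spherical
    (e.g. von Mises-Fisher-type) distributions the method calls for."""
    rng = np.random.default_rng(seed)
    dips = dip + rng.normal(0, sigma_deg, (n_sets, len(dip)))
    azis = (azimuth + rng.normal(0, sigma_deg, (n_sets, len(azimuth)))) % 360
    return np.clip(dips, 0, 90), azis

measured_dip = np.array([30.0, 45.0, 60.0])
measured_azi = np.array([10.0, 350.0, 180.0])
dips, azis = perturb_orientations(measured_dip, measured_azi)
print(dips.shape)                    # (1000, 3): one plausible data set per row
print(round(dips[:, 0].mean(), 1))   # centred near the measured 30 degrees
```

Each row would then be fed to the implicit modeling engine, and the resulting model suite merged into a probabilistic model.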
Directory of Open Access Journals (Sweden)
E. Pakyuz-Charrier
2018-04-01
Full Text Available Three-dimensional (3-D) geological structural modeling aims to determine geological information in a 3-D space using structural data (foliations and interfaces) and topological rules as inputs. This is necessary in any project in which the properties of the subsurface matter; they express our understanding of geometries in depth. For that reason, 3-D geological models have a wide range of practical applications including but not restricted to civil engineering, the oil and gas industry, the mining industry, and water management. These models, however, are fraught with uncertainties originating from the inherent flaws of the modeling engines (working hypotheses, interpolator's parameterization) and the inherent lack of knowledge in areas where there are no observations combined with input uncertainty (observational, conceptual and technical errors). Because 3-D geological models are often used for impactful decision-making, it is critical that all 3-D geological models provide accurate estimates of uncertainty. This paper focuses on the effect of structural input data measurement uncertainty propagation in implicit 3-D geological modeling. This aim is achieved using Monte Carlo simulation for uncertainty estimation (MCUE), a stochastic method which samples from predefined disturbance probability distributions that represent the uncertainty of the original input data set. MCUE is used to produce hundreds to thousands of altered unique data sets. The altered data sets are used as inputs to produce a range of plausible 3-D models. The plausible models are then combined into a single probabilistic model as a means to propagate uncertainty from the input data to the final model. In this paper, several improved methods for MCUE are proposed. The methods pertain to distribution selection for input uncertainty, sample analysis and statistical consistency of the sampled distribution. Pole vector sampling is proposed as a more rigorous alternative than
International Nuclear Information System (INIS)
Chojnacki, E.; Benoit, J.P.
2007-01-01
Best estimate computer codes are increasingly used in the nuclear industry for accident management procedures and have been planned to be used for licensing procedures. Contrary to conservative codes, which are supposed to give penalizing results, best estimate codes attempt to calculate accidental transients in a realistic way. It therefore becomes of prime importance, in particular for a technical organization such as IRSN in charge of safety assessment, to know the uncertainty on the results of such codes. Thus, CSNI sponsored a few years ago the Uncertainty Methods Study (UMS) program (published in 1998) on uncertainty methodologies used for a SBLOCA transient (LSTF-CL-18) and is now supporting the BEMUSE program for a LBLOCA transient (LOFT-L2-5). The large majority of BEMUSE participants (9 out of 10) use uncertainty methodologies based on probabilistic modelling and all of them use Monte-Carlo simulations to propagate the uncertainties through their computer codes. Also, all of the 'probabilistic participants' intend to use order statistics to determine the sampling size of the Monte-Carlo simulation and to derive the uncertainty ranges associated with their computer calculations. The first aim of this paper is to recall the advantages and also the assumptions of probabilistic modelling and more specifically of order statistics (such as Wilks' formula) in uncertainty methodologies. Indeed, Monte-Carlo methods provide flexible and extremely powerful techniques for solving many of the uncertainty propagation problems encountered in nuclear safety analysis. However, it is important to keep in mind that probabilistic methods are data intensive. That is, probabilistic methods cannot produce robust results unless a considerable body of information has been collected. A main advantage of order statistics results is to allow an unlimited number of uncertain parameters to be taken into account and, from a restricted number of code calculations, to provide statistical
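Wilks' formula, mentioned above, fixes the Monte-Carlo sample size independently of the number of uncertain parameters. A short sketch of the one-sided tolerance-limit computation (standard textbook form, not taken from this paper):

```python
import math

def wilks_sample_size(beta=0.95, gamma=0.95, order=1):
    """Smallest number of code runs N such that the order-th largest
    sampled output bounds the beta-quantile with confidence gamma,
    per Wilks' formula for one-sided tolerance limits: the smallest N with
    1 - sum_{k=0}^{order-1} C(N,k) * beta^(N-k) * (1-beta)^k >= gamma."""
    n = order
    while True:
        conf = 1.0 - sum(
            math.comb(n, k) * beta ** (n - k) * (1 - beta) ** k
            for k in range(order)
        )
        if conf >= gamma:
            return n
        n += 1

print(wilks_sample_size(0.95, 0.95))            # 59 runs for the 95%/95% limit
print(wilks_sample_size(0.95, 0.95, order=2))   # 93 runs using the 2nd largest
```

The classic 59-run result for a 95%/95% one-sided limit is why order statistics keep the number of best-estimate code calculations tractable regardless of how many input uncertainties are sampled.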
Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon
2018-01-01
The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric model of Bennu (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase. The model developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present the analysis of the error for the photometric corrections. Based on our test data sets, we find: 1. The model uncertainties are correct only when calculated with the covariance matrix, because the parameters are highly correlated. 2. No parameter dominates in any model. 3. Model error and data error contribute comparably to the final correction error. 4. We tested the uncertainty module on simulated and real data sets and found that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases. 5. The L-S model is more reliable than the others, possibly because the simulated data are based on the L-S model; however, the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable when calculating Bond albedo. The uncertainty of the McEwen model is large in most cases. Akimov performs unphysically on the SOPIE 1 data. 6. L-S is the better default choice; this conclusion is based mainly on our tests on the SOPIE data and IPDIF.
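Finding 1 above, that uncertainties are only correct when the full covariance matrix is used, is a general property of first-order error propagation with correlated parameters. A generic sketch (not the mission's actual photometric models or numbers):

```python
import numpy as np

def model_variance(jacobian, cov):
    """First-order propagation of parameter uncertainty to a model output:
    sigma_f^2 = J C J^T. With strongly correlated fitted parameters,
    using only the diagonal of C gives the wrong answer."""
    j = np.asarray(jacobian)
    return float(j @ cov @ j)

# two strongly anti-correlated parameters (correlation coefficient -0.9)
cov = np.array([[1.0, -0.9],
                [-0.9, 1.0]])
jac = np.array([1.0, 1.0])   # output equally sensitive to both parameters

full = model_variance(jac, cov)                        # 1 + 1 - 2*0.9 = 0.2
diag_only = model_variance(jac, np.diag(np.diag(cov))) # 2.0
print(full, diag_only)  # ignoring the correlation overstates the variance 10x
```

The anti-correlation lets the two parameter errors cancel, which the diagonal-only estimate cannot see; with positively correlated parameters the bias goes the other way.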
Catastrophic event modeling. [lithium thionyl chloride batteries
Frank, H. A.
1981-01-01
A mathematical model for the catastrophic failures (venting or explosion of the cell) in lithium thionyl chloride batteries is presented. The phenomenology of the various processes leading to cell failure is reviewed.
Application of Catastrophe Risk Modelling to Evacuation Public Policy
Woo, G.
2009-04-01
The decision by civic authorities to evacuate an area threatened by a natural hazard is especially fraught when the population in harm's way is extremely large, and where there is considerable uncertainty in the spatial footprint, scale, and strike time of a hazard event. Traditionally viewed as a hazard forecasting issue, civil authorities turn to scientists for advice on a potentially imminent dangerous event. However, the level of scientific confidence varies enormously from one peril and crisis situation to another. With superior observational data, meteorological and hydrological hazards are generally better forecast than geological hazards. But even with Atlantic hurricanes, the track and intensity of a hurricane can change significantly within a few hours. This complicated and delayed the decision to call an evacuation of New Orleans when threatened by Hurricane Katrina, and would present a severe dilemma if a major hurricane were appearing to head for New York. Evacuation needs to be perceived as a risk issue, requiring the expertise of catastrophe risk modellers as well as geoscientists. Faced with evidence of a great earthquake in the Indian Ocean in December 2004, seismologists were reluctant to give a tsunami warning without more direct sea observations. Yet, from a risk perspective, the risk to coastal populations would have warranted attempts at tsunami warning, even though there was significant uncertainty in the hazard forecast, and chance of a false alarm. A systematic coherent risk-based framework for evacuation decision-making exists, which weighs the advantages of an evacuation call against the disadvantages. Implicitly and qualitatively, such a cost-benefit analysis is undertaken by civic authorities whenever an evacuation is considered. With the progress in catastrophe risk modelling, such an analysis can be made explicit and quantitative, providing a transparent audit trail for the decision process. A stochastic event set, the core of a
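The cost-benefit analysis described above can be made explicit in a deliberately minimal form: evacuate when the expected loss avoided exceeds the cost of calling the evacuation. The threshold rule and the numbers below are illustrative, not from the abstract.

```python
def evacuate(p_event, expected_loss_if_no_evac, evacuation_cost):
    """Risk-based evacuation call: evacuate when the expected loss avoided
    exceeds the (economic and social) cost of the evacuation itself.
    A minimal reading of the cost-benefit framing; real decision models
    weigh many more factors, including false-alarm effects."""
    return p_event * expected_loss_if_no_evac > evacuation_cost

# e.g. a 10% strike probability with 5000 loss units at stake,
# against an evacuation costing 100 units
print(evacuate(0.10, 5000, 100))  # True: expected avoided loss 500 > 100
print(evacuate(0.01, 5000, 100))  # False: 50 < 100
```

Even this toy rule shows why a tsunami warning could be warranted at low forecast confidence: with a sufficiently large exposed population, a small event probability still dominates the cost of a false alarm.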
International Nuclear Information System (INIS)
Povilaitis, Mantas; Kelm, Stephan; Urbonavičius, Egidijus
2017-01-01
Highlights: • Uncertainty and sensitivity analysis for the Generic Containment severe accident. • Comparison of the analysis results with the uncertainties based on the user effect. • Demonstration of the similar importance of reducing both the user effect and input uncertainties. - Abstract: Uncertainties in safety assessment of nuclear power plants using computer codes come from several sources: choice of computer code, user effect (a strong impact of user choices on the simulation’s outcome) and uncertainty of various physical parameters. The “Generic Containment” activity was performed in the frame of the EU-FP7 project SARNET2 to investigate the influence of the user effect and computer code choice on the results at nuclear power plant scale. During this activity, a Generic Containment nodalisation was developed and used for the exercise by the participants applying various computer codes. Even though the model of the Generic Containment and the transient scenario were precisely and uniquely defined, considerably different results were obtained not only among different codes but also among participants using the same code, showing significant influence of the user effect. This paper presents an analysis that extends the “Generic Containment” benchmark and investigates the effect of input-parameter uncertainties in comparison to the user effect. Calculations were performed using the computer code ASTEC; the uncertainty and sensitivity of the results were estimated using the GRS method and the tool SUSA. The results of the present analysis show that, while there are differences between the uncertainty bands of the parameters, in general the deviation bands caused by parameter uncertainty and the user effect are comparable and of the same order. The properties of concrete and the surface areas may have more influence on containment pressure than the user effect and choice of computer code as identified in the SARNET2 Generic
Trendafiloski, G.; Gaspa Rebull, O.; Ewing, C.; Podlaha, A.; Magee, B.
2012-04-01
Calibration and validation are crucial steps in the production of catastrophe models for the insurance industry in order to assure the model's reliability and to quantify its uncertainty. Calibration is needed in all components of model development, including hazard and vulnerability. Validation is required to ensure that the losses calculated by the model match those observed in past events and which could happen in future. Impact Forecasting, the catastrophe modelling development centre of excellence within Aon Benfield, has recently launched its earthquake model for Algeria as a part of the earthquake model for the Maghreb region. The earthquake model went through a detailed calibration process including: (1) the seismic intensity attenuation model, by use of macroseismic observations and maps from past earthquakes in Algeria; (2) calculation of country-specific vulnerability modifiers, by use of past damage observations in the country. The Benouar (1994) ground motion prediction relationship proved the most appropriate for our model. Calculation of the regional vulnerability modifiers for the country led to 10% to 40% larger vulnerability indexes for different building types compared to average European indexes. The country-specific damage models also included aggregate damage models for residential, commercial and industrial properties, considering the description of the building stock given by the World Housing Encyclopaedia and local rebuilding cost factors equal to 10% for damage grade 1, 20% for damage grade 2, 35% for damage grade 3, 75% for damage grade 4 and 100% for damage grade 5. The damage grades comply with the European Macroseismic Scale (EMS-1998). The model was validated by use of "as-if" historical scenario simulations of three past earthquake events in Algeria: the M6.8 2003 Boumerdes, M7.3 1980 El-Asnam and M7.3 1856 Djidjelli earthquakes. The calculated return periods of the losses for client market portfolio align with the
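The rebuilding cost factors quoted above map EMS-1998 damage grades to repair cost as a fraction of rebuilding value. A small sketch of how such factors would enter an expected-loss calculation (the damage-grade probabilities are hypothetical, only the cost factors come from the text):

```python
# Rebuilding-cost factors per EMS-1998 damage grade, as stated in the abstract
COST_FACTOR = {1: 0.10, 2: 0.20, 3: 0.35, 4: 0.75, 5: 1.00}

def mean_damage_ratio(grade_probabilities):
    """Expected repair cost as a fraction of rebuilding value, given the
    probability of each damage grade (grade 0 = no damage is implicit)."""
    return sum(COST_FACTOR[g] * p for g, p in grade_probabilities.items())

# hypothetical vulnerability-model output for one building class at one site
dist = {1: 0.30, 2: 0.20, 3: 0.10, 4: 0.05, 5: 0.02}
print(round(mean_damage_ratio(dist), 4))  # 0.1625
```

Multiplying this mean damage ratio by the insured rebuilding value, and aggregating over the portfolio, gives the modelled loss that validation compares against observed historical losses.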
International Nuclear Information System (INIS)
Andres, T.H.
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
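Two of the propagation routes named above, series approximation and Monte Carlo, can be compared on a toy single-input model (the function and numbers are illustrative, not from the guide):

```python
import numpy as np

def propagate_series(mean, sigma, f, dfdx):
    """First-order series approximation: sigma_y ~ |f'(x0)| * sigma_x."""
    return abs(dfdx(mean)) * sigma

def propagate_monte_carlo(mean, sigma, f, n=200_000, seed=0):
    """Monte Carlo propagation of one normally distributed input through f."""
    rng = np.random.default_rng(seed)
    return f(rng.normal(mean, sigma, n)).std()

f = lambda x: x ** 2        # toy model output
dfdx = lambda x: 2 * x      # its analytic derivative

print(round(propagate_series(10.0, 0.1, f, dfdx), 3))  # 2.0
print(round(propagate_monte_carlo(10.0, 0.1, f), 3))   # ~2.0 for this mild case
```

The two agree here because the input uncertainty is small relative to the curvature of f; for strongly nonlinear models or large uncertainties the Monte Carlo result departs from the first-order estimate, which is why the guide lists both.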
Energy Technology Data Exchange (ETDEWEB)
Andres, T.H
2002-05-01
This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
Catastrophic avalanches and methods of their control
Directory of Open Access Journals (Sweden)
N. A. Volodicheva
2014-01-01
Full Text Available A definition of the phenomenon of a “catastrophic avalanche” is presented in this article. Several situations involving releases of catastrophic avalanches in the mountains of the Caucasus, the Alps, and Central Asia are investigated. Materials of snow-avalanche observations performed since the 1960s at the Elbrus station of the Lomonosov Moscow State University (Central Caucasus) were used for this work. Engineering protection measures demonstrating different efficiencies are considered.
Pricing catastrophic bonds for earthquakes in Mexico
Cabrera, Brenda López
2006-01-01
After the occurrence of a natural disaster, the reconstruction can be financed with catastrophic bonds (CAT bonds) or reinsurance. For insurers, reinsurers and other corporations CAT bonds provide multi year protection without the credit risk present in reinsurance. For investors CAT bonds offer attractive returns and reduction of portfolio risk, since CAT bonds defaults are uncorrelated with defaults of other securities. As the study of natural catastrophe models plays an important role in t...
PARAMETRIC INSURANCE COVER FOR NATURAL CATASTROPHE RISKS
Directory of Open Access Journals (Sweden)
Serghei Margulescu
2013-11-01
Full Text Available With economic losses of over USD 370 bn caused by 325 catastrophic events, 2011 ranks as the worst ever year in terms of costs to society due to natural catastrophes and man-made disasters. At the same time, 2011 is the second most expensive year in history for the insurance industry, with insured losses from catastrophic events amounting to USD 116 bn. Both the high level of damages and insured losses, as well as the unprecedented gap between the two values, made insurers and reinsurers worldwide understand that some risks had so far been underestimated and have to be better integrated in catastrophe modelling. On the other hand, governments have to protect themselves against the financial impact of natural catastrophes, and new forms of cooperation between the public and private sectors can help countries finance disaster risks. Viewed in a country’s wider risk management context, the purchase of parametric insurance cover, which transfers natural catastrophe risk to the private sector using an index-based trigger, is a necessary shift towards a pre-emptive risk management strategy. This kind of approach can be pursued by central governments or at the level of provincial or municipal governments, and a number of case studies included in the publication “Closing the financial gap” by Swiss Re (2011) illustrate how new forms of parametric insurance can help countries finance disaster risks.
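An index-based trigger of the kind described above can be sketched as a simple linear payout between an attachment and an exhaustion point. The hazard-index values, attachment/exhaustion levels and limit below are illustrative, not taken from any actual cover:

```python
def parametric_payout(index_value, attachment, exhaustion, limit):
    """Linear parametric payout on a hazard index.

    Below the attachment point no payout is due; at or above the
    exhaustion point the full limit is paid; in between, the payout
    scales linearly with the index.
    """
    if index_value <= attachment:
        return 0.0
    if index_value >= exhaustion:
        return float(limit)
    share = (index_value - attachment) / (exhaustion - attachment)
    return share * limit

# e.g. a hypothetical wind-speed index cover: attachment 120, exhaustion 180
payout = parametric_payout(150, attachment=120, exhaustion=180, limit=50e6)
```

Because the payout depends only on the published index, settlement requires no loss adjustment on the ground, which is the main attraction of parametric cover for governments.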
Energy Technology Data Exchange (ETDEWEB)
Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)
1999-04-30
This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University sponsored by Grant Number DE-FG05-95ER30250 by the U.S. Department of Energy. The title of the project is “Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments.” The work conducted under sponsorship of this grant pertains primarily to two main topics: (1) development of new methods for quantitative analysis of variability and uncertainty applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two main topics are reported separately in Volumes 1 and 2.
International Nuclear Information System (INIS)
Kovscek, A.R.; Wang, Y.
2005-01-01
Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline-based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to the true reservoir geology. The streamline approach is demonstrated to be thorough and rapid
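The idea of selecting a representative subset of equi-probable models from a cheap proxy can be sketched with a greedy max-min (farthest-point) selection over proxy response curves. This is an illustrative stand-in for the paper's streamline-based similarity measure; the response curves below are hypothetical:

```python
import math

def euclid(a, b):
    """Euclidean distance between two proxy response curves."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_representatives(responses, k):
    """Greedy max-min selection of k representative models.

    Starting from the first realization, repeatedly add the realization
    whose proxy response is farthest from all already-chosen ones, so
    the subset spans the spread of the ensemble.
    """
    chosen = [0]
    while len(chosen) < k:
        best, best_d = None, -1.0
        for i in range(len(responses)):
            if i in chosen:
                continue
            d = min(euclid(responses[i], responses[j]) for j in chosen)
            if d > best_d:
                best, best_d = i, d
        chosen.append(best)
    return chosen

# four hypothetical proxy response curves for four reservoir realizations
curves = [[0.0, 0.1], [0.0, 0.12], [1.0, 0.9], [0.5, 0.5]]
subset = select_representatives(curves, 2)
```

Realizations whose proxy responses nearly coincide (the first two curves) are represented by a single member, while dissimilar realizations are retained, which is the behaviour the work flow relies on.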
International Nuclear Information System (INIS)
Ramarohetra, Johanna; Pohl, Benjamin; Sultan, Benjamin
2015-01-01
The challenge of estimating the potential impacts of climate change has led to an increasing use of dynamical downscaling to produce fine spatial-scale climate projections for impact assessments. In this work, we analyze if and to what extent the bias in the simulated crop yield can be reduced by using the Weather Research and Forecasting (WRF) regional climate model to downscale ERA-Interim (European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis) rainfall and radiation data. Then, we evaluate the uncertainties resulting from both the choice of the physical parameterizations of the WRF model and its internal variability. Impact assessments were performed at two sites in Sub-Saharan Africa and by using two crop models to simulate Niger pearl millet and Benin maize yields. We find that the use of the WRF model to downscale ERA-Interim climate data generally reduces the bias in the simulated crop yield, yet this reduction in bias strongly depends on the choices in the model setup. Among the physical parameterizations considered, we show that the choice of the land surface model (LSM) is of primary importance. When there is no coupling with an LSM, or when the LSM is too simplistic, the simulated precipitation, and hence the simulated yield, is null or very low, respectively; therefore, coupling with an LSM is necessary. The convective scheme is the second most influential scheme for yield simulation, followed by the shortwave radiation scheme. The uncertainties related to the internal variability of the WRF model are also significant and reach up to 30% of the simulated yields. These results suggest that regional models need to be used more carefully in order to improve the reliability of impact assessments. (letter)
International Nuclear Information System (INIS)
Mara, S.J.
1980-03-01
SRI International has projected the rate, duration, and magnitude of geomorphic processes and events in the Southwest and Gulf Coast over the next million years. This information will be used by the Department of Energy's Pacific Northwest Laboratory (PNL) as input to a computer model, which will be used to simulate possible release scenarios and the consequences of the release of nuclear waste from geologic containment. The estimates in this report, although based on best scientific judgment, are subject to considerable uncertainty. An evaluation of the Quaternary history of the two study areas revealed that each had undergone geomorphic change in the last one million years. Catastrophic events were evaluated in order to determine their significance to the simulation model. Given available data, catastrophic floods are not expected to occur in the two study areas. Catastrophic landslides may occur in the Southwest, but because the duration of the event is brief and the amount of material moved is small in comparison to regional denudation, such events need not be included in the simulation model. Ashfalls, however, could result in removal of vegetation from the landscape, thereby causing significant increases in erosion rates. Because the estimates developed during this study may not be applicable to specific sites, general equations were presented as a first step in refining the analysis. These equations identify the general relationships among the important variables and suggest those areas of concern for which further data are required. If the current model indicates that geomorphic processes (taken together with other geologic changes) may ultimately affect the geologic containment of nuclear waste, further research may be necessary to refine this analysis for application to specific sites
International Nuclear Information System (INIS)
Lovius, L.; Norman, S.; Kjellbert, N.
1990-02-01
An assessment has been made of the impact of spatial variability on the performance of a KBS-3 type repository. The uncertainties in geohydrologically related performance measures have been investigated using conductivity data from one of the Swedish study sites. The analysis was carried out with the PROPER code and the FSCF10 submodel. (authors)
Directory of Open Access Journals (Sweden)
Navid Hooshangi
2018-01-01
Full Text Available Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as during urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model of post-earthquake USAR operations using a geospatial information system (GIS) and multi-agent systems (MASs). We also propose an approach for dynamic task allocation and establishing collaboration among agents based on the contract net protocol (CNP) and the interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which considers uncertainty in natural hazards information during agents’ decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, an earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran’s District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and complete vulnerability classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the CNP method in terms of USAR operational time (at least a 13% decrease) and the number of human fatalities (at least a 9% decrease). In the interval uncertainty analysis of our proposed simulated system, the lower and upper bounds of uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in a disaster environment. Such systems can be used to manage and prepare for natural hazards.
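The TOPSIS ranking step used for task allocation can be sketched as follows. This is the classic crisp TOPSIS, omitting the paper's interval extension; the decision matrix (candidate rescue-team assignments scored on travel time and capability) and the AHP-style weights are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with classic TOPSIS.

    matrix[i][j]: score of alternative i on criterion j
    weights[j]:   criterion weight (e.g. from AHP, summing to 1)
    benefit[j]:   True if higher is better on criterion j, False for costs
    """
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, ideal)))
        d_neg = math.sqrt(sum((x - a) ** 2 for x, a in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal
    return scores

# three candidate assignments scored on (travel time [cost], capability [benefit])
scores = topsis([[10, 0.8], [5, 0.6], [8, 0.9]],
                weights=[0.5, 0.5], benefit=[False, True])
best = scores.index(max(scores))
```

The interval variant used in the study replaces each crisp score with a lower/upper bound pair, so the closeness coefficient itself becomes an interval from which the reported uncertainty bounds follow.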
Non-catastrophic and catastrophic fractures in racing Thoroughbreds at the Hong Kong Jockey Club.
Sun, T C; Riggs, C M; Cogger, N; Wright, J; Al-Alawneh, J I
2018-04-19
Reports of fractures in racehorses have predominantly focused on catastrophic injuries, and there is limited data identifying the location and incidence of fractures that did not result in a fatal outcome. To describe the nature and the incidence of non-catastrophic and catastrophic fractures in Thoroughbreds racing at the Hong Kong Jockey Club (HKJC) over seven racing seasons. Retrospective cohort study. Data of fractures sustained in horses while racing and of race characteristics were extracted from the HKJC Veterinary Management Information System (VMIS) and Racing Information System (RIS) respectively. The fracture event was determined from the first clinical entry for each specific injury. The incidence rates of non-catastrophic and catastrophic fractures were calculated per 1000 racing starts for racetrack, age, racing season, sex and trainer. 179 first fracture events occurred in 64,807 racing starts. The incidence rate of non-catastrophic fractures was 2.2 per 1000 racing starts and of catastrophic fractures was 0.6 per 1000 racing starts. Fractures of the proximal sesamoid bones represented 55% of all catastrophic fractures while the most common non-catastrophic fractures involved the carpus and the first phalanx. Significant associations were detected between the incidence of non-catastrophic fractures and sex, trainer and racing season. The first fracture event was used to calculate the incidence rate in this study and may have resulted in underestimation of the true incidence rate of fractures in this population. However, given the low number of recorded fracture events compared to the size of the study population, this underestimation is likely to be small. There were 3.6 times as many non-catastrophic fractures as catastrophic fractures in Thoroughbreds racing in Hong Kong between 2004 and 2011. Non-catastrophic fractures interfere with race training schedules and may predispose to catastrophic fracture. Future analytical studies on non-catastrophic
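The reported rates follow from simple arithmetic on the cohort. The 140/39 split of the 179 first fracture events below is inferred from the rounded rates and the 3.6:1 ratio quoted in the abstract, not stated there directly:

```python
def incidence_per_1000(events, starts):
    """Crude incidence rate per 1000 racing starts."""
    return 1000.0 * events / starts

starts = 64807          # racing starts over the seven seasons
non_cat, cat = 140, 39  # assumed split of the 179 first fracture events

non_cat_rate = incidence_per_1000(non_cat, starts)  # about 2.2
cat_rate = incidence_per_1000(cat, starts)          # about 0.6
ratio = non_cat / cat                               # about 3.6
```

The assumed split reproduces all three published figures at the stated rounding, which is a useful sanity check when only rounded rates are reported.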
DOWNWARD CATASTROPHE OF SOLAR MAGNETIC FLUX ROPES
Energy Technology Data Exchange (ETDEWEB)
Zhang, Quanhao; Wang, Yuming; Hu, Youqiu; Liu, Rui, E-mail: zhangqh@mail.ustc.edu.cn [CAS Key Laboratory of Geospace Environment, Department of Geophysics and Planetary Sciences, University of Science and Technology of China, Hefei 230026 (China)
2016-07-10
2.5-dimensional time-dependent ideal magnetohydrodynamic (MHD) models in Cartesian coordinates were used in previous studies to seek MHD equilibria involving a magnetic flux rope embedded in a bipolar, partially open background field. As demonstrated by these studies, the equilibrium solutions of the system are separated into two branches: the flux rope sticks to the photosphere for solutions at the lower branch but is suspended in the corona for those at the upper branch. Moreover, a solution originally at the lower branch jumps to the upper, as the related control parameter increases and reaches a critical value, and the associated jump is here referred to as an upward catastrophe. The present paper advances these studies in three aspects. First, the magnetic field is changed to be force-free; the system still experiences an upward catastrophe with an increase in each control parameter. Second, under the force-free approximation, there also exists a downward catastrophe, characterized by the jump of a solution from the upper branch to the lower. Both catastrophes are irreversible processes connecting the two branches of equilibrium solutions so as to form a cycle. Finally, the magnetic energy in the numerical domain is calculated. It is found that there exists a magnetic energy release for both catastrophes. The Ampère's force, which vanishes everywhere for force-free fields, appears only during the catastrophes and does positive work, which serves as a major mechanism for the energy release. The implications of the downward catastrophe and its relevance to solar activities are briefly discussed.
DOWNWARD CATASTROPHE OF SOLAR MAGNETIC FLUX ROPES
International Nuclear Information System (INIS)
Zhang, Quanhao; Wang, Yuming; Hu, Youqiu; Liu, Rui
2016-01-01
2.5-dimensional time-dependent ideal magnetohydrodynamic (MHD) models in Cartesian coordinates were used in previous studies to seek MHD equilibria involving a magnetic flux rope embedded in a bipolar, partially open background field. As demonstrated by these studies, the equilibrium solutions of the system are separated into two branches: the flux rope sticks to the photosphere for solutions at the lower branch but is suspended in the corona for those at the upper branch. Moreover, a solution originally at the lower branch jumps to the upper, as the related control parameter increases and reaches a critical value, and the associated jump is here referred to as an upward catastrophe. The present paper advances these studies in three aspects. First, the magnetic field is changed to be force-free; the system still experiences an upward catastrophe with an increase in each control parameter. Second, under the force-free approximation, there also exists a downward catastrophe, characterized by the jump of a solution from the upper branch to the lower. Both catastrophes are irreversible processes connecting the two branches of equilibrium solutions so as to form a cycle. Finally, the magnetic energy in the numerical domain is calculated. It is found that there exists a magnetic energy release for both catastrophes. The Ampère's force, which vanishes everywhere for force-free fields, appears only during the catastrophes and does positive work, which serves as a major mechanism for the energy release. The implications of the downward catastrophe and its relevance to solar activities are briefly discussed.
Severe catastrophes and public reactions
International Nuclear Information System (INIS)
Osmachkin, Vitaly
2002-01-01
nuclear opposition. The economic basis of the stagnation of nuclear energy lies in its not very successful competition with fossil-fuel energy production technologies; much money has been spent on improving the safety of NPPs. The social roots of the opposition are linked with the public's bad experience of demonstrations of nuclear energy: the explosion of atomic bombs, the contamination of territories after nuclear arms tests, and the misfortunes at TMI-2 and Chernobyl have created a stable enmity towards, and non-acceptance of, everything connected with the 'atom'. The mass media have strongly promoted the dissemination of fear of radiation exposure. Radiation protection regulation has also influenced this attitude through its declaration of a linear no-threshold dependence between radiation detriment and dose of exposure. Such a concept ignores the adaptive features of all living things, yet modern studies have shown that protracted irradiation at a given dose is much less dangerous than acute irradiation; recognizing this could change the public attitude to nuclear energy in society. Role of nuclear communication in informing the public: public reactions to various technological and man-made events differ significantly and are determined not by the scale of the catastrophe but by the mental impression and the multiplication of psychological stresses in society by the mass media. In the present situation the nuclear community has to improve its contacts with the public and launch a more effective campaign to explain the real advantages of nuclear power. It needs to compare the risks of climate warming and the health detriments of different electricity production technologies, and to show that nuclear power is the single alternative to all fossil-fuel-burning techniques of electricity production. It is true that nuclear power is a real means of fighting to suppress greenhouse gas emissions, is it not? (author)
Environmental catastrophes under time-inconsistent preference
Energy Technology Data Exchange (ETDEWEB)
Michielsen, T.
2013-02-15
I analyze optimal natural resource use in an intergenerational model with the risk of a catastrophe. Each generation maximizes a weighted sum of discounted utility (positive) and the probability that a catastrophe will occur at any point in the future (negative). The model generates time inconsistency as generations disagree on the relative weights on utility and catastrophe prevention. As a consequence, future generations emit too much from the current generation's perspective and a dynamic game ensues. I consider a sequence of models. When the environmental problem is related to a scarce exhaustible resource, early generations have an incentive to reduce emissions in Markov equilibrium in order to enhance the ecosystem's resilience to future emissions. When the pollutant is expected to become obsolete in the near future, early generations may however increase their emissions if this reduces future emissions. When polluting inputs are abundant and expected to remain essential, the catastrophe becomes a self-fulfilling prophecy and the degree of concern for catastrophe prevention has limited or even no effect on equilibrium behaviour.
An application of Mean Escape Time and metapopulation on forestry catastrophe insurance
Li, Jiangcheng; Zhang, Chunmin; Liu, Jifa; Li, Zhen; Yang, Xuan
2018-04-01
A forestry catastrophe insurance model for forestry pest infestations and disease epidemics is developed by employing metapopulation dynamics and the statistical properties of the Mean Escape Time (MET). The probability of an outbreak of forestry catastrophe loss and the catastrophe loss payment time given by the MET are respectively investigated. Forestry loss data from China are used for model simulation. The experimental results are as follows: (1) the model's analytical results are shown to fit the data well; (2) given a large patch area and patch structure, a high system factor, a low extinction rate, strong multiplicative noise, and additive noise with cross-correlation strength in a high range, an outbreak of forestry catastrophe loss, or a catastrophe loss payment due to forestry pest infestations and disease epidemics, could occur; (3) an optimal catastrophe loss payment time (MET) due to forestry pest infestations and disease epidemics can be identified by taking proper values of the multiplicative noise, limiting the additive noise to a low range of values, and keeping the cross-correlation strength in a high range.
Pricing Zero-Coupon Catastrophe Bonds Using EVT with Doubly Stochastic Poisson Arrivals
Directory of Open Access Journals (Sweden)
Zonggang Ma
2017-01-01
Full Text Available The frequency and severity of abnormal climate change display an irregular upward cycle as global warming intensifies. Therefore, this paper employs a doubly stochastic Poisson process with Black-Derman-Toy (BDT) intensity to describe the catastrophic characteristics. Using the Property Claim Services (PCS) loss index data from 2001 to 2010 provided by the US Insurance Services Office (ISO), the empirical result reveals that the BDT arrival rate process is superior to the nonhomogeneous Poisson and lognormal intensity processes due to its smaller RMSE, MAE, MRPE, and U and larger E and d. Secondly, to depict the extreme features of catastrophic risks, this paper adopts the Peak Over Threshold (POT) method from extreme value theory (EVT) to characterize the tail of the catastrophic loss distribution. The loss distribution is then analyzed and assessed using a quantile-quantile (QQ) plot to visually check whether the PCS index observations meet the generalized Pareto distribution (GPD) assumption. Furthermore, this paper derives a pricing formula for zero-coupon catastrophe bonds in a stochastic interest rate environment with aggregate losses generated by a compound doubly stochastic Poisson process under the forward measure. Finally, simulation results verify the pricing model's predictions and show how catastrophic risks and interest rate risk affect the prices of zero-coupon catastrophe bonds.
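The pricing mechanism can be sketched by Monte Carlo. Relative to the paper, the sketch below substitutes a homogeneous Poisson arrival rate for the doubly stochastic BDT intensity and a flat interest rate for the stochastic one; the GPD, trigger and recovery parameters are illustrative:

```python
import math
import random

def gpd_sample(rng, xi, beta):
    """Draw one generalized Pareto excess by inverse-transform sampling."""
    u = rng.random()
    return beta * ((1.0 - u) ** (-xi) - 1.0) / xi

def poisson_draw(rng, mean):
    """Knuth's multiplication method for a Poisson variate."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def cat_bond_price(face=100.0, r=0.03, T=1.0, lam=2.0, xi=0.3, beta=5.0,
                   trigger=30.0, recovery=0.4, n_paths=20000, seed=7):
    """Monte Carlo price of a zero-coupon catastrophe bond.

    Losses arrive as a homogeneous Poisson process with rate `lam` and
    GPD-distributed severities; the bond repays full principal if the
    aggregate loss stays below the trigger, else the recovery fraction.
    """
    rng = random.Random(seed)
    disc = math.exp(-r * T)
    total = 0.0
    for _ in range(n_paths):
        n_events = poisson_draw(rng, lam * T)
        agg_loss = sum(gpd_sample(rng, xi, beta) for _ in range(n_events))
        payoff = face if agg_loss < trigger else recovery * face
        total += disc * payoff
    return total / n_paths

price = cat_bond_price()
```

Raising the trigger lowers the default probability and so raises the price, the qualitative effect the paper's simulations quantify under the full model.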
Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks.
Directory of Open Access Journals (Sweden)
Roby Velez
Full Text Available A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While the prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, thereby leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing, neuromodulatory chemicals within an ANN that can modulate (i.e. up or down regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging, but important problem of catastrophic forgetting.
Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks
Clune, Jeff
2017-01-01
A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While the prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, thereby leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing, neuromodulatory chemicals within an ANN that can modulate (i.e. up or down regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging, but important problem of catastrophic forgetting. PMID:29145413
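The forgetting phenomenon itself can be demonstrated far below the scale of an ANN. The sketch below is not the paper's diffusion-based neuromodulation; it is a deliberately minimal one-parameter model trained sequentially on two conflicting tasks, showing the first task's error collapsing after the second is learned:

```python
def train(w, data, lr=0.1, epochs=200):
    """Plain SGD on squared error for a one-parameter model y = w * x."""
    for _ in range(epochs):
        for x, y in data:
            w -= lr * (w * x - y) * x
    return w

def mse(w, data):
    """Mean squared error of the model on a dataset."""
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

# Task A demands y = 2x; task B demands y = -x on the same inputs,
# so the single shared weight cannot satisfy both at once.
task_a = [(x / 10.0, 2 * x / 10.0) for x in range(1, 11)]
task_b = [(x / 10.0, -x / 10.0) for x in range(1, 11)]

w = train(0.0, task_a)          # learn task A first
err_a_before = mse(w, task_a)   # essentially zero
w = train(w, task_b)            # then learn task B...
err_a_after = mse(w, task_a)    # ...and performance on task A collapses
```

Techniques like modularity or neuromodulation aim to break exactly this interference by letting different parameters specialize per task instead of being overwritten.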
Zeeman catastrophe machines as a toolkit for teaching chaos
International Nuclear Information System (INIS)
Nagy, Péter; Tasnádi, Péter
2014-01-01
The investigation of chaotic motions and cooperative systems offers a magnificent opportunity to involve modern physics in the basic course of mechanics taught to engineering students. In this paper, it will be demonstrated that the Zeeman machine can be a versatile and motivating tool for students to acquire introductory knowledge about chaotic motion via interactive simulations. The Zeeman catastrophe machine is a typical example of a quasi-static system with hysteresis. It works in a relatively simple way and its properties can be understood very easily. Since the machine can be built easily and the simulation of its movement is also simple, the experimental investigation and the theoretical description can be connected intuitively. Although the Zeeman machine is known mainly for its quasi-static and catastrophic behaviour, its dynamic properties are also of interest with its typical chaotic features. By means of a periodically driven Zeeman machine, a wide range of chaotic properties of the simple systems can be demonstrated, such as bifurcation diagrams, chaotic attractors, transient chaos, Lyapunov exponents and so on. This paper is organically linked to our website (http://csodafizika.hu/zeeman) where the discussed simulation programs can be downloaded. In a second paper, the novel construction of a network of Zeeman machines will be presented to study the properties of cooperative systems. (paper)
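The quasi-static hysteresis described above can be reproduced with the cusp-catastrophe normal form rather than the machine's exact string-and-elastic geometry (an assumption made here for brevity): relax the state into a local minimum of V(x) = x^4/4 + a*x^2/2 + b*x with a < 0, then sweep the control parameter b and watch the jump occur at different b on the upward and downward sweeps:

```python
def settle(x, a, b, lr=0.01, steps=2000):
    """Relax to a local minimum of V(x) = x^4/4 + a*x^2/2 + b*x by descent."""
    for _ in range(steps):
        x -= lr * (x ** 3 + a * x + b)
    return x

def sweep(b_values, x0, a=-1.0):
    """Quasi-static sweep of the control parameter b, tracking the state."""
    xs, x = [], x0
    for b in b_values:
        x = settle(x, a, b)  # continuation: start from the previous state
        xs.append(x)
    return xs

bs = [i / 100.0 for i in range(-80, 81)]     # b from -0.8 to 0.8
up = sweep(bs, x0=1.0)                       # start on the x > 0 branch
down = sweep(list(reversed(bs)), x0=-1.0)    # start on the x < 0 branch
i0 = bs.index(0.0)
```

At b = 0 the two sweeps sit in different minima (hysteresis), and for a = -1 the jumps occur near the fold points b = ±2/(3*sqrt(3)) ≈ ±0.385, which is the catastrophic behaviour the Zeeman machine makes tangible.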
On the governance of global and catastrophic risks
DEFF Research Database (Denmark)
Faber, Michael Havbro
2011-01-01
The focus of the present paper is the identification and treatment of critical issues in the process of societal decision making concerning the management of global and catastrophic risks. Building on recent works by the author, the paper in particular addresses: 1) Which are the most relevant hazards in a holistic global perspective, and how may these be categorised in view of strategies for their treatment? 2) How might robust societal decisions on risk management subject to large uncertainties be formally supported? 3) How may available economic resources be prioritised for the purpose of sustainable and global life safety and health improvements? Finally, new results and perspectives are presented on the issue of allocation of resources for the purpose of improving global public health, and a discussion on global risk governance concludes the paper.
Lindley, Dennis V
2013-01-01
Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.
[Bioethics in catastrophe situations such as earthquakes].
León, C Francisco Javier
2012-01-01
A catastrophe of the magnitude of the earthquake and tsunami that hit Chile not long ago forces us to raise some questions that we will try to answer from philosophical, ethical and responsibility viewpoints. An analysis of the basic principles of bioethics is also justified. A natural catastrophe is not, by itself, moral or immoral, fair or unfair. However, its consequences could certainly be regarded as such, depending on whether they could have been prevented or mitigated. We will identify those individuals who have the ethical responsibility to attend to the victims, and the ethical principles that must guide the tasks of healthcare and psychological support teams. The minimal indispensable actions to obtain adequate social and legal protection of vulnerable people must be defined according to international guidelines. These reflections are intended to improve the responsibility of the State and the whole community, to efficiently prevent and repair the material and psychological consequences of such a catastrophe.
The catastrophic antiphospholipid syndrome in children.
Go, Ellen J L; O'Neil, Kathleen M
2017-09-01
To review the difficult syndrome of catastrophic antiphospholipid syndrome, emphasizing new developments in the diagnosis, pathogenesis and treatment. Few recent publications directly address pediatric catastrophic antiphospholipid syndrome (CAPS). Most articles are case reports or are data from adult and pediatric registries. The major factors contributing to most pediatric catastrophic antiphospholipid syndrome include infection and the presence of antiphospholipid antibodies, but complement activation also is important in creating diffuse thrombosis in the microcirculation. Treatment of the acute emergency requires anticoagulation, suppression of the hyperinflammatory state and elimination of the triggering infection. Inhibition of complement activation appears to improve outcome in limited studies, and suppression of antiphospholipid antibody formation may be important in long-term management. CAPS, an antibody-mediated diffuse thrombotic disease of microvasculature, is rare in childhood but has high mortality (33-50%). It requires prompt recognition and aggressive multimodality treatment, including anticoagulation, anti-inflammatory therapy and elimination of inciting infection and pathogenic autoantibodies.
Valuing Catastrophe Bonds Involving Credit Risks
Directory of Open Access Journals (Sweden)
Jian Liu
2014-01-01
Full Text Available Catastrophe bonds are the most important products in the catastrophe risk securitization market. Because of their operating mechanism, CAT bonds may carry credit risk, so in this paper we consider the influence of credit risk on CAT bond pricing, which distinguishes this work from the existing literature. We employ the Jarrow and Turnbull method to model the credit risk and derive a general pricing formula using Extreme Value Theory. Furthermore, we present an empirical pricing study based on Property Claim Services data, where the parameters of the loss distribution are estimated by maximum likelihood and the default probabilities are deduced from US financial market data. Finally, we obtain the catastrophe bond value by the Monte Carlo method.
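The pricing mechanics described above can be illustrated with a toy Monte Carlo model. The compound-Poisson loss process, Pareto severity (a stand-in for an EVT-fitted tail), trigger threshold, default probability, and recovery rate below are all hypothetical placeholders, not the paper's calibrated model:

```python
import numpy as np

def price_cat_bond(face=100.0, coupon=0.06, r=0.03, T=3,
                   lam=0.5, threshold=120.0, p_default=0.01,
                   recovery=0.4, n_paths=20_000, seed=0):
    """Toy Monte Carlo price of a CAT bond with credit risk.

    Aggregate catastrophe losses accumulate as a compound Poisson
    process (Poisson event counts, Pareto severities); the principal
    is wiped out if cumulative losses cross `threshold`. An issuer
    credit event (probability `p_default` per year) pays `recovery`
    of face. All parameter values are illustrative, not calibrated.
    """
    rng = np.random.default_rng(seed)
    disc = np.exp(-r * np.arange(1, T + 1))   # yearly discount factors
    pv = np.zeros(n_paths)
    for i in range(n_paths):
        cum_loss, value, alive = 0.0, 0.0, True
        for t in range(T):
            n_events = rng.poisson(lam)
            cum_loss += rng.pareto(2.5, n_events).sum() * 20.0
            if rng.random() < p_default:      # issuer defaults: recovery, stop
                value += recovery * face * disc[t]
                alive = False
                break
            if cum_loss > threshold:          # catastrophe trigger: total loss
                alive = False
                break
            value += coupon * face * disc[t]  # survive the year: coupon paid
        if alive:
            value += face * disc[-1]          # principal repaid at maturity
        pv[i] = value
    return pv.mean()

print(price_cat_bond(n_paths=2000))
```

With the trigger effectively disabled and no default risk, every path collapses to the deterministic discounted cash flows, which makes a convenient sanity check on the simulation.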
Orthogonality catastrophe and fractional exclusion statistics
Ares, Filiberto; Gupta, Kumar S.; de Queiroz, Amilcar R.
2018-02-01
We show that the N-particle Sutherland model with inverse-square and harmonic interactions exhibits orthogonality catastrophe. For a fixed value of the harmonic coupling, the overlap of the N-body ground state wave functions with two different values of the inverse-square interaction term goes to zero in the thermodynamic limit. When the two values of the inverse-square coupling differ by an infinitesimal amount, the wave function overlap shows an exponential suppression. This is qualitatively different from the usual power-law suppression observed in Anderson's orthogonality catastrophe. We also obtain an analytic expression for the wave function overlaps for an arbitrary set of couplings, whose properties are analyzed numerically. The quasiparticles constituting the ground state wave functions of the Sutherland model are known to obey fractional exclusion statistics. Our analysis indicates that the orthogonality catastrophe may be valid in systems with more general kinds of statistics than just the fermionic type.
Catastrophic Disruption Threshold and Maximum Deflection from Kinetic Impact
Cheng, A. F.
2017-12-01
The use of a kinetic impactor to deflect an asteroid on a collision course with Earth was described in the NASA Near-Earth Object Survey and Deflection Analysis of Alternatives (2007) as the most mature approach for asteroid deflection and mitigation. The NASA DART mission will demonstrate asteroid deflection by kinetic impact at the Potentially Hazardous Asteroid 65803 Didymos in October 2022. The kinetic impactor approach is considered applicable with warning times of 10 years or more and hazardous asteroid diameters of 400 m or less. In principle, a larger kinetic impactor bringing greater kinetic energy could cause a larger deflection, but input of excessive kinetic energy will cause catastrophic disruption of the target, leaving possibly large fragments still on a collision course with Earth. Thus the catastrophic disruption threshold limits the maximum deflection from a kinetic impactor. An often-cited rule of thumb states that the maximum deflection is 0.1 times the escape velocity before the target is disrupted. It turns out this rule of thumb does not work well. A comparison to numerical simulation results shows that a similar rule applies in the gravity limit, for large targets of more than 300 m, where the maximum deflection is roughly the escape velocity at momentum enhancement factor β = 2. In the gravity limit, the rule of thumb corresponds to pure momentum coupling (μ = 1/3), but simulations find a slightly different scaling, μ = 0.43. In the smaller target size range to which kinetic impactors would apply, the catastrophic disruption limit is strength-controlled. Unless the target is unusually strong, a DART-like impactor will not disrupt any asteroid down to sizes significantly smaller than the 50 m below which a hazardous object would not penetrate the atmosphere in any case.
Measurement uncertainty and probability
Willink, Robin
2013-01-01
A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.
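The Monte Carlo principle the book describes can be sketched in a few lines: draw the input quantities from their assumed distributions, push them through the measurement model, and read a 95 percent interval off the output quantiles. The voltage/current measurement model and its uncertainties below are invented for illustration:

```python
import numpy as np

# Monte Carlo evaluation of measurement uncertainty: propagate the input
# distributions through the measurement model and take a 95% coverage
# interval from the output quantiles.
rng = np.random.default_rng(42)
n = 200_000

# hypothetical measurement model: resistance R = V / I
V = rng.normal(10.0, 0.02, n)    # volts, standard uncertainty 0.02 V
I = rng.normal(2.0, 0.005, n)    # amperes, standard uncertainty 0.005 A
R = V / I

lo, hi = np.percentile(R, [2.5, 97.5])
print(f"R = {R.mean():.4f} ohm, 95% interval [{lo:.4f}, {hi:.4f}]")
```

Because the relative input uncertainties are small, the interval here is nearly symmetric; with strongly nonlinear models or skewed inputs the Monte Carlo interval can differ visibly from the classical propagation-of-uncertainty result.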
Timber price dynamics following a natural catastrophe
Jeffrey P. Prestemon; Thomas P. Holmes
2000-01-01
Catastrophic shocks to existing stocks of a renewable resource can cause long-run price shifts. With timber, these long-run price shifts may be accompanied by a short-run price drop due to salvage. Hurricane Hugo damaged 20 percent of southern pine timber in the South Carolina Coastal Plain in 1989. To estimate the...
Catastrophizing and Causal Beliefs in Whiplash
Buitenhuis, J.; de Jong, P. J.; Jaspers, J. P. C.; Groothoff, J. W.
2008-01-01
Study Design. Prospective cohort study. Objective. This study investigates the role of pain catastrophizing and causal beliefs with regard to severity and persistence of neck complaints after motor vehicle accidents. Summary of Background Data. In previous research on low back pain, somatoform
Catastrophic antiphospholipid syndrome in leprosy | Chewoolkar ...
African Journals Online (AJOL)
Catastrophic antiphospholipid syndrome is an acute and life-threatening variant of antiphospholipid syndrome with a high mortality rate. Many infections are known to be accompanied by the thrombotic manifestations of this syndrome. We came across a patient with leprosy who developed bowel ischaemia secondary to ...
Catastrophic antiphospholipid syndrome: task force report summary.
Cervera, R; Rodríguez-Pintó, I
2014-10-01
The Task Force on Catastrophic Antiphospholipid Syndrome (CAPS) aimed to assess the current knowledge on pathogenesis, clinical and laboratory features, diagnosis and classification, precipitating factors and treatment of CAPS. This article summarizes the main aspects of its final report.
Catastrophic risk : Social influences on insurance decisions
Krawczyk, Michal; Trautmann, Stefan; van de Kuilen, Gijs
We study behavioral patterns of insurance demand for low-probability large-loss events (catastrophic losses). Individual patterns of belief formation and risk attitude that were suggested in the behavioral decisions literature emerge robustly in the current set of insurance choices. However, social
Koch, Michael
Measurement uncertainty is one of the key issues in quality assurance. It has become increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also supports the decision of whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial; several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
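One common way to take uncertainty into account in compliance assessment is a simple interval rule: expand the standard uncertainty by a coverage factor and compare the whole interval to the specification limit. A minimal sketch, in which the decision labels and the numbers are illustrative rather than taken from the chapter:

```python
def compliance_decision(result, u, limit, k=2.0):
    """Simple compliance assessment against an upper specification limit.

    Expands the standard uncertainty u by coverage factor k (k = 2 gives
    roughly 95% coverage) and compares the resulting interval to the
    limit. An illustrative decision rule, not a normative procedure.
    """
    U = k * u                            # expanded uncertainty
    if result + U <= limit:
        return "compliant"               # whole interval below the limit
    if result - U > limit:
        return "non-compliant"           # whole interval above the limit
    return "inconclusive"                # limit falls inside the interval

print(compliance_decision(48.0, 0.5, 50.0))   # compliant
print(compliance_decision(51.5, 0.5, 50.0))   # non-compliant
print(compliance_decision(49.5, 0.5, 50.0))   # inconclusive
```

The "inconclusive" band is where policies diverge in practice: some schemes use guard bands to force a shared-risk decision, others report the result with its uncertainty and leave the decision to the customer.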
Lafontaine, J.; Hay, L.
2015-12-01
The United States Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the conterminous United States (CONUS). More than 1,700 gaged watersheds across the CONUS were modeled to test the feasibility of improving streamflow simulations in gaged and ungaged watersheds by linking statistically- and physically-based hydrologic models with remotely-sensed data products (i.e. - snow water equivalent) and estimates of uncertainty. Initially, the physically-based models were calibrated to measured streamflow data to provide a baseline for comparison. As many stream reaches in the CONUS are either not gaged, or are substantially impacted by water use or flow regulation, ancillary information must be used to determine reasonable parameter estimations for streamflow simulations. In addition, not all ancillary datasets are appropriate for application to all parts of the CONUS (e.g. - snow water equivalent in the southeastern U.S., where snow is a rarity). As it is not expected that any one data product or model simulation will be sufficient for representing hydrologic behavior across the entire CONUS, a systematic evaluation of which data products improve simulations of streamflow for various regions across the CONUS was performed. The resulting portfolio of calibration strategies can be used to guide selection of an appropriate combination of simulated and measured information for model development and calibration at a given location of interest. In addition, these calibration strategies have been developed to be flexible so that new data products or simulated information can be assimilated. This analysis provides a foundation to understand how well models work when streamflow data is either not available or is limited and could be used to further inform hydrologic model parameter development for ungaged areas.
Energy Technology Data Exchange (ETDEWEB)
Garcia J, T.; Cardenas V, J., E-mail: tonatiuh.garcia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)
2015-09-15
A methodology was implemented for uncertainty analysis in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The chosen methodology is a probabilistic method that propagates the uncertainty of the input parameters to the output parameters. It therefore begins with the selection of input parameters that are considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, generating a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, using ordered statistics and the Wilks formula, it was determined that the minimum number of runs required to obtain uncertainty bands covering 95% of the population at a 95% confidence level is 93; it is important to note that in this method the number of runs does not depend on the number of selected input parameters. Routines implemented in Fortran 90 automated the uncertainty analysis of transients for the RELAP/SCDAP code. For the MELCOR severe accident code, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated the closure transient of the main steam isolation valves with the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second, a station blackout (SBO) accident was simulated with the MELCOR code, obtaining the uncertainty band for the
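The run counts quoted in these abstracts follow from the first-order Wilks formula and can be checked directly: 59 runs suffice for a one-sided 95/95 tolerance limit and 93 for a two-sided band, independent of the number of uncertain input parameters.

```python
def wilks_runs(coverage=0.95, confidence=0.95, two_sided=True):
    """Minimum number of code runs for first-order Wilks tolerance limits.

    One-sided:  1 - g^n >= confidence
    Two-sided:  1 - g^n - n*(1-g)*g^(n-1) >= confidence,  with g = coverage.
    """
    g = coverage
    n = 1
    while True:
        if two_sided:
            ok = 1 - g**n - n * (1 - g) * g**(n - 1) >= confidence
        else:
            ok = 1 - g**n >= confidence
        if ok:
            return n
        n += 1

print(wilks_runs(two_sided=False))  # one-sided 95/95 limit
print(wilks_runs(two_sided=True))   # two-sided 95/95 band
```

This is why the MSLB analyses below use 59 full-core simulations (a one-sided MDNBR limit) while the RELAP/SCDAP and MELCOR analyses use 93 (a two-sided band).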
Catastrophic risks and insurance in farm-level decision making
Ogurtsov, V.
2008-01-01
Keywords: risk perception, risk attitude, catastrophic risk, insurance, farm characteristics, farmer personal characteristics, utility-efficient programming, arable farming, dairy farming
Catastrophic risks can cause severe cash flow problems for farmers or even result in their
Akçay, A.E.; Biller, B.
2014-01-01
We consider an assemble-to-order production system where the product demands and the time since the last customer arrival are not independent. The simulation of this system requires a multivariate input model that generates random input vectors with correlated discrete and continuous components. In
Axial and focal-plane diffraction catastrophe integrals
International Nuclear Information System (INIS)
Berry, M V; Howls, C J
2010-01-01
Exact expressions in terms of Bessel functions are found for some of the diffraction catastrophe integrals that decorate caustics in optics and mechanics. These are the axial and focal-plane sections of the elliptic and hyperbolic umbilic diffraction catastrophes, and symmetric elliptic and hyperbolic unfoldings of the X_9 diffraction catastrophes. These representations reveal unexpected relations between the integrals.
Energy Technology Data Exchange (ETDEWEB)
Brown, C.S., E-mail: csbrown3@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Raleigh, NC 27695-7909 (United States); Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3870 (United States); Kucukboyaci, V., E-mail: kucukbvn@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States); Sung, Y., E-mail: sungy@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)
2016-12-01
Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core
Energy Technology Data Exchange (ETDEWEB)
Haeggstaahl, Daniel [Maelardalen Univ., Vaesteraas (Sweden); Dotzauer, Erik [AB Fortum, Stockholm (Sweden)
2004-12-01
temperature will affect the production of heat and power is performed. The conclusion is that the local energy company will benefit from using more sophisticated planning tools. The main sources of uncertainty in production planning are: weather, power price, fuel quality and availability of production units. Methodologies that handle uncertainties are discussed. The solution may be to use stochastic optimization or to do a scenario analysis. Simulator-based production planning seems very promising, because it is easy to maintain and change the process model in the graphical user interface. However, the prototype model presented has to be further developed to become a practical tool that is easy to use on a daily basis in the control rooms.
International Nuclear Information System (INIS)
Brown, C.S.; Zhang, H.; Kucukboyaci, V.; Sung, Y.
2016-01-01
Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core
Maleki, Mohammad; Emery, Xavier
2017-12-01
In mineral resources evaluation, the joint simulation of a quantitative variable, such as a metal grade, and a categorical variable, such as a rock type, is challenging when one wants to reproduce spatial trends of the rock type domains, a feature that makes a stationarity assumption questionable. To address this problem, this work presents methodological and practical proposals for jointly simulating a grade and a rock type, when the former is represented by the transform of a stationary Gaussian random field and the latter is obtained by truncating an intrinsic random field of order k with Gaussian generalized increments. The proposals concern both the inference of the model parameters and the construction of realizations conditioned to existing data. The main difficulty is the identification of the spatial correlation structure, for which a semi-automated algorithm is designed, based on a least squares fitting of the data-to-data indicator covariances and grade-indicator cross-covariances. The proposed models and algorithms are applied to jointly simulate the copper grade and the rock type in a Chilean porphyry copper deposit. The results show their ability to reproduce the gradual transitions of the grade when crossing a rock type boundary, as well as the spatial zonation of the rock type.
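A heavily simplified sketch of the truncation idea described above: one Gaussian field is thresholded to produce rock-type categories, while a correlated Gaussian is transformed into a grade. Spatial covariance, the intrinsic random field of order k, and conditioning to data are all omitted; the correlation value, threshold, and lognormal transform are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
rho = 0.6                          # assumed correlation between the fields

# two correlated standard Gaussian fields (spatial covariance omitted for
# brevity -- a real plurigaussian simulation would generate them on a grid
# with e.g. turning bands or sequential Gaussian simulation)
z_rock = rng.standard_normal(n)
z_grade = rho * z_rock + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# truncation rule: one threshold splits the domain into two rock types
rock = np.where(z_rock < 0.0, "type_A", "type_B")

# anamorphosis stand-in: lognormal transform of the Gaussian to a grade
grade = np.exp(0.2 + 0.5 * z_grade)   # illustrative Cu grade, percent

# because the fields are positively correlated, grades are higher on
# average in type_B, mimicking a grade/rock-type dependence
print(grade[rock == "type_B"].mean(), grade[rock == "type_A"].mean())
```

The positive cross-correlation is what produces gradual grade transitions across the rock-type boundary, the behaviour the paper's full model is designed to reproduce with proper spatial structure.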
Modeling the uncertainty of several VOC and its impact on simulated VOC and ozone in Houston, Texas
Pan, Shuai; Choi, Yunsoo; Roy, Anirban; Li, Xiangshang; Jeon, Wonbae; Souri, Amir Hossein
2015-11-01
A WRF-SMOKE-CMAQ modeling system was used to study Volatile Organic Compound (VOC) emissions and their impact on surface VOC and ozone concentrations in southeast Texas during September 2013. The model was evaluated against the ground-level Automated Gas Chromatograph (Auto-GC) measurement data from the Texas Commission on Environmental Quality (TCEQ). The comparisons indicated that the model over-predicted benzene, ethylene, toluene and xylene, while under-predicting isoprene and ethane. The mean biases between simulated and observed values of each VOC species showed clear daytime, nighttime, weekday and weekend variations. Adjusting the VOC emissions using simulated/observed ratios improved model performance of each VOC species, especially mitigating the mean bias substantially. Simulated monthly mean ozone showed a minor change: a 0.4 ppb or 1.2% increase; while a change of more than 5 ppb was seen in hourly ozone data on high ozone days, this change moved model predictions closer to observations. The CMAQ model run with the adjusted emissions better reproduced the variability in the National Aeronautics and Space Administration (NASA)'s Ozone Monitoring Instrument (OMI) formaldehyde (HCHO) columns. The adjusted model scenario also slightly better reproduced the aircraft HCHO concentrations from NASA's DISCOVER-AQ campaign conducted during the simulation episode period; Correlation, Mean Bias and RMSE improved from 0.34, 1.38 ppb and 2.15 ppb to 0.38, 1.33 ppb and 2.08 ppb respectively. A process analysis conducted for both industrial/urban and rural areas suggested that chemistry was the main process contributing to ozone production in both areas, while the impact of chemistry was smaller in rural areas than in industrial and urban areas. For both areas, the positive chemistry contribution increased in the sensitivity simulation largely due to the increase in emissions. Nudging VOC emissions to match the observed concentrations shifted the ozone hotspots
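The evaluation statistics and the simulated/observed adjustment ratio described above can be computed straightforwardly; the hourly concentration values below are invented for illustration.

```python
import numpy as np

def eval_and_scale(sim, obs):
    """Compare simulated vs observed concentrations and derive a simple
    emission scaling factor (observed/simulated mean ratio), the kind of
    adjustment described in the abstract. Illustrative only."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    mb = (sim - obs).mean()                      # mean bias
    rmse = np.sqrt(((sim - obs) ** 2).mean())    # root-mean-square error
    r = np.corrcoef(sim, obs)[0, 1]              # correlation coefficient
    scale = obs.mean() / sim.mean()              # emission adjustment ratio
    return mb, rmse, r, scale

# hypothetical hourly benzene concentrations (ppb)
sim = [1.2, 0.9, 1.5, 2.0, 1.1]
obs = [0.8, 0.7, 1.0, 1.4, 0.9]
mb, rmse, r, scale = eval_and_scale(sim, obs)
print(f"MB={mb:.2f} ppb, RMSE={rmse:.2f} ppb, r={r:.2f}, scale={scale:.2f}")
```

A positive mean bias with a scale factor below 1, as in this toy case, corresponds to the over-predicted species (benzene, ethylene, toluene, xylene) whose emissions were scaled down; under-predicted species such as isoprene would yield a scale above 1.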
Nonlinear physics: Catastrophe, chaos and complexity
International Nuclear Information System (INIS)
Arecchi, F.T.
1992-01-01
Currently in the world of physics, there is open debate on the role of the three C's - catastrophe, chaos and complexity. Seen as new ideas or paradigms, incapable of being harmonized within the realm of traditional physics, these terms seem to be creating turmoil in the classical physics establishment whose foundations date back to the early seventeenth century. This paper first defines catastrophe, chaos and complexity and shows how these terms are all connected to nonlinear dynamics and how they have long since been present within scientific treatises. It also shows the relationship of the three C's with the concept of organization, inappropriately called self-organization, and with the recognition and decisional strategies of cognitive systems. Relevant to natural science, the development of these considerations necessitates a re-examination of the role and capabilities of human knowledge and a return to interdisciplinary scientific-philosophical debate.
Diagnosis and management of catastrophic antiphospholipid syndrome.
Carmi, Or; Berla, Maya; Shoenfeld, Yehuda; Levy, Yair
2017-04-01
Catastrophic antiphospholipid syndrome (CAPS) is a rare, life-threatening disease. In 1992, Asherson defined it as a widespread coagulopathy related to the antiphospholipid antibodies (aPL). CAPS requires rapid diagnosis and prompt initiation of treatment. Areas covered: This paper discusses all aspects of CAPS, including its pathophysiology, clinical manifestations, diagnostic approaches, differential diagnoses, management and treatment of relapsing CAPS, and its prognosis. To obtain the information used in this review, scientific databases were searched using the key words antiphospholipid antibodies, catastrophic antiphospholipid syndrome, hemolytic anemia, lupus anticoagulant, and thrombotic microangiopathic hemolytic anemia. Expert commentary: CAPS is a rare variant of the antiphospholipid syndrome (APS). It is characterized by thrombosis in multiple organs and a cytokine storm developing over a short period, with histopathologic evidence of multiple microthromboses, and laboratory confirmation of high aPL titers. This review discusses the diagnostic challenges and current approaches to the treatment of CAPS.
Tackling The Global Challenge: Humanitarian Catastrophes
Directory of Open Access Journals (Sweden)
Kenneth V. Iserson
2014-03-01
Full Text Available “Humanitarian catastrophes,” conflicts and calamities generating both widespread human suffering and destructive events, require a wide range of emergency resources. This paper answers a number of questions that humanitarian catastrophes generate: Why and how do the most-developed countries—those with the resources, capabilities, and willingness to help—intervene in specific types of disasters? What ethical and legal guidelines shape our interventions? How well do we achieve our goals? It then suggests a number of changes to improve humanitarian responses, including better NGO-government cooperation, increased research on the best disaster response methods, clarification of the criteria and roles for humanitarian (military) interventions, and development of post-2015 Millennium Development Goals with more accurate progress measures. [West J Emerg Med. 2014;15(2):231–240.]
Intestinal malrotation and catastrophic volvulus in infancy.
Lee, Henry Chong; Pickard, Sarah S; Sridhar, Sunita; Dutta, Sanjeev
2012-07-01
Intestinal malrotation in the newborn is usually diagnosed after signs of intestinal obstruction, such as bilious emesis, and corrected with the Ladd procedure. The objective of this report is to describe the presentation of severe cases of midgut volvulus presenting in infancy, and to discuss the characteristics of these cases. We performed a 7-year review at our institution and present two cases of catastrophic midgut volvulus presenting in the post-neonatal period, ending in death soon after the onset of symptoms. These two patients also had significant laboratory abnormalities compared to patients with more typical presentations resulting in favorable outcomes. Although most cases of intestinal malrotation in infancy can be treated successfully, in some circumstances, patients' symptoms may not be detected early enough for effective treatment, and therefore may result in catastrophic midgut volvulus and death.
Dafonte, C.; Fustes, D.; Manteiga, M.; Garabato, D.; Álvarez, M. A.; Ulla, A.; Allende Prieto, C.
2016-10-01
Aims: We present an innovative artificial neural network (ANN) architecture, called Generative ANN (GANN), that computes the forward model; that is, it learns the function that relates the unknown outputs (stellar atmospheric parameters, in this case) to the given inputs (spectra). Such a model can be integrated in a Bayesian framework to estimate the posterior distribution of the outputs. Methods: The architecture of the GANN follows the same scheme as a normal ANN, but with the inputs and outputs inverted. We train the network with the set of atmospheric parameters (Teff, log g, [Fe/H] and [α/Fe]), obtaining the stellar spectra for such inputs. The residuals between the spectra in the grid and the estimated spectra are minimized using a validation dataset to keep solutions as general as possible. Results: The performance of both conventional ANNs and GANNs to estimate the stellar parameters as a function of the star brightness is presented and compared for different Galactic populations. GANNs provide significantly improved parameterizations for early and intermediate spectral types with rich and intermediate metallicities. The behaviour of both algorithms is very similar for our sample of late-type stars, obtaining residuals in the derivation of [Fe/H] and [α/Fe] below 0.1 dex for stars with Gaia magnitude Grvs satellite. Conclusions: Uncertainty estimation of computed astrophysical parameters is crucial for the validation of the parameterization itself and for the subsequent exploitation by the astronomical community. GANNs produce not only the parameters for a given spectrum, but a goodness-of-fit between the observed spectrum and the predicted one for a given set of parameters. Moreover, they allow us to obtain the full posterior distribution over the astrophysical parameter space once a noise model is assumed. This can be used for novelty detection and quality assessment.
A critical look at catastrophe risk assessments
Kent, A
2004-01-01
Recent papers by Busza et al. (BJSW) and Dar et al. (DDH) argue that astrophysical data can be used to establish bounds on the risk of a catastrophe in forthcoming collider experiments. The safety case set out by BJSW does not rely on these bounds, but on theoretical arguments, which BJSW find sufficiently compelling. However, DDH and other commentators (initially including BJSW) have suggested that the astrophysical bounds alone do give sufficient reassurance. This seems unsupportable when the bounds are expressed in terms of expected cost. For example, DDH's main bound, $p_{\rm catastrophe} < 2 \times 10^{-8}$, implies only that the expectation value of the number of deaths is bounded by 120. We thus reappraise the DDH and BJSW risk bounds by comparison with risk policy in other areas. We find that requiring a catastrophe risk of no higher than 10^{-15} is necessary for consistency with established policy for risk optimisation from radiation hazards, even if highly risk-tolerant assumptions are made. A respec...
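The expected-cost arithmetic behind the 120-deaths figure is a one-line check, assuming a world population of roughly 6 billion at the time of writing:

```python
# DDH's bound on the catastrophe probability, multiplied by the world
# population (approximately 6e9 when the paper was written), gives the
# bound on the expected number of deaths that the paper criticises as
# insufficient reassurance.
p_catastrophe = 2e-8
world_population = 6e9          # approximate figure; an assumption here
expected_deaths_bound = p_catastrophe * world_population
print(expected_deaths_bound)
```

The paper's point is that a bound stated only in expectation terms can still permit an expected cost (here, of order a hundred lives) far above what radiation-protection policy would tolerate.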
Catastrophic antiphospholipid syndrome and pregnancy. Clinical report.
Khizroeva, J; Bitsadze, V; Makatsariya, A
2018-01-08
We have observed the development of catastrophic antiphospholipid syndrome (CAPS) in a pregnant woman hospitalized at 28 weeks of gestation with severe preeclampsia. On the same day, an eclampsia attack developed, and an emergency surgical delivery was performed. On the third day, multiorgan failure developed. Examination showed persistent circulation of lupus anticoagulant and high levels of antibodies to cardiolipin, β2-glycoprotein I, and prothrombin. The initial diagnosis of severe preeclampsia masked a catastrophic antiphospholipid syndrome, exacerbated by the coincident presence of several types of antiphospholipid antibodies. The first pregnancy had resulted in a premature birth at 25 weeks, possibly also due to the circulation of antiphospholipid antibodies. The triggers of the catastrophic form were the pregnancy itself, surgical intervention, and hyperhomocysteinemia. CAPS is the most severe form of antiphospholipid syndrome, manifesting as multiple microthromboses in the microcirculation of vital organs and the development of multiorgan failure against the background of a high level of antiphospholipid antibodies. CAPS is characterized by renal, cerebral, gastrointestinal, adrenal, ovarian, skin, and other forms of microthrombosis. Thrombosis recurrence is typical. Thrombotic microvasculopathy lies at the heart of multiorgan failure and manifests clinically in central nervous system lesions, adrenal insufficiency, and ARDS development. CAPS is a life-threatening condition and therefore requires urgent treatment. An optimal treatment for CAPS has not yet been developed; CAPS represents a multidisciplinary medical problem.
Liu, Baoding
2015-01-01
When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...
Plósz, Benedek Gy; De Clercq, Jeriffa; Nopens, Ingmar; Benedetti, Lorenzo; Vanrolleghem, Peter A
2011-01-01
In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge on characterising particulate organics in wastewater and on bacteria growth is well-established, whereas 1-D SST models and their impact on biomass concentration predictions are still poorly understood. A rigorous assessment of two 1-D SST models is thus presented: one based on hyperbolic (the widely used Takács-model) and one based on parabolic (the more recently presented Plósz-model) partial differential equations. The former model, using numerical approximation to yield realistic behaviour, is currently the most widely used by wastewater treatment process modellers. The latter is a convection-dispersion model that is solved in a numerically sound way. First, the explicit dispersion in the convection-dispersion model and the numerical dispersion for both SST models are calculated. Second, simulation results of effluent suspended solids concentration (XTSS,Eff), sludge recirculation stream (XTSS,RAS) and sludge blanket height (SBH) are used to demonstrate the distinct behaviour of the models. A thorough scenario analysis is carried out using SST feed flow rate, solids concentration, and overflow rate as degrees of freedom, spanning a broad loading spectrum. A comparison between the measurements and the simulation results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer
Catastrophe theory and its application status in mechanical engineering
Directory of Open Access Journals (Sweden)
Jinge LIU
Full Text Available Catastrophe theory is a mathematical method for modelling and interpreting discontinuous phenomena. Since its emergence, it has been widely used to explain a variety of emergent phenomena in the natural sciences, social sciences, management science, and other fields of science and technology. This paper first introduces catastrophe theory in several respects, such as its origin, underlying principles, basic characteristics, and development. It then summarizes the main applications of catastrophe theory in mechanical engineering, focusing on research progress in revealing catastrophic changes of rotor vibration state, analyzing friction and wear failure, and predicting metal fracture. Finally, it suggests that further development of catastrophe theory should pay more attention to combining it with other traditional nonlinear theories and methods. This paper provides a useful reference to guide the application of catastrophe theory in mechanical engineering and related fields.
International Nuclear Information System (INIS)
Chauliac, Christian; Bestion, Dominique; Crouzet, Nicolas; Aragones, Jose-Maria; Cacuci, Dan Gabriel; Weiss, Frank-Peter; Zimmermann, Martin A.
2010-01-01
The NURESIM project, the numerical simulation platform, is developed in the frame of the NURISP European Collaborative Project (FP7), which includes 22 organizations from 14 European countries. NURESIM intends to be a reference platform providing high quality software tools, physical models, generic functions and assessment results. The NURESIM platform provides an accurate representation of the physical phenomena by promoting and incorporating the latest advances in core physics, two-phase thermal-hydraulics and fuel modelling. It includes multi-scale and multi-physics features, especially for coupling core physics and thermal-hydraulics models for reactor safety. Easy coupling of the different codes and solvers is provided through the use of a common data structure and generic functions (e.g., for interpolation between non-conforming meshes). More generally, the platform includes generic pre-processing, post-processing and supervision functions through the open-source SALOME software, in order to make the codes more user-friendly. The platform also provides the informatics environment for testing and comparing different codes. The contribution summarizes the achievements and ongoing developments of the simulation platform in core physics, thermal-hydraulics, multi-physics, uncertainties and code integration
Nevison, C. D.; Saikawa, E.; Dlugokencky, E. J.; Andrews, A. E.; Sweeney, C.
2014-12-01
Atmospheric N2O concentrations have increased from 275 ppb in the preindustrial to about 325 ppb in recent years, a ~20% increase with important implications for both anthropogenic greenhouse forcing and stratospheric ozone recovery. This increase has been driven largely by synthetic fertilizer production and other perturbations to the global nitrogen cycle associated with human agriculture. Several recent regional atmospheric inversion studies have quantified North American agricultural N2O emissions using top-down constraints based on atmospheric N2O data from the National Oceanic and Atmospheric Administration (NOAA) Global Greenhouse Gas Reference Network, including surface, aircraft and tall tower platforms. These studies have concluded that global N2O inventories such as EDGAR may be underestimating the true U.S. anthropogenic N2O source by a factor of 3 or more. However, simple back-of-the-envelope calculations show that emissions of this magnitude are difficult to reconcile with the basic constraints of the global N2O budget. Here, we explore some possible reasons why regional atmospheric inversions might overestimate the U.S. agricultural N2O source. First, the seasonality of N2O agricultural sources is not well known, but can have an important influence on inversion results, particularly when the inversions are based on data that are concentrated in the spring/summer growing season. Second, boundary conditions can strongly influence regional inversions but the boundary conditions used may not adequately account for remote influences on surface data such as the seasonal stratospheric influx of N2O-depleted air. We will present a set of forward model simulations, using the Community Land Model (CLM) and two atmospheric chemistry tracer transport models, MOZART and the Whole Atmosphere Community Climate Model (WACCM), that examine the influence of terrestrial emissions and atmospheric chemistry and dynamics on atmospheric variability in N2O at U.S. and
A unified approach of catastrophic events
Directory of Open Access Journals (Sweden)
S. Nikolopoulos
2004-01-01
Full Text Available Although there is an accumulated body of theoretical, computational, and numerical work, such as catastrophe theory, bifurcation theory, and stochastic and deterministic chaos theory, there is a strong sense that these approaches do not completely capture the physics of real catastrophic events. Recent studies have suggested that a large variety of complex processes, including earthquakes, heartbeats, and neuronal dynamics, exhibit statistical similarities. Here we study, in terms of complexity and nonlinear techniques, whether isomorphic signatures emerge that indicate the transition from the normal state to both geological and biological shocks. In the last 15 years, the study of complex systems has emerged as a recognized field in its own right, although a good definition of what a complex system is has remained elusive. A basic reason for our interest in complexity is the striking similarity in behaviour close to irreversible phase transitions among systems that are otherwise quite different in nature. It is by now recognized that pre-seismic electromagnetic time series contain valuable information about the earthquake preparation process, which cannot be extracted without considerable computational power, probably in connection with computer algebra techniques. This paper presents an analysis whose aim is to indicate the approach of global instability in the pre-focal area. Nonlinear characteristics are studied by applying two techniques, namely correlation dimension estimation and approximate entropy. These two nonlinear techniques lead to coherent conclusions, and could be combined with an independent fractal spectral analysis to provide detection of the emergence of the nucleation phase of the impending catastrophic event. Within a similar mathematical framework, it would be interesting to extend this description of pre-seismic electromagnetic anomalies in order to cover biological
Timber Price Dynamics Following a Natural Catastrophe
Jeffrey P. Prestemon; Thomas P. Holmes
2000-01-01
Catastrophic shocks to existing stocks of a renewable resource can cause long-run price shifts. With timber, these long-run price shifts may be accompanied by a short-run price drop due to salvage. Hurricane Hugo damaged 20% of southern pine timber in the South Carolina Coastal Plain in 1989. To estimate the short- and long-run effects of the hurricane on the prices of timber stocks, we estimated an intervention model of the residuals of cointegration of South Carolina sawtimber and pulpwood ...
Astrophysics: is a doomsday catastrophe likely?
Tegmark, Max; Bostrom, Nick
2005-12-08
The risk of a doomsday scenario in which high-energy physics experiments trigger the destruction of the Earth has been estimated to be minuscule. But this may give a false sense of security: the fact that the Earth has survived for so long does not necessarily mean that such disasters are unlikely, because observers are, by definition, in places that have avoided destruction. Here we derive a new upper bound of one per billion years (99.9% confidence level) for the exogenous terminal-catastrophe rate that is free of such selection bias, using calculations based on the relatively late formation time of Earth.
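The observer-selection effect at the heart of this argument can be illustrated with a toy Monte Carlo (not the paper's actual method, which exploits planet formation times): even a high catastrophe rate is compatible with every observer seeing a long, catastrophe-free history, because observers only arise on planets that happened to survive. The rate and evolution-time values below are assumptions chosen for illustration.

```python
import random

random.seed(1)

# Toy sketch of anthropic selection bias. Planets suffer an exogenous
# catastrophe at an exponentially distributed time; observers appear only
# if the planet survives long enough for life to evolve.
RATE = 1.0            # catastrophes per Gyr (deliberately high; assumption)
EVOLUTION_TIME = 4.5  # Gyr required before observers appear (assumption)

def survives(rate: float, needed: float) -> bool:
    """True if the catastrophe time exceeds the time needed for observers."""
    return random.expovariate(rate) > needed

trials = 100_000
survivors = sum(survives(RATE, EVOLUTION_TIME) for _ in range(trials))
frac = survivors / trials
print(f"fraction of planets with observers: {frac:.4f}")
# Every observer nonetheless sees >= 4.5 Gyr of survival, so conditioned on
# existing, that history alone cannot rule out RATE = 1 per Gyr.
```

The surviving fraction is tiny (about e⁻⁴·⁵ ≈ 1%), yet all observers live on those planets, which is why a bias-free bound requires extra information such as Earth's relatively late formation time.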
Astrophysics: Is a doomsday catastrophe likely?
Tegmark, Max; Bostrom, Nick
2005-12-01
The risk of a doomsday scenario in which high-energy physics experiments trigger the destruction of the Earth has been estimated to be minuscule. But this may give a false sense of security: the fact that the Earth has survived for so long does not necessarily mean that such disasters are unlikely, because observers are, by definition, in places that have avoided destruction. Here we derive a new upper bound of one per billion years (99.9% confidence level) for the exogenous terminal-catastrophe rate that is free of such selection bias, using calculations based on the relatively late formation time of Earth.
Adjoint-Based Uncertainty Quantification with MCNP
Energy Technology Data Exchange (ETDEWEB)
Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)
2011-09-01
This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
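The propagation step behind adjoint-based uncertainty quantification is the "sandwich rule": once the sensitivities S of a response to the nuclear data are known, the response variance is SᵀCS for a data covariance matrix C. A minimal sketch, with sensitivity and covariance values invented for illustration (not taken from the LIFE blanket study):

```python
import numpy as np

# Sandwich-rule uncertainty propagation. S holds relative sensitivities of a
# response (e.g. k-eff) to three nuclear-data parameters; C is the parameter
# covariance, here diagonal with 2%, 1%, and 3% relative uncertainties.
# All numbers are illustrative assumptions.
S = np.array([0.8, -0.3, 0.1])
C = np.diag([0.02, 0.01, 0.03]) ** 2

var = S @ C @ S          # sandwich rule: var = S^T C S
rel_unc = np.sqrt(var)
print(f"relative response uncertainty: {rel_unc:.4%}")
```

Note that the dominant sensitivity (0.8) controls the result: the response uncertainty is close to 0.8 times the 2% data uncertainty, which is the kind of "small (< 2%)" figure the abstract reports.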
Duerdoth, Ian
2009-01-01
The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…
DEFF Research Database (Denmark)
Heydorn, Kaj; Anglov, Thomas
2002-01-01
Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...
Multiple Sclerosis and Catastrophic Health Expenditure in Iran.
Juyani, Yaser; Hamedi, Dorsa; Hosseini Jebeli, Seyede Sedighe; Qasham, Maryam
2016-09-01
There are many disabling medical conditions that can result in catastrophic health expenditure. Multiple sclerosis is one of the most costly medical conditions worldwide, exposing families to catastrophic health expenditure. This study investigates to what extent multiple sclerosis patients face catastrophic costs. The study was carried out in Ahvaz, Iran (2014). The study population included households in which at least one member suffers from MS. To analyze the data, a logit regression model was employed using STATA 12. 3.37% of families faced catastrophic costs. Important variables including drug brand, housing, income, and health insurance were significantly correlated with catastrophic expenditure. This study suggests that although a small proportion of MS patients met the catastrophic health expenditure threshold, mechanisms that pool risk and cost (e.g. health insurance) are required to protect them and improve financial and access equity in health care.
Inside money, procyclical leverage, and banking catastrophes.
Brummitt, Charles D; Sethi, Rajiv; Watts, Duncan J
2014-01-01
We explore a model of the interaction between banks and outside investors in which the ability of banks to issue inside money (short-term liabilities believed to be convertible into currency at par) can generate a collapse in asset prices and widespread bank insolvency. The banks and investors share a common belief about the future value of certain long-term assets, but they have different objective functions; changes to this common belief result in portfolio adjustments and trade. Positive belief shocks induce banks to buy risky assets from investors, and the banks finance those purchases by issuing new short-term liabilities. Negative belief shocks induce banks to sell assets in order to reduce their chance of insolvency to a tolerably low level, and they supply more assets at lower prices, which can result in multiple market-clearing prices. A sufficiently severe negative shock causes the set of equilibrium prices to contract (in a manner given by a cusp catastrophe), causing prices to plummet discontinuously and banks to become insolvent. Successive positive and negative shocks of equal magnitude do not cancel; rather, a banking catastrophe can occur even if beliefs simply return to their initial state. Capital requirements can prevent crises by curtailing the expansion of balance sheets when beliefs become more optimistic, but they can also force larger price declines. Emergency asset price supports can be understood as attempts by a central bank to coordinate expectations on an equilibrium with solvency.
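The cusp-catastrophe mechanism invoked here can be illustrated with the canonical cusp equilibrium condition x³ + ax + b = 0 (a generic sketch of the mathematics, not the paper's banking model): inside the cusp region there are three equilibria, and a sufficiently large shock to the control parameter b leaves only one, so the system state jumps discontinuously.

```python
import numpy as np

# Equilibria of the canonical cusp potential V(x) = x^4/4 + a*x^2/2 + b*x
# satisfy x^3 + a*x + b = 0. For a < 0 there is a band of b values with
# three real roots; leaving that band collapses the set to one root.
def real_equilibria(a: float, b: float):
    roots = np.roots([1.0, 0.0, a, b])
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9)

print(len(real_equilibria(-3.0, 0.0)))   # inside the cusp: three equilibria
print(len(real_equilibria(-3.0, 5.0)))   # severe shock to b: only one left
```

This is the abstract's "set of equilibrium prices contracts" in miniature: as b crosses the fold, the upper equilibrium vanishes and the state must drop to the remaining one, and reversing the shock does not immediately restore the old equilibrium.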
Inside money, procyclical leverage, and banking catastrophes.
Directory of Open Access Journals (Sweden)
Charles D Brummitt
Full Text Available We explore a model of the interaction between banks and outside investors in which the ability of banks to issue inside money (short-term liabilities believed to be convertible into currency at par) can generate a collapse in asset prices and widespread bank insolvency. The banks and investors share a common belief about the future value of certain long-term assets, but they have different objective functions; changes to this common belief result in portfolio adjustments and trade. Positive belief shocks induce banks to buy risky assets from investors, and the banks finance those purchases by issuing new short-term liabilities. Negative belief shocks induce banks to sell assets in order to reduce their chance of insolvency to a tolerably low level, and they supply more assets at lower prices, which can result in multiple market-clearing prices. A sufficiently severe negative shock causes the set of equilibrium prices to contract (in a manner given by a cusp catastrophe), causing prices to plummet discontinuously and banks to become insolvent. Successive positive and negative shocks of equal magnitude do not cancel; rather, a banking catastrophe can occur even if beliefs simply return to their initial state. Capital requirements can prevent crises by curtailing the expansion of balance sheets when beliefs become more optimistic, but they can also force larger price declines. Emergency asset price supports can be understood as attempts by a central bank to coordinate expectations on an equilibrium with solvency.
Death, Catastrophe, and the Significance of Tragedy
Directory of Open Access Journals (Sweden)
Jennifer Ballengee
2014-05-01
Full Text Available This NANO note will examine the tension between representation, memorial, and the catastrophe of death that emerges in the space of tragedy, as the problem arises in two quite different works: Oedipus at Colonus, a fairly typical fifth-century Greek tragedy, and Falling Man, Don DeLillo’s novel that, in its attempt to address the events of 9/11, reflects in form and subject matter many of Aristotle’s terms of tragic representation. It is not the intent of this note to engage with the recent proliferation of work in “performance theory.” Rather than being concerned with an imagined exchange between audience and actor, this study examines how the supplementary relationship of gesture and speech in tragedy disrupts the public/private distinction, and how this articulation effects and enables the public memorialization of death. Thus, this paper will consider the representation of death as an event whose catastrophic, and somewhat mysterious, collision of the public and the private lends it its tragic significance.
Modeling workplace bullying using catastrophe theory.
Escartin, J; Ceja, L; Navarro, J; Zapf, D
2013-10-01
Workplace bullying is defined as negative behaviors directed at organizational members or their work context that occur regularly and repeatedly over a period of time. Employees' perceptions of psychosocial safety climate, workplace bullying victimization, and workplace bullying perpetration were assessed within a sample of nearly 5,000 workers. Linear and nonlinear approaches were applied in order to model both continuous and sudden changes in workplace bullying. More specifically, the present study examines whether a nonlinear dynamical systems model (i.e., a cusp catastrophe model) is superior to the linear combination of variables for predicting the effect of psychosocial safety climate and workplace bullying victimization on workplace bullying perpetration. According to the AICc and BIC indices, the linear regression model fits the data better than the cusp catastrophe model. The study concludes that some phenomena, especially unhealthy behaviors at work (like workplace bullying), may be better studied using linear approaches as opposed to nonlinear dynamical systems models. This can be explained through the healthy variability hypothesis, which argues that positive organizational behavior is likely to present nonlinear behavior, while a decrease in such variability may indicate the occurrence of negative behaviors at work.
Uncertainty quantification in resonance absorption
International Nuclear Information System (INIS)
Williams, M.M.R.
2012-01-01
We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances of ²³²Th. Simulation, quadrature, and polynomial chaos methods are used, and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
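The simulation route described here, sampling beta-distributed resonance parameters and expressing the resulting spread in reactivity units, can be sketched with a deliberately simplified toy: a single effective resonance integral perturbed by a beta-distributed relative width. Every number below is invented for illustration and is not the paper's ²³²Th data.

```python
import math
import random

random.seed(0)

# Toy Monte Carlo: escape probability p = exp(-I_eff), where the effective
# resonance integral I_eff carries a beta-distributed relative uncertainty.
# Nominal integral and the +/-10% perturbation range are assumptions.
def sample_escape() -> float:
    nominal_I = 0.12
    rel = 0.9 + 0.2 * random.betavariate(5, 5)   # beta-distributed, mean ~1.0
    return math.exp(-nominal_I * rel)

samples = [sample_escape() for _ in range(50_000)]
mean_p = sum(samples) / len(samples)
std_p = (sum((s - mean_p) ** 2 for s in samples) / len(samples)) ** 0.5
pcm = 1e5 * std_p / mean_p                        # relative spread in pcm
print(f"mean p = {mean_p:.4f}, spread = {pcm:.0f} pcm")
```

With real data, one such beta distribution per resonance width (42 parameters for 21 resonances) feeds the same pipeline, and a polynomial chaos expansion replaces brute-force sampling when the simulation is expensive.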
Indirect Catastrophic Injuries in Olympic Styles of Wrestling in Iran
Kordi, Ramin; Ziaee, Vahid; Rostami, Mohsen; Wallace, W. Angus
2011-01-01
Background: Data on indirect catastrophic injuries in wrestling are scarce. Objectives: To develop a profile of indirect catastrophic injuries in international styles of wrestling and to describe possible risk factors. Study Design: Retrospective case series; Level of evidence, 3. Methods: Indirect catastrophic injuries that occurred in wrestling clubs in Iran from July 1998 to June 2005 were identified by contacting several sources. The cases were retrospectively reviewed. Results: The injur...
Coronal Flux Rope Catastrophe Associated With Internal Energy Release
Zhuang, Bin; Hu, Youqiu; Wang, Yuming; Zhang, Quanhao; Liu, Rui; Gou, Tingyu; Shen, Chenglong
2018-04-01
Previous studies of flux rope catastrophes have focused predominantly on magnetic energy, since it is believed to be the main energy supplier for solar eruptions; however, the contribution of other types of energy during the catastrophe cannot be neglected. This paper studies the catastrophe of a coronal flux rope system in the solar wind background, with emphasis on the transformation of different types of energy during the catastrophe. The coronal flux rope is characterized by its axial and poloidal magnetic fluxes and total mass. It is shown that a catastrophe can be triggered not only by an increase but also by a decrease of the axial magnetic flux. Moreover, the internal energy of the rope is found to be released during the catastrophe, providing energy for the upward eruption of the flux rope. The magnetic energy, by contrast, provides only part of the energy release, or even increases during the catastrophe, so the internal energy may act as the dominant or even the sole energy supplier during the catastrophe.
GDP-to-GTP exchange on the microtubule end can contribute to the frequency of catastrophe.
Piedra, Felipe-Andrés; Kim, Tae; Garza, Emily S; Geyer, Elisabeth A; Burns, Alexander; Ye, Xuecheng; Rice, Luke M
2016-11-07
Microtubules are dynamic polymers of αβ-tubulin that have essential roles in chromosome segregation and organization of the cytoplasm. Catastrophe-the switch from growing to shrinking-occurs when a microtubule loses its stabilizing GTP cap. Recent evidence indicates that the nucleotide on the microtubule end controls how tightly an incoming subunit will be bound (trans-acting GTP), but most current models do not incorporate this information. We implemented trans-acting GTP into a computational model for microtubule dynamics. In simulations, growing microtubules often exposed terminal GDP-bound subunits without undergoing catastrophe. Transient GDP exposure on the growing plus end slowed elongation by reducing the number of favorable binding sites on the microtubule end. Slower elongation led to erosion of the GTP cap and an increase in the frequency of catastrophe. Allowing GDP-to-GTP exchange on terminal subunits in simulations mitigated these effects. Using mutant αβ-tubulin or modified GTP, we showed experimentally that a more readily exchangeable nucleotide led to less frequent catastrophe. Current models for microtubule dynamics do not account for GDP-to-GTP exchange on the growing microtubule end, so our findings provide a new way of thinking about the molecular events that initiate catastrophe.
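The qualitative mechanism, where GDP exposure at the tip slows growth, erodes the cap, and raises catastrophe frequency, while GDP-to-GTP exchange rescues growth, can be caricatured in a few lines. This is a drastically simplified random-walk sketch under invented probabilities, not the paper's biochemical model.

```python
import random

random.seed(2)

# Toy cap model: `cap` is the GTP-cap size. Each step the terminal subunit
# may be GDP-bound; GDP exposure lowers the addition probability, so the cap
# erodes more often, and cap == 0 counts as a catastrophe. Allowing
# GDP->GTP exchange on the terminal subunit should reduce catastrophes.
# All probabilities and sizes are illustrative assumptions.
def catastrophes(exchange_prob: float, steps: int = 100_000) -> int:
    cap, events = 5, 0
    for _ in range(steps):
        gdp_exposed = random.random() < 0.30          # trans-acting GDP tip
        if gdp_exposed and random.random() < exchange_prob:
            gdp_exposed = False                        # nucleotide exchanged
        add_prob = 0.45 if gdp_exposed else 0.60       # slower onto GDP
        if random.random() < add_prob:
            cap = min(cap + 1, 12)                     # grow; cap size bounded
        else:
            cap -= 1                                   # hydrolysis erodes cap
        if cap == 0:
            events += 1                                # catastrophe
            cap = 5                                    # rescue: regrow a cap
    return events

no_exchange = catastrophes(0.0)
with_exchange = catastrophes(0.8)
print(no_exchange, with_exchange)   # exchange yields fewer catastrophes
```

Even this crude sketch reproduces the headline direction of the result: making the terminal nucleotide readily exchangeable lowers the catastrophe count.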
Li, Yue; Yang, Hui; Wang, Tao; MacBean, Natasha; Bacour, Cédric; Ciais, Philippe; Zhang, Yiping; Zhou, Guangsheng; Piao, Shilong
2017-08-01
Reducing parameter uncertainty of process-based terrestrial ecosystem models (TEMs) is one of the primary targets for accurately estimating carbon budgets and predicting ecosystem responses to climate change. However, parameters in TEMs are rarely constrained by observations from Chinese forest ecosystems, which are an important carbon sink over northern hemispheric land. In this study, eddy covariance data from six forest sites in China are used to optimize parameters of the ORganizing Carbon and Hydrology In Dynamics EcosystEms TEM. The model-data assimilation through parameter optimization largely reduces the prior model errors and improves the simulated seasonal cycle and summer diurnal cycle of net ecosystem exchange, latent heat fluxes, gross primary production, and ecosystem respiration. Climate change experiments based on the optimized model indicate that forest net primary production (NPP) is suppressed in response to warming in southern China but stimulated in northeastern China. Altered precipitation has an asymmetric impact on forest NPP at sites in water-limited regions, with the optimization-induced reduction in the response of NPP to precipitation decline being as large as 61% at a deciduous broadleaf forest site. We find that seasonal optimization alters forest carbon cycle responses to environmental change, with the parameter optimization consistently reducing the simulated positive response of heterotrophic respiration to warming. Evaluations from independent observations suggest that improving model structure still matters most for long-term carbon stock and its changes, in particular, nutrient- and age-related changes of photosynthetic rates, carbon allocation, and tree mortality.
DEFF Research Database (Denmark)
Nguyen, Daniel Xuyen
This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...
Madame Bovary and Catastrophism: Revolving narratives
Directory of Open Access Journals (Sweden)
Ruth Morris
2011-07-01
Full Text Available This article connects Madame Bovary to the French scientific context of the 1850s, reading Flaubert's novel in the light of Cuvier's theories. The French naturalist Georges Cuvier, along with many of his contemporaries, explained the origins of the world through the theory of catastrophes. According to this theory, the world is divided into very short periods punctuated by great catastrophes or, in Cuvier's terms, "revolutions", which eradicated all life and allowed the world to be entirely repopulated. Such a conception affects the very idea of "time". Cuvier believed the formation of the Earth to be relatively recent, the present epoch being only five thousand years old. This temporal compression can be related to Madame Bovary, whose tempo quickens as the novel approaches its denouement. In catastrophe theory as in the novel, time does not follow a chronological line. The "revolutions" break the continuous thread of time, and Emma is often unable to distinguish between past, present, and future. The "revolutions" also punctuate and disrupt the course of life on Earth by producing major events in the history of the globe. The same is true of Emma's life: her existence is marked by major events, such as the ball, which create a shattering and fragmentation of temporality, as in Cuvier's theory. I also argue for a link between the suddenness and violence of the "revolutions" and Emma's nervous crises, which come on abruptly and belong to hysteria. Finally, Cuvier's conception of temporality must be considered in relation to theories of evolution, which entails re-evaluating the notions of adaptation, heredity, and death in Flaubert's novel.
Uncertainty, Stress and Decision Simulation
National Research Council Canada - National Science Library
Warwick, Walter
2000-01-01
.... Rather than continue in the tradition of rational choice theories and rule-based expert systems, we took a novel approach to this research and began work on a model of Recognition Primed Decision making (RPD...
Oxidized zirconium on ceramic; Catastrophic coupling.
Ozden, V E; Saglam, N; Dikmen, G; Tozun, I R
2017-02-01
Oxidized zirconium (Oxinium™; Smith & Nephew, Memphis, TN, USA) articulated with polyethylene in total hip arthroplasty (THA) appeared to have the potential to reduce wear dramatically. Thermal oxidation transforms the surface of the zirconium metal into a hard, ceramic-like layer that is resistant to abrasion. Exposure of the soft zirconium metal beneath this hard surface after damage to an oxidized zirconium femoral head has been described, occurring after joint dislocation or after in situ disengagement of the polyethylene liner. We report three cases of misuse of Oxinium™ heads. These three cases resulted in catastrophic in situ wear and inevitable failure, although there was no advice, indication, or recommendation for this use from the manufacturer.
Metal Dusting: Catastrophic Corrosion by Carbon
Young, David J.; Zhang, Jianqiang
2012-12-01
Reducing gases rich in carbon-bearing species such as CO can be supersaturated with respect to graphite at intermediate temperatures of about 400-700°C. Engineering alloys such as low-alloy and stainless steels, and heat-resisting iron-, nickel-, and cobalt-base alloys catalyze gas processes that release the carbon. An understanding of how the resulting carbon deposition can destroy alloys at a catastrophically rapid rate has been the objective of a great deal of research. The current review of recent work on metal dusting covers the mass transfer—principally carbon diffusion—and graphite nucleation processes involved. A clear distinction emerges between ferritic alloys, which form cementite and precipitate graphite within that carbide, and austenitics that nucleate graphite directly within the metal. The latter process is facilitated by the strong orientation relationship between the graphite and face-centered cubic (fcc) lattices. Strategies for the control of dusting are briefly outlined.
Application of catastrophe theory to nuclear structure
International Nuclear Information System (INIS)
Scharff-Goldhaber, G.; Dresden, M.
1979-01-01
Three two-parameter models, one describing an A-body system (the atomic nucleus) and two describing many-body systems (the van der Waals gas and the ferroelectric (perovskite) system) are compared within the framework of catastrophe theory. It is shown that each has a critical point (second-order phase transition) when the two counteracting forces controlling it are in balance; further, each undergoes a first-order phase transition when one of the forces vanishes (the deforming force for the nucleus, the attractive force for the van der Waals gas, and the dielectric constant for the perovskite). Finally, when both parameters are kept constant, a kind of phase transition may occur at a critical angular momentum, critical pressure, and critical electric field. 3 figures, 1 table
Spiral arms, comets and terrestrial catastrophism
International Nuclear Information System (INIS)
Clube, S.V.M.; Napier, W.M.
1982-01-01
A review is presented of a hypothesis of terrestrial catastrophism in which comets grow in molecular clouds and are captured by the Sun as it passes through the spiral arms of the Galaxy. Assuming that comets are a major supplier of the Earth-crossing (Apollo) asteroid population, the latter fluctuates correspondingly and leads to episodes of terrestrial bombardment. Changes in the rotational momentum of core and mantle, generated by impacts, lead to episodes of magnetic field reversal and tectonic activity, while surface phenomena lead to ice ages and mass extinctions. An episodic geophysical history with an interstellar connection is thus implied. If comets in spiral arms are necessary intermediaries in the process of star formation, the theory also has implications for early solar system history and galactic chemistry. These aspects are briefly discussed with special reference to the nature of spiral arms. (author)
Uncertainty, joint uncertainty, and the quantum uncertainty principle
International Nuclear Information System (INIS)
Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad
2016-01-01
Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
Determinants of catastrophic health expenditure in iran.
Abolhallaje, M; Hasani, Sa; Bastani, P; Ramezanian, M; Kazemian, M
2013-01-01
This study provides a detailed specification of the variables and determinants of unpredictable health expenditure in Iran, and of the requirements for reducing the extensive effects of the factors that inappropriately inflate households' payments for health and other goods and services. It aims to identify measures of fair financing of health services and determinants of the fair financing contribution, regarding the required share of households that prevents catastrophic payments. To this end, the shares of households' expenditures on the main groups of goods and services, in urban and rural areas and by decile, were analyzed using data from household expenditure surveys. The growth of spending in nominal values over the years 2002-2008 was considerably high, and the rate for out-of-pocket payments was nearly the same as or greater than the rate for total health expenditure. In 2008, urban and rural households on average paid 6.4% and 6.35% of their total expenditure, respectively, on health services. Finally, three categories of determinants of unfair and catastrophic payments by households were recognized: households' socio-economic status, equality/inequality conditions of the distribution of financing risk, and economic aspects of the distribution of health expenditure. While extending the total share of government and prepayment sources of financing health services is considered the simplest policy for limiting out-of-pocket payments, the indicators and policies introduced in this study could also be considered important and useful for developing the health sector and easing access to health services, irrespective of health financing fairness.
Grasshopper Population Ecology: Catastrophe, Criticality, and Critique
Directory of Open Access Journals (Sweden)
Dale R. Lockwood
2008-06-01
Grasshopper population dynamics are an important part of the North American rangeland ecosystem and an important factor in the economies that derive from the rangeland. Outbreak dynamics have plagued management strategies in the rangeland, and attempts to find simple, linear and mechanistic solutions to both understanding and predicting the dynamics have proved fruitless. These efforts to ground theory in a correspondence with the "real" world, including whether the population dynamics are ultimately density dependent or density independent, have generated abundant heat but little light. We suggest that a pragmatic approach, in which theories are taken to be "tools" rather than competing claims of truth, has greater promise to move ecological research in a constructive direction. Two recent nonlinear approaches exploiting the tools of complexity science provide insights relevant to explaining and forecasting population dynamics. Observation and data collection were used to structure models derived from catastrophe theory and self-organized criticality (SOC). These models indicate that nonlinear processes are important in the dynamics of the outbreaks, and the conceptual structures of these approaches provide clear, albeit constrained or contingent, implications for pest managers. We show that, although these two frameworks, catastrophe theory and self-organized criticality, are very different, the frequency distributions of time series from both systems result in power-law relationships. Further, we show that a simple lattice-based model, similar to SOC but structured on the biology of the grasshoppers, gives a spatial time series similar to data over a 50-year span, and its frequency distribution is also a power-law relationship. This demonstration exemplifies how a "both-and" rather than an "either-or" approach to ecological modeling, in which the useful elements of particular theories or conceptual structures are extracted, may provide a way forward.
Purchase of Catastrophe Insurance by Dutch Dairy and Arable Farmers
Ogurtsov, V.; Asseldonk, van M.A.P.M.; Huirne, R.B.M.
2009-01-01
This article analyzed the impact of risk perception, risk attitude, and other farmer personal and farm characteristics on the actual purchase of catastrophe insurance by Dutch dairy and arable farmers. The specific catastrophe insurance types considered were hail–fire–storm insurance for buildings,
'Performative narrativity': Palestinian identity and the performance of catastrophe
Saloul, I.
2008-01-01
The day Israel annually celebrates as its "Day of Independence" Palestinians commemorate as their day of catastrophe (al-nakba). To most Palestinians, the catastrophic loss of Palestine in 1948 represents the climactic formative event of their lives. In the aftermath of this loss, the Palestinian
Catastrophe, Gender and Urban Experience, 1648–1920
DEFF Research Database (Denmark)
Employing a broad definition of catastrophe, this book examines how urban communities conceived, adapted to and were transformed by catastrophes. Competing views of gender figure in the telling and retelling of these tragedies, which are mediated by myth and memory. This is a nuanced account...
Energy Technology Data Exchange (ETDEWEB)
Mallet, V.
2005-12-15
The aim of this work is the evaluation of the quality of a chemistry-transport model, not by a classical comparison with observations, but by the estimation of its uncertainties due to the input data, to the model formulation and to the numerical approximations. These three sources of uncertainty are studied with Monte Carlo simulations, with multi-model simulations and with comparisons between numerical schemes, respectively. A high uncertainty is shown for ozone concentrations. To overcome the uncertainty-related limitations, one strategy is ensemble forecasting: by combining several models (up to 48) on the basis of past observations, forecasts can be significantly improved. This work was also the occasion to develop an innovative modeling system, named Polyphemus. (J.S.)
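The model-combination step described in this abstract can be illustrated with a toy sketch. All numbers, model biases, and the least-squares weighting below are illustrative assumptions, not taken from the cited work: weights are fitted over an ensemble of biased model forecasts using past observations, and the weighted combination is compared against the plain ensemble mean.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: 5 models forecast a pollutant concentration at
# 200 past times (training) and 50 future times (test).
truth_train = 40 + 10 * rng.standard_normal(200)
truth_test = 40 + 10 * rng.standard_normal(50)

def biased_forecast(truth):
    # Each model = truth + its own systematic bias + noise (illustrative).
    biases = np.array([6.0, -3.0, 2.0, 9.0, -4.0])
    return truth + biases[:, None] + 4 * rng.standard_normal((5, truth.size))

train = biased_forecast(truth_train)   # shape (5, 200)
test = biased_forecast(truth_test)     # shape (5, 50)

# Fit combination weights (plus an intercept) by least squares
# against past observations.
X = np.vstack([train, np.ones(truth_train.size)]).T
w, *_ = np.linalg.lstsq(X, truth_train, rcond=None)

combined = np.vstack([test, np.ones(truth_test.size)]).T @ w
naive = test.mean(axis=0)  # plain ensemble mean, for comparison

rmse = lambda f: float(np.sqrt(np.mean((f - truth_test) ** 2)))
print(rmse(naive), rmse(combined))  # weighted combination removes the bias
```

The fitted weights correct the systematic biases that a plain ensemble mean averages in, which is the sense in which combining models "on the basis of past observations" improves forecasts.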
Well below 2 °C: Mitigation strategies for avoiding dangerous to catastrophic climate changes
Xu, Yangyang; Ramanathan, Veerabhadran
2017-09-01
The historic Paris Agreement calls for limiting global temperature rise to "well below 2 °C." Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high-impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to reduce CO2 emissions, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100 to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend.
1.5 °C? Solutions for avoiding catastrophic climate change in this century
Xu, Y.
2017-12-01
The historic Paris Agreement calls for limiting global temperature rise to "well below 2 °C." Because of uncertainties in emission scenarios, climate, and carbon cycle feedback, we interpret the Paris Agreement in terms of three climate risk categories and bring in considerations of low-probability (5%) high-impact (LPHI) warming in addition to the central (˜50% probability) value. The current risk category of dangerous warming is extended to more categories, which are defined by us here as follows: >1.5 °C as dangerous; >3 °C as catastrophic; and >5 °C as unknown, implying beyond catastrophic, including existential threats. With unchecked emissions, the central warming can reach the dangerous level within three decades, with the LPHI warming becoming catastrophic by 2050. We outline a three-lever strategy to limit the central warming below the dangerous level and the LPHI below the catastrophic level, both in the near term (<2050) and in the long term (2100): the carbon neutral (CN) lever to reduce CO2 emissions, the super pollutant (SP) lever to mitigate short-lived climate pollutants, and the carbon extraction and sequestration (CES) lever to thin the atmospheric CO2 blanket. Pulling on both CN and SP levers and bending the emissions curve by 2020 can keep the central warming below dangerous levels. To limit the LPHI warming below dangerous levels, the CES lever must be pulled as well to extract as much as 1 trillion tons of CO2 before 2100 to both limit the preindustrial to 2100 cumulative net CO2 emissions to 2.2 trillion tons and bend the warming curve to a cooling trend. In addition to presenting the analysis above, I will also share (1) perspectives on developed- and developing-world actions and interactions on climate solutions; and (2) Prof. V. Ramanathan's interactions with the Pontifical Academy of Sciences and other religious groups, which are highly valuable to an interdisciplinary audience.
International Nuclear Information System (INIS)
Thomas, R.E.
1982-03-01
An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin hypercube sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax = b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
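The adjoint idea for a linear model Ax = b can be checked with a small numerical sketch (the matrix, response vector, and problem size are illustrative): for a scalar response y = cᵀx, a single solve of Aᵀλ = c yields every sensitivity dy/dbᵢ = λᵢ at once, where a brute-force approach would need one extra solve per parameter.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned system
b = rng.standard_normal(n)
c = rng.standard_normal(n)                        # response y = c . x

x = np.linalg.solve(A, b)
y = c @ x

# One adjoint solve gives all sensitivities dy/db_i simultaneously:
lam = np.linalg.solve(A.T, c)     # lambda = A^{-T} c, so dy/db = lambda

# Check against brute-force finite differences (n extra solves).
eps = 1e-7
fd = np.empty(n)
for i in range(n):
    bp = b.copy()
    bp[i] += eps
    fd[i] = (c @ np.linalg.solve(A, bp) - y) / eps

print(np.max(np.abs(fd - lam)))  # tiny: the two approaches agree
```

Because y is exactly linear in b, the finite differences match the adjoint sensitivities up to roundoff; for large systems the saving is one solve versus n solves.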
Nuclear catastrophe in Japan. Health consequences resulting from Fukushima
International Nuclear Information System (INIS)
Paulitz, Henrik; Eisenberg, Winfrid; Thiel, Reinhold
2013-01-01
external radiation exposure would amount to between 37,899 and 82,606 cases, while 37,266 cancer cases would result from the intake of contaminated food. With respect to the workers who, according to the Fukushima operating company Tepco, were on duty in the damaged plant in 2011, IPPNW estimates on the basis of Chernobyl experience that more than 17,000 of them will develop serious diseases. A few of the quantitative results of this study are subject to uncertainty, because some of the original data has only been published in imprecise form and certain calculations required further assumptions. Nevertheless, IPPNW has deemed it necessary to present this quantitative estimate in order to show clearly the true dimension of the Fukushima nuclear catastrophe. At present, numerous nuclear power plants worldwide are operating at sites facing a potential earthquake risk. Many of them are much less able to withstand the force of an earthquake than nuclear reactors in Japan. Even a relatively weak earthquake could, at any time, trigger another nuclear catastrophe almost anywhere: in Asia, America, and also in Europe.
Manipulation of pain catastrophizing: An experimental study of healthy participants
Directory of Open Access Journals (Sweden)
Joel E Bialosky
2008-11-01
Joel E Bialosky (1)*, Adam T Hirsh (2,3), Michael E Robinson (2,3), Steven Z George (1,3)*. (1) Department of Physical Therapy; (2) Department of Clinical and Health Psychology; (3) Center for Pain Research and Behavioral Health, University of Florida, Gainesville, Florida, USA. Abstract: Pain catastrophizing is associated with the pain experience; however, causation has not been established. Studies that specifically manipulate catastrophizing are necessary to establish causation. The present study enrolled 100 healthy individuals. Participants were randomly assigned to repeat a positive, neutral, or one of three catastrophizing statements during a cold pressor task (CPT). Outcome measures of pain tolerance and pain intensity were recorded. No change was noted in catastrophizing immediately following the CPT (F(1,84) = 0.10, p = 0.75, partial η² < 0.01) independent of group assignment (F(4,84) = 0.78, p = 0.54, partial η² = 0.04). Pain tolerance (F(4) = 0.67, p = 0.62, partial η² = 0.03) and pain intensity (F(4) = 0.73, p = 0.58, partial η² = 0.03) did not differ by group. This study suggests that catastrophizing may be difficult to manipulate through experimental pain procedures, and that repetition of specific catastrophizing statements was not sufficient to change levels of catastrophizing. Additionally, pain tolerance and pain intensity did not differ by group assignment. This study has implications for future studies attempting to experimentally manipulate pain catastrophizing. Keywords: pain, catastrophizing, experimental, cold pressor task, pain catastrophizing scale
Reliability analysis under epistemic uncertainty
International Nuclear Information System (INIS)
Nannapaneni, Saideep; Mahadevan, Sankaran
2016-01-01
This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
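The single-level treatment of aleatory and epistemic uncertainty described above can be illustrated with a minimal sketch. The distributions are purely illustrative (a load whose mean is itself epistemically uncertain, and a fixed capacity), not the paper's examples: both uncertainty types are sampled together in one Monte Carlo loop to estimate a failure probability.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Aleatory: load L ~ Normal(mu, 1).
# Epistemic: mu itself is uncertain, mu ~ Normal(10, 0.5)
# (e.g., estimated from sparse data).
mu = rng.normal(10.0, 0.5, N)      # one epistemic draw per sample
load = rng.normal(mu, 1.0)         # one aleatory draw per sample
capacity = 13.0                    # limit state: g = capacity - load

pf = float(np.mean(load > capacity))  # failure probability, both sources mixed
print(pf)
```

Drawing the epistemic parameter afresh for every sample, rather than nesting an inner aleatory loop inside an outer epistemic loop, is the "single-loop" efficiency gain the abstract refers to; here the mixed distribution is simply Normal(10, √1.25), so pf ≈ 0.004.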
Sorby, A.; Grossi, P.; Pomonis, A.; Williams, C.; Nyst, M.; Onur, T.; Seneviratna, P.; Baca, A.
2009-04-01
The management of catastrophe risk is concerned with the quantification of financial losses, and their associated probabilities, for potential future catastrophes that might impact a region. Modelling historical catastrophe events, and in particular the potential consequences if a similar event were to occur at the present day, can provide insight to help bridge the gap between what we know can happen from historical experience and what potential losses might be out there in the "universe" of potential catastrophes. The 1908 Messina earthquake (and accompanying local tsunami) was one of the most destructive earthquakes to have occurred in Europe and by most accounts remains Europe's most fatal, with over 70,000 casualties estimated. What would the consequences be, in terms of financial and human losses, if a similar earthquake were to occur at the present day? Exposures, building stock and populations change over time, and the consequences of a similar earthquake today may therefore differ considerably from those observed in 1908. The city of Messina has been reconstructed several times in its history, including after the 1908 earthquake and again after the Second World War. The 1908 earthquake prompted the introduction of the first seismic design regulations in Italy, and since 1909 parts of the Messina and Calabria regions have been in the zones of highest seismic coefficient. Utilizing commercial catastrophe loss modelling technology, which combines the modelling of hazard, vulnerability and financial losses on a database of property exposures, a modelled earthquake scenario of M7.2 in the Messina Straits region of Southern Italy is considered. This modelled earthquake is used to assess the potential consequences, in terms of financial losses, that an earthquake similar to the 1908 event might have if it were to occur at the present day. Loss results are discussed in the context of applications for the financial
Expected utility and catastrophic risk in a stochastic economy-climate model
Energy Technology Data Exchange (ETDEWEB)
Ikefuji, M. [Institute of Social and Economic Research, Osaka University, Osaka (Japan); Laeven, R.J.A.; Magnus, J.R. [Department of Econometrics and Operations Research, Tilburg University, Tilburg (Netherlands); Muris, C. [CentER, Tilburg University, Tilburg (Netherlands)
2010-11-15
In the context of extreme climate change, we ask how to conduct expected utility analysis in the presence of catastrophic risks. Economists typically model decision making under risk and uncertainty by expected utility with constant relative risk aversion (power utility); statisticians typically model economic catastrophes by probability distributions with heavy tails. Unfortunately, the expected utility framework is fragile with respect to heavy-tailed distributional assumptions. We specify a stochastic economy-climate model with power utility and explicitly demonstrate this fragility. We derive necessary and sufficient compatibility conditions on the utility function to avoid fragility and solve our stochastic economy-climate model for two examples of such compatible utility functions. We further develop and implement a procedure to learn the input parameters of our model and show that the model thus specified produces quite robust optimal policies. The numerical results indicate that higher levels of uncertainty (heavier tails) lead to less abatement and consumption, and to more investment, but this effect is not unlimited.
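The fragility the authors describe can be illustrated numerically. The CRRA coefficient and the Student-t shock below are illustrative choices, not the paper's model: with power utility, a heavy-tailed log-consumption distribution makes the sample-average utility explode (its true expectation does not exist), while a thin-tailed one stays stable.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = 3.0                               # CRRA coefficient (power utility)
u = lambda c: c ** (1 - gamma) / (1 - gamma)

n = 100_000
c_thin = np.exp(rng.standard_normal(n))          # lognormal consumption
c_heavy = np.exp(rng.standard_t(df=3, size=n))   # heavy-tailed log-consumption

# Sample averages of utility: finite and stable for the lognormal case
# (true value -e^2/2 ≈ -3.69), but dominated by a few extreme low-consumption
# draws in the heavy-tailed case, where E[u(c)] = -infinity.
eu_thin = float(np.mean(u(c_thin)))
eu_heavy = float(np.mean(u(c_heavy)))
print(eu_thin, eu_heavy)
```

The blow-up happens because power utility penalizes near-zero consumption like c^(1-γ), so any tail heavier than exponential in log-consumption makes expected utility diverge; this is exactly the incompatibility between CRRA utility and heavy-tailed catastrophe distributions that the compatibility conditions in the paper are designed to rule out.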
International Nuclear Information System (INIS)
Lo Bianco, A.S.; Oliveira, H.P.S.; Peixoto, J.G.P.
2009-01-01
To establish the primary standard of the quantity air kerma for X-rays between 10 and 50 keV, the National Metrology Laboratory of Ionizing Radiations (LNMRI) must evaluate all measurement uncertainties related to the Victoreen chamber. Accordingly, the uncertainty in air kerma arising from inaccuracy in the chamber's active volume was evaluated by Monte Carlo calculation, using the Penelope software as a tool.
Empirical Bayes Credibility Models for Economic Catastrophic Losses by Regions
Directory of Open Access Journals (Sweden)
Jindrová Pavla
2017-01-01
Catastrophic events affect various regions of the world with increasing frequency and intensity. The number of catastrophic events and the amount of economic losses vary across world regions, and part of these losses is covered by insurance. Catastrophic events in recent years have been associated with increases in premiums for some lines of business. The article focuses on estimating the amount of net premiums that would be needed to cover the total or insured catastrophic losses in different world regions, using Bühlmann and Bühlmann-Straub empirical credibility models based on data from Sigma Swiss Re 2010-2016. The empirical credibility models have been developed to estimate insurance premiums for short-term insurance contracts using two ingredients: past data from the risk itself and collateral data from other sources considered to be relevant. In this article we apply these models to real data on the number of catastrophic events and the total economic and insured catastrophe losses in seven regions of the world over the period 2009-2015. The estimated credible premiums by world region indicate how much money will be needed in the monitored regions to cover total and insured catastrophic losses in the next year.
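A sketch of the classical Bühlmann empirical credibility estimator the abstract refers to; the loss figures are invented for illustration, not the Sigma Swiss Re data. The credibility premium shrinks each region's own mean toward the grand mean, weighted by the estimated ratio of within-region to between-region variability.

```python
import numpy as np

# Hypothetical annual catastrophe losses (arbitrary units) for 4 regions over 7 years.
X = np.array([
    [12.0,  9.0, 15.0, 11.0, 14.0, 10.0, 13.0],
    [ 3.0,  4.0,  2.0,  5.0,  3.0,  4.0,  3.0],
    [25.0, 30.0, 22.0, 35.0, 28.0, 24.0, 31.0],
    [ 8.0,  7.0,  9.0,  6.0,  8.0, 10.0,  7.0],
])
r, n = X.shape
xbar_i = X.mean(axis=1)   # per-region means
xbar = X.mean()           # grand mean

# Buhlmann structural parameter estimates:
s2 = np.sum((X - xbar_i[:, None]) ** 2) / (r * (n - 1))  # expected process variance
a = np.sum((xbar_i - xbar) ** 2) / (r - 1) - s2 / n      # variance of hypothetical means
Z = n / (n + s2 / a)                                     # credibility factor

premium = Z * xbar_i + (1 - Z) * xbar
print(Z, premium)
```

With regions this heterogeneous, a is large relative to s2, so Z is close to 1 and each region's premium stays near its own experience; homogeneous regions would pull Z toward 0 and the collateral grand mean would dominate.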
Quantifying the hurricane catastrophe risk to offshore wind power.
Rose, Stephen; Jaramillo, Paulina; Small, Mitchell J; Apt, Jay
2013-12-01
The U.S. Department of Energy has estimated that over 50 GW of offshore wind power will be required for the United States to generate 20% of its electricity from wind. Developers are actively planning offshore wind farms along the U.S. Atlantic and Gulf coasts and several leases have been signed for offshore sites. These planned projects are in areas that are sometimes struck by hurricanes. We present a method to estimate the catastrophe risk to offshore wind power using simulated hurricanes. Using this method, we estimate the fraction of offshore wind power simultaneously offline and the cumulative damage in a region. In Texas, the most vulnerable region we studied, 10% of offshore wind power could be offline simultaneously because of hurricane damage with a 100-year return period and 6% could be destroyed in any 10-year period. We also estimate the risks to single wind farms in four representative locations; we find the risks are significant but lower than those estimated in previously published results. Much of the hurricane risk to offshore wind turbines can be mitigated by designing turbines for higher maximum wind speeds, ensuring that turbine nacelles can turn quickly to track the wind direction even when grid power is lost, and building in areas with lower risk. © 2013 Society for Risk Analysis.
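The return-period estimates quoted above can be mimicked with a toy simulation. The Poisson storm rate and Beta damage fractions below are invented for illustration, not the paper's hurricane model: simulate many years of annual losses and read the 100-year return-period loss off the 1% annual-exceedance quantile.

```python
import numpy as np

rng = np.random.default_rng(7)
years = 100_000

# Hypothetical model: damaging hurricanes per year ~ Poisson(0.3);
# fraction of offshore wind capacity damaged per storm ~ Beta(0.5, 8).
n_storms = rng.poisson(0.3, years)
annual_loss = np.array([
    1 - np.prod(1 - rng.beta(0.5, 8.0, k)) if k else 0.0
    for k in n_storms
])  # combined fraction of capacity damaged in each simulated year

# Loss with a 100-year return period = loss exceeded with 1% annual probability.
loss_100yr = float(np.quantile(annual_loss, 1 - 1 / 100))
print(loss_100yr)
```

This is the structure behind statements like "10% of offshore wind power could be offline with a 100-year return period": the quantile of a simulated annual-loss distribution, not a prediction for any particular year.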
Catastrophic disruptions as the origin of bilobate comets
Schwartz, Stephen R.; Michel, Patrick; Jutzi, Martin; Marchi, Simone; Zhang, Yun; Richardson, Derek C.
2018-05-01
Several comets observed at close range have bilobate shapes1, including comet 67P/Churyumov-Gerasimenko (67P/C-G), which was imaged by the European Space Agency's Rosetta mission2,3. Bilobate comets are thought to be primordial because they are rich in supervolatiles (for example, N2 and CO) and have a low bulk density, which implies that their formation requires a very low-speed accretion of two bodies. However, slow accretion does not only occur during the primordial phase of the Solar System; it can also occur at later epochs as part of the reaccumulation process resulting from the collisional disruption of a larger body4, so this cannot directly constrain the age of bilobate comets. Here, we show by numerical simulation that 67P/C-G and other elongated or bilobate comets can be formed in the wake of catastrophic collisional disruptions of larger bodies while maintaining their volatiles and low density throughout the process. Since this process can occur at any epoch of our Solar System's history, from early on through to the present day5, there is no need for these objects to be formed primordially. These findings indicate that observed prominent geological features, such as pits and stratified surface layers4,5, may not be primordial.
Murphy, Conor; Bastola, Satish; Sweeney, John
2013-04-01
Climate change impact and adaptation assessments have traditionally adopted a 'top-down', scenario-based approach, in which information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Given the trade-offs in computational cost and the need to include a wide range of GCMs for fuller characterization of uncertainties, scenarios are better used for sensitivity testing and the appraisal of adaptation options. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to the uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both within- and between-model hydrological uncertainty is assessed using the GLUE framework. Regionalisation is achieved using a change-factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted by fitting the Generalised Extreme Value distribution to ~20,000 annual maxima series using the method of probability-weighted moments. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. Hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, flood defences whose designs are normally
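The flood-frequency step, fitting a GEV to annual maxima by probability-weighted moments, can be sketched as follows, using Hosking's well-known PWM approximation; the synthetic annual maxima and parameter values are illustrative, not the study's catchment data.

```python
import math
import numpy as np

def gev_pwm_fit(x):
    """Fit GEV (location xi, scale alpha, shape k; Hosking's convention)
    to annual maxima by probability-weighted moments (Hosking et al., 1985)."""
    x = np.sort(x)
    n = len(x)
    j = np.arange(n)
    b0 = x.mean()
    b1 = np.sum(j / (n - 1) * x) / n
    b2 = np.sum(j * (j - 1) / ((n - 1) * (n - 2)) * x) / n
    c = (2 * b1 - b0) / (3 * b2 - b0) - math.log(2) / math.log(3)
    k = 7.8590 * c + 2.9554 * c * c          # Hosking's rational approximation
    g = math.gamma(1 + k)
    alpha = (2 * b1 - b0) * k / (g * (1 - 2 ** (-k)))
    xi = b0 + alpha * (g - 1) / k
    return xi, alpha, k

def gev_quantile(F, xi, alpha, k):
    # Quantile (inverse CDF) of the GEV in Hosking's parameterization.
    return xi + alpha * (1 - (-math.log(F)) ** k) / k

# Synthetic annual maxima from a known GEV, then recover the parameters.
rng = np.random.default_rng(3)
xi_t, alpha_t, k_t = 100.0, 20.0, 0.10
u = rng.uniform(size=20_000)
sample = np.array([gev_quantile(F, xi_t, alpha_t, k_t) for F in u])

xi_h, alpha_h, k_h = gev_pwm_fit(sample)
q100 = gev_quantile(1 - 1 / 100, xi_h, alpha_h, k_h)  # 100-year flood estimate
print(xi_h, alpha_h, k_h, q100)
```

Design margins of the kind discussed above can then be stress-tested by recomputing q100 under perturbed samples and checking how often the margin is exceeded.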
International Nuclear Information System (INIS)
Kholosha, V.; Kovalchuk, V.
2003-01-01
The accident at the Chernobyl NPP has affected the lives of 35 million people in Ukraine. The social protection of the population affected by the Chernobyl catastrophe is founded on the Law of Ukraine 'About the status and social protection of citizens affected owing to Chernobyl catastrophe' (hereafter 'Law'), and is the principal direction of activity and the subject of special state attention within the overall complex of problems bound to the elimination of the consequences of the Chernobyl catastrophe. The current legislation stipulates partial compensation of material losses connected with resettlement of the affected population. Under the current legislation of Ukraine, about 50 kinds of aid, privileges and compensations are provided to the affected citizens.
Catastrophic Cervical Spine Injuries in Contact Sports
Hutton, Michael James; McGuire, Robert A.; Dunn, Robert; Williams, Richard; Robertson, Peter; Twaddle, Bruce; Kiely, Patrick; Clarke, Andrew; Mazda, Keyvan; Davies, Paul; Pagarigan, Krystle T.; Dettori, Joseph R.
2016-01-01
Study Design: Systematic review. Objectives: To determine the incidence of catastrophic cervical spine injuries (CCSIs) among elite athletes participating in contact team sports and whether the incidence varies depending on the use of protective gear or by player position. Methods: Electronic databases and reference lists of key articles published from January 1, 2000, to January 29, 2016, were searched. Results: Fourteen studies were included that reported CCSI in rugby (n = 10), American football (n = 3), and Irish hurling (n = 1). Among Rugby Union players, the incidence of CCSI was 4.1 per 100,000 player-hours. Among National Football League players, the CCSI rate was 0.6 per 100,000 player-exposures. At the collegiate level, the CCSI rate ranged from 1.1 to 4.7 per 100,000 player-years. Mixed populations of elite and recreational rugby players in four studies reported a CCSI rate of 1.4 to 7.2 per 100,000 player-years. In this same population, the scrum accounted for 30 to 51% of total reported CCSIs in Rugby Union versus 0 to 4% in Rugby League. The tackle accounted for 29 to 39% of injuries in Rugby Union and 78 to 100% of injuries in Rugby League. Making a tackle was responsible for 29 to 80% of injuries in American football. Conclusion: CCSIs are infrequent among elite athletes. There is insufficient evidence to determine the effect of protective gear (e.g., helmets, padding) on CCSI incidence. The scrum and tackle in rugby and tackling in American football account for the majority of CCSIs in each respective sport. PMID:27781193
Catastrophic antiphospholipid syndrome: a clinical review.
Nayer, Ali; Ortega, Luis M
2014-01-01
Catastrophic antiphospholipid syndrome (CAPS) is a rare life-threatening autoimmune disease characterized by disseminated intravascular thrombosis resulting in multiorgan failure. Directory of Open Access Journals (DOAJ), Google Scholar, PubMed (NLM), LISTA (EBSCO) and Web of Science have been searched. CAPS is due to antiphospholipid antibodies directed against a heterogeneous group of proteins that are associated with phospholipids. These autoantibodies activate endothelial cells, platelets, and immune cells, thereby promoting a proinflammatory and prothrombotic phenotype. Furthermore, antiphospholipid antibodies inhibit anticoagulants, impair fibrinolysis, and activate complement. Although CAPS can affect a variety of organs and tissues, the kidneys, lungs, central nervous system, heart, skin, liver, and gastrointestinal tract are most commonly affected. The systemic inflammatory response syndrome, likely due to extensive tissue damage, accompanies CAPS. The most frequent renal manifestations are hypertension, proteinuria, hematuria, and acute renal failure. In the majority of patients with CAPS, a precipitating factor such as infection, surgery, or medication can be identified. Antiphospholipid antibodies such as lupus anticoagulant and antibodies against cardiolipin, β2-glycoprotein I, and prothrombin are the serological hallmarks of CAPS. Laboratory tests often reveal antinuclear antibodies, thrombocytopenia, and anemia. Despite widespread intravascular coagulation, blood films reveal only a small number of schistocytes. In addition, severe thrombocytopenia is uncommon. Histologically, CAPS is characterized by acute thrombotic microangiopathy. CAPS must be distinguished from other forms of thrombotic microangiopathy such as hemolytic-uremic syndrome, thrombotic thrombocytopenic purpura, disseminated intravascular coagulation, and heparin-induced thrombocytopenia. CAPS is associated with high morbidity and mortality. Therefore, an aggressive multidisciplinary
Catastrophic Cervical Spine Injuries in Contact Sports.
Hutton, Michael James; McGuire, Robert A; Dunn, Robert; Williams, Richard; Robertson, Peter; Twaddle, Bruce; Kiely, Patrick; Clarke, Andrew; Mazda, Keyvan; Davies, Paul; Pagarigan, Krystle T; Dettori, Joseph R
2016-11-01
Study Design Systematic review. Objectives To determine the incidence of catastrophic cervical spine injuries (CCSIs) among elite athletes participating in contact team sports and whether the incidence varies depending on the use of protective gear or by player position. Methods Electronic databases and reference lists of key articles published from January 1, 2000, to January 29, 2016, were searched. Results Fourteen studies were included that reported CCSI in rugby (n = 10), American football (n = 3), and Irish hurling (n = 1). Among Rugby Union players, the incidence of CCSI was 4.1 per 100,000 player-hours. Among National Football League players, the CCSI rate was 0.6 per 100,000 player-exposures. At the collegiate level, the CCSI rate ranged from 1.1 to 4.7 per 100,000 player-years. Mixed populations of elite and recreational rugby players in four studies report a CCSI rate of 1.4 to 7.2 per 100,000 player-years. In this same population, the scrum accounted for 30 to 51% of total reported CCSIs in Rugby Union versus 0 to 4% in Rugby League. The tackle accounted for 29 to 39% of injuries in Rugby Union and 78 to 100% of injuries in Rugby League. Making a tackle was responsible for 29 to 80% of injuries in American football. Conclusion CCSIs are infrequent among elite athletes. There is insufficient evidence to determine the effect of protective gear (e.g., helmets, padding) on CCSI incidence. Scrum and tackle in rugby and tackling in American football account for the majority of CCSIs in each respective sport.
Chernobyl: Endless horror. Late effects of the reactor catastrophe
International Nuclear Information System (INIS)
Roethlein, B.
1996-01-01
Ten years after the accident, the people of Chernobyl are trying to live a normal life, but the problems resulting from the catastrophe have not been solved. Some of them are just starting to emerge. (orig.)
Polarization catastrophe in nanostructures doped in photonic band gap materials
Energy Technology Data Exchange (ETDEWEB)
Singh, Mahi R. [Department of Physics and Astronomy, University of Western Ontario, London N6A 3K7 (Canada)], E-mail: msingh@uwo.ca
2008-11-30
In the presence of the dipole-dipole interaction, we have studied a possible dielectric catastrophe in photonic band gap materials doped with an ensemble of four-level nanoparticles. It is found that the dielectric constant of the system has a singularity when the resonance energy lies within the bands. This phenomenon is known as the dielectric catastrophe. It is also found that this phenomenon depends on the strength of the dipole-dipole interaction.
Special software for computing the special functions of wave catastrophes
Directory of Open Access Journals (Sweden)
Andrey S. Kryukovsky
2015-01-01
The method of ordinary differential equations is considered in the context of calculating the special functions of wave catastrophes. Complementary numerical methods and algorithms are described. The paper shows approaches to accelerating such calculations using the capabilities of modern computing systems. Methods for calculating the special functions of wave catastrophes are considered in the framework of parallel computing and distributed systems. The paper covers the development of special software for calculating these functions, including questions of portability, extensibility and interoperability.
Disaster communications using ICTs for vulnerable communities in the Caribbean
International Development Research Centre (IDRC) Digital Library (Canada)
Disaster communications using ICTs for vulnerable communities in the Caribbean. Recent events in the Caribbean have highlighted the shortcomings of regional and national disaster-preparedness measures. Warning systems, in particular, are lacking ...
The Effectiveness of Catastrophe Bonds in Portfolio Diversification
Mariani, Massimo; Amoruso, Paola
2016-01-01
The rapid growth of catastrophe bonds in financial markets is due to increasing environmental disasters and the consequent economic losses, which are only barely covered by insurance and reinsurance companies. These securities represent an effective solution, allowing the transfer of risk to the capital market. The objective of this paper is to demonstrate the real advantages, in terms of portfolio diversification, for the investor who operates in this market segment. The present work shows how investing in catastrophe...
Pain Catastrophizing Correlates with Early Mild Traumatic Brain Injury Outcome
Directory of Open Access Journals (Sweden)
Geneviève Chaput
2016-01-01
Background. Identifying which patients are most likely to be at risk of chronic pain and other postconcussion symptoms following mild traumatic brain injury (MTBI) is a difficult clinical challenge. Objectives. To examine the relationship between pain catastrophizing, defined as the exaggerated negative appraisal of a pain experience, and early MTBI outcome. Methods. This cross-sectional design included 58 patients diagnosed with an MTBI. In addition to medical chart review, postconcussion symptoms were assessed by self-report at 1 month (Time 1) and 8 weeks (Time 2) after MTBI. Pain severity, psychological distress, level of functionality, and pain catastrophizing were measured by self-report at Time 2. Results. The pain catastrophizing subscales of rumination, magnification, and helplessness were significantly correlated with pain severity (r = .31 to .44), number of postconcussion symptoms reported (r = .35 to .45), psychological distress (r = .57 to .67), and level of functionality (r = -.43 to -.29). Pain catastrophizing scores were significantly higher for patients deemed to be at high risk of postconcussion syndrome (6 or more symptoms reported at both Time 1 and Time 2). Conclusions. Higher levels of pain catastrophizing were related to adverse early MTBI outcomes. The early detection of pain catastrophizing may facilitate goal-oriented interventions to prevent or minimize the development of chronic pain and other postconcussion symptoms.
Understanding catastrophizing from a misdirected problem-solving perspective.
Flink, Ida K; Boersma, Katja; MacDonald, Shane; Linton, Steven J
2012-05-01
The aim is to explore pain catastrophizing from a problem-solving perspective. The links between catastrophizing, problem framing, and problem-solving behaviour are examined through two possible models of mediation as inferred by two contemporary and complementary theoretical models, the misdirected problem solving model (Eccleston & Crombez, 2007) and the fear-anxiety-avoidance model (Asmundson, Norton, & Vlaeyen, 2004). In this prospective study, a general population sample (n = 173) with perceived problems with spinal pain filled out questionnaires twice; catastrophizing and problem framing were assessed on the first occasion and health care seeking (as a proxy for medically oriented problem solving) was assessed 7 months later. Two different approaches were used to explore whether the data supported any of the proposed models of mediation. First, multiple regressions were used according to traditional recommendations for mediation analyses. Second, a bootstrapping method (n = 1000 bootstrap resamples) was used to explore the significance of the indirect effects in both possible models of mediation. The results verified the concepts included in the misdirected problem solving model. However, the direction of the relations was more in line with the fear-anxiety-avoidance model. More specifically, the mediation analyses provided support for viewing catastrophizing as a mediator of the relation between biomedical problem framing and medically oriented problem-solving behaviour. These findings provide support for viewing catastrophizing from a problem-solving perspective and imply a need to examine and address problem framing and catastrophizing in back pain patients. ©2011 The British Psychological Society.
Uncertainty analysis techniques
International Nuclear Information System (INIS)
Marivoet, J.; Saltelli, A.; Cadelli, N.
1987-01-01
The origin of the uncertainty affecting Performance Assessments, as well as its propagation to dose and risk results, is discussed. The analysis is focused essentially on the uncertainties introduced by the input parameters, whose values may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimate such as the 90th percentile may be substituted for the arithmetic mean when comparing estimated doses with acceptance criteria. In any case, the results obtained through Uncertainty Analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
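The output characterization described above (arithmetic mean, logarithmic mean, median and percentiles of Monte Carlo output samples) can be sketched as follows. The lognormal input parameters and their spreads are illustrative assumptions, not values from the clay-repository case study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dose model: the product of three lognormal input parameters,
# each ranging over orders of magnitude (illustrative values only).
n = 100_000
inventory = rng.lognormal(mean=0.0, sigma=1.5, size=n)
transfer = rng.lognormal(mean=-2.0, sigma=1.0, size=n)
exposure = rng.lognormal(mean=-1.0, sigma=0.5, size=n)
dose = inventory * transfer * exposure

mean_dose = dose.mean()                    # arithmetic mean
geo_mean = np.exp(np.log(dose).mean())     # logarithmic (geometric) mean
median_dose = np.median(dose)
p90 = np.percentile(dose, 90)              # robust upper summary statistic

# For a skewed, lognormal-like output the arithmetic mean sits far above
# the median, which is why a percentile can be the more robust estimator.
print(mean_dose, geo_mean, median_dose, p90)
```

For a lognormal output the geometric mean and the median coincide in expectation, while the arithmetic mean is inflated by the heavy upper tail, illustrating the robustness argument made in the abstract.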
Catastrophic antiphospholipid syndrome in obstetric practice
Directory of Open Access Journals (Sweden)
Валерий Николаевич Запорожан
2015-05-01
Thus, catastrophic antiphospholipid syndrome (CAPS) is much more common than has been assumed until now, and the authors strongly recommend screening all patients for antiphospholipid antibodies (APA). Furthermore, eclampsia, HELLP syndrome and premature detachment of a normally situated placenta (PDNSP) can develop in the presence of other defects of hemostasis, in particular the FV Leiden mutation, the MTHFR C677T mutation, and deficiency of protein C (PC) or protein S (PS). The combination of acquired thrombophilia due to APS with genetic defects worsens hemostasis during the pathological process, leading to the development of thrombotic complications. Possibly, the combination of hereditary thrombophilia and APS creates a favorable environment in which, under certain conditions, decompensation of the hemostatic system and the development of CAPS become possible. Patients with APS constitute a group at very high risk of thromboembolic complications in the perioperative period; even a minimally invasive intervention (biopsy, curettage, tooth extraction) may trigger the development of CAPS. According to Erkan et al. (2003), in 40% of patients the development of CAPS was provoked by surgery. The main causes of thrombotic complications in connection with surgical intervention are damage to the vessel wall, blood stasis and the withdrawal of indirect anticoagulants. In a study of genetic thrombophilia, a heterozygous FV Leiden mutation and a homozygous MTHFR C677T mutation were found in a patient diagnosed at 14 weeks of pregnancy with APS and a mixed form of thrombophilia (a combination of acquired and multigenic thrombophilia) with hyperhomocysteinemia, and a burdened obstetric and somatic history. Diagnosis of CAPS remains a very urgent and important problem, and it is inconceivable without the determination of APA. The latter should be mandatory for all pregnant women with preeclampsia, habitual miscarriage, premature detachment of a normally situated placenta (PDNSP), or a history of genital herpes
Characterizing Epistemic Uncertainty for Launch Vehicle Designs
Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad
2016-01-01
NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since sensitivity to uncertainty changes is not reflected in the propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
Directory of Open Access Journals (Sweden)
Darnall BD
2014-04-01
Beth D Darnall, John A Sturgeon, Ming-Chih Kao, Jennifer M Hah, Sean C Mackey; Division of Pain Medicine, Stanford Systems Neuroscience and Pain Laboratory, Stanford University School of Medicine, Palo Alto, CA, USA. Background: Pain catastrophizing (PC) – a pattern of negative cognitive-emotional responses to real or anticipated pain – maintains chronic pain and undermines medical treatments. Standard PC treatment involves multiple sessions of cognitive behavioral therapy. To provide efficient treatment, we developed a single-session, 2-hour class that solely treats PC, entitled "From Catastrophizing to Recovery" (FCR). Objectives: To determine (1) the feasibility of FCR; (2) participant ratings for acceptability, understandability, satisfaction, and likelihood to use the information learned; and (3) the preliminary efficacy of FCR for reducing PC. Design and methods: Uncontrolled prospective pilot trial with a retrospective chart and database review component. Seventy-six patients receiving care at an outpatient pain clinic (the Stanford Pain Management Center) attended the class as free treatment, and 70 attendees completed and returned an anonymous survey immediately post-class. The Pain Catastrophizing Scale (PCS) was administered at class check-in (baseline) and at 2 and 4 weeks post-treatment. Within-subjects repeated measures analysis of variance (ANOVA) with Student's t-test contrasts was used to compare scores across time points. Results: All attendees who completed a baseline PCS were included as study participants (N = 57; F = 82%; mean age = 50.2 years); the PCS was completed by 46 participants at week 2 and 35 participants at week 4. Participants had significantly reduced PC at both time points (P < .0001), and large effect sizes were found (Cohen's d = 0.85 and d = 1.15). Conclusion: Preliminary data suggest that FCR is an acceptable and effective treatment for PC. Larger, controlled studies of longer duration are needed to determine durability of response, factors
Lark, R. Murray
2014-05-01
Conventionally, the uncertainty of a soil map has been expressed in terms of the mean purity of its map units: the probability that the soil profile class examined at a site corresponds to the eponymous class of the simple map unit delineated there (Burrough et al., 1971). This measure of uncertainty has an intuitive meaning and is used for quality control in soil survey contracts (Western, 1978). However, it may be of limited value to the manager or policy maker who wants to decide whether the map provides a basis for decision making, and whether the cost of producing a better map would be justified. In this study I extend a published analysis of the economic implications of uncertainty in a soil map (Giasson et al., 2000). A decision analysis was developed to assess the economic value of imperfect soil map information for agricultural land use planning. Random error matrices for the soil map units were then generated, subject to constraints which ensure consistency with fixed frequencies of the different soil classes. For each error matrix the mean map unit purity was computed, and the value of the implied imperfect soil information was computed by the decision analysis. An alternative measure of the uncertainty in a soil map was considered. This is the mean soil map information, which is the difference between the information content of a soil observation at a random location in the region and the information content of a soil observation given that the map unit is known. I examined the relationship between the value of imperfect soil information and the purity and information measures of map uncertainty. In both cases there was considerable variation in the economic value of possible maps with fixed values of the uncertainty measure. However, the correlation was somewhat stronger with the information measure, and there was a clear upper bound on the value of an imperfect soil map when the mean information takes some
Vaginismus: heightened harm avoidance and pain catastrophizing cognitions.
Borg, Charmaine; Peters, Madelon L; Schultz, Willibrord Weijmar; de Jong, Peter J
2012-02-01
Catastrophic appraisal of experienced pain may promote hypervigilance and intense pain, while the personality trait of harm avoidance (HA) might prevent the occurrence of correcting such experiences. Women afflicted with vaginismus may enter a self-perpetuating downward spiral of increasing avoidance of (anticipated) pain. In vaginismus the anticipation of pain may give rise to catastrophic pain ideation. This may establish hypervigilance toward painful sexual stimuli, which consequently results in negative appraisal of sexual cues. This process could impair genital and sexual responding, intensify pain and trigger avoidance, which in turn may contribute to the onset and persistence of symptoms in vaginismus and, to a certain extent, also in dyspareunia. To investigate whether women suffering from vaginismus are characterized by heightened levels of habitual catastrophic pain cognitions, together with higher levels of HA. This study consisted of three groups: a lifelong vaginismus group (N = 35, mean age = 28.4; standard deviation [SD] = 5.8), a dyspareunia group (N = 33, mean age = 26.7; SD = 6.8), and women without sexual complaints (N = 54, mean age = 26.5; SD = 6.7). The HA scale of Cloninger's tridimensional personality questionnaire, and the pain catastrophizing scale. Specifically, women afflicted with vaginismus showed significantly heightened levels of catastrophic pain cognitions compared with the other two groups, as well as significantly enhanced HA vs. the control group, and a trend vs. the dyspareunia group. Both traits were shown to have cumulative predictive validity for the presence of vaginismus. This study focused on the personality traits of catastrophizing pain cognitions and HA in women with lifelong vaginismus. Our findings showed that indeed, women suffering from vaginismus are characterized by the trait of HA interwoven with habitual pain catastrophizing cognitions. This study could help in the refinement of the current conceptualization and might shed
International Nuclear Information System (INIS)
Landsberg, P.T.
1990-01-01
This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)
Numerical solution of dynamic equilibrium models under Poisson uncertainty
DEFF Research Database (Denmark)
Posch, Olaf; Trimborn, Timo
2013-01-01
We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations of the retarded type. ... Solutions to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households.
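The kind of dynamics discussed above, a diffusion punctuated by rare Poisson events, can be illustrated with a simple Euler discretization of a jump-diffusion. The drift, volatility, arrival rate and halving jump below are illustrative assumptions, not the authors' algorithm or calibration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper):
mu, sigma = 0.03, 0.10      # drift and diffusion of the state variable
lam = 0.02                  # Poisson arrival rate of a catastrophic event
jump_factor = 0.5           # the stock is halved when an event occurs
T, dt = 100.0, 0.01
n_steps = int(T / dt)

x = np.empty(n_steps + 1)
x[0] = 1.0
for t in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt))        # Brownian increment
    dN = rng.poisson(lam * dt)               # 0 almost always, >=1 on an event
    x[t + 1] = x[t] + mu * x[t] * dt + sigma * x[t] * dW
    if dN > 0:
        x[t + 1] *= jump_factor ** dN        # catastrophic downward jump

print(x[-1])
```

With an arrival rate of 0.02 per unit time, a horizon of 100 sees about two catastrophic jumps on average, which is the "rare event" regime the abstract refers to.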
Nadar, M Y; Akar, D K; Rao, D D; Kulkarni, M S; Pradeepkumar, K S
2015-12-01
Assessment of intake due to long-lived actinides by inhalation pathway is carried out by lung monitoring of the radiation workers inside totally shielded steel room using sensitive detection systems such as Phoswich and an array of HPGe detectors. In this paper, uncertainties in the lung activity estimation due to positional errors, chest wall thickness (CWT) and detector background variation are evaluated. First, calibration factors (CFs) of Phoswich and an array of three HPGe detectors are estimated by incorporating ICRP male thorax voxel phantom and detectors in Monte Carlo code 'FLUKA'. CFs are estimated for the uniform source distribution in lungs of the phantom for various photon energies. The variation in the CFs for positional errors of ±0.5, 1 and 1.5 cm in horizontal and vertical direction along the chest are studied. The positional errors are also evaluated by resizing the voxel phantom. Combined uncertainties are estimated at different energies using the uncertainties due to CWT, detector positioning, detector background variation of an uncontaminated adult person and counting statistics in the form of scattering factors (SFs). SFs are found to decrease with increase in energy. With HPGe array, highest SF of 1.84 is found at 18 keV. It reduces to 1.36 at 238 keV. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
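The combination of independent uncertainty components (chest wall thickness, detector positioning, background variation, counting statistics) into one overall figure can be sketched as a quadrature sum of relative uncertainties. This is a generic propagation sketch, not necessarily the exact scattering-factor formula used in the paper, and the component values are illustrative assumptions:

```python
import math

def combined_relative_uncertainty(components):
    """Combine independent relative uncertainties in quadrature."""
    return math.sqrt(sum(u * u for u in components))

# Illustrative relative uncertainties (assumed values, not from the paper):
u_cwt = 0.30          # chest wall thickness
u_position = 0.10     # detector positioning
u_background = 0.15   # detector background variation
u_counting = 0.20     # counting statistics

u_total = combined_relative_uncertainty(
    [u_cwt, u_position, u_background, u_counting])
print(round(u_total, 3))
```

The quadrature sum is dominated by the largest component (here the chest wall thickness term), which matches the usual finding in low-energy lung counting that CWT drives the overall uncertainty.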
Harré, Michael S.
2013-02-01
Two aspects of modern economic theory have dominated the recent discussion on the state of the global economy: Crashes in financial markets and whether or not traditional notions of economic equilibrium have any validity. We have all seen the consequences of market crashes: plummeting share prices, businesses collapsing and considerable uncertainty throughout the global economy. This seems contrary to what might be expected of a system in equilibrium where growth dominates the relatively minor fluctuations in prices. Recent work from within economics as well as by physicists, psychologists and computational scientists has significantly improved our understanding of the more complex aspects of these systems. With this interdisciplinary approach in mind, a behavioural economics model of local optimisation is introduced and three general properties are proven. The first is that under very specific conditions local optimisation leads to a conventional macro-economic notion of a global equilibrium. The second is that if both global optimisation and economic growth are required then under very mild assumptions market catastrophes are an unavoidable consequence. Third, if only local optimisation and economic growth are required then there is sufficient parametric freedom for macro-economic policy makers to steer an economy around catastrophes without overtly disrupting local optimisation.
Catastrophes and the consumption of psychoactive substances
Directory of Open Access Journals (Sweden)
Krivokapić Žilijeta
2009-01-01
Catastrophes, accidents, stress and traumas are negative life experiences accompanied by physiological, cognitive, emotional and behavioural changes. The most common ineffective strategies for dealing with negative life experiences are: aggression, whether overt (physical and/or verbal), passive or latent; social withdrawal; avoidance; depression; helplessness; isolation; and the abuse of medications, particularly drugs. People in stressful situations often try to help themselves by resorting to substances that bring an improvement in their state and suppress momentary discomfort. This 'self-therapy' carries serious risks. A person who, after a period of consuming these substances, becomes dependent shows visible physical and psychological changes: shirking obligations; abandoning once-enjoyed activities, leisure pursuits and interests; changing friends; family and friendly relationships that become impoverished and filled with conflict; becoming less critical and more manipulative; lying and deceiving to conceal the addiction; joining a group of similar users; engaging in criminal activities; and increasing physical deterioration. Alcohol, being the most accessible substance and therefore usually 'the first measure of self-therapy', has a particularly devastating impact on a psychophysically vulnerable organism. Many difficulties and problems arising from alcohol consumption aggravate those linked to traumatic experiences. Likewise, the effectiveness of certain tablets in reducing tension or improving the patient's state frequently leads
Ngada, Narcisse
2015-06-15
The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.
The uncertainties in estimating measurement uncertainties
International Nuclear Information System (INIS)
Clark, J.P.; Shull, A.H.
1994-01-01
All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined, and what they mean, is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties
Forecasting giant, catastrophic slope collapse: lessons from Vajont, Northern Italy
Kilburn, Christopher R. J.; Petley, David N.
2003-08-01
Rapid, giant landslides, or sturzstroms, are among the most powerful natural hazards on Earth. They have minimum volumes of ~10^6-10^7 m^3 and, normally preceded by prolonged intervals of accelerating creep, are produced by catastrophic and deep-seated slope collapse (loads ~1-10 MPa). Conventional analyses attribute rapid collapse to unusual mechanisms, such as the vaporization of ground water during sliding. Here, catastrophic collapse is related to self-accelerating rock fracture, common in crustal rocks at loads of ~1-10 MPa and readily catalysed by circulating fluids. Fracturing produces an abrupt drop in resisting stress. Measured stress drops in crustal rock account for minimum sturzstrom volumes and rapid collapse accelerations. Fracturing also provides a physical basis for quantitatively forecasting catastrophic slope failure.
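The "prolonged intervals of accelerating creep" mentioned above underpin a standard forecasting technique, the inverse-rate (Voight) method: as failure approaches, the reciprocal of the displacement rate decays roughly linearly with time, so extrapolating a straight-line fit to zero predicts the failure time. A sketch on synthetic data (not Vajont measurements; the hyperbolic rate law and parameters are assumed for illustration):

```python
import numpy as np

# Synthetic accelerating-creep record: displacement rate growing
# hyperbolically toward a failure time t_f, rate(t) = 1 / (k * (t_f - t)).
t_f, k = 100.0, 0.05
t = np.arange(60.0, 95.0, 1.0)        # observation times (days, say)
rate = 1.0 / (k * (t_f - t))          # measured displacement rates

inv_rate = 1.0 / rate                 # inverse rate decays linearly in time
slope, intercept = np.polyfit(t, inv_rate, 1)
t_failure = -intercept / slope        # time at which the fit line hits zero

print(round(t_failure, 2))
```

Because the synthetic record follows the hyperbolic law exactly, the linear extrapolation recovers the assumed failure time; with real creep data the fit would carry scatter and the forecast an uncertainty window.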
Dynamical systems V bifurcation theory and catastrophe theory
1994-01-01
Bifurcation theory and catastrophe theory are two of the best known areas within the field of dynamical systems. Both are studies of smooth systems, focusing on properties that seem to be manifestly non-smooth. Bifurcation theory is concerned with the sudden changes that occur in a system when one or more parameters are varied. Examples of such are familiar to students of differential equations, from phase portraits. Moreover, understanding the bifurcations of the differential equations that describe real physical systems provides important information about the behavior of the systems. Catastrophe theory became quite famous during the 1970's, mostly because of the sensation caused by the usually less than rigorous applications of its principal ideas to "hot topics", such as the characterization of personalities and the difference between a "genius" and a "maniac". Catastrophe theory is accurately described as singularity theory and its (genuine) applications. The authors of this book, the first printing of w...
Stagewise cognitive development: an application of catastrophe theory.
van der Maas, H L; Molenaar, P C
1992-07-01
In this article an overview is given of traditional methodological approaches to stagewise cognitive developmental research. These approaches are evaluated and integrated on the basis of catastrophe theory. In particular, catastrophe theory specifies a set of common criteria for testing the discontinuity hypothesis proposed by Piaget. Separate criteria correspond to distinct methods used in cognitive developmental research. Such criteria are, for instance, the detection of spurts in development, bimodality of test scores, and increased variability of responses during transitional periods. When a genuine stage transition is present, these criteria are expected to be satisfied. A revised catastrophe model accommodating these criteria is proposed for the stage transition in cognitive development from the preoperational to the concrete operational stage.
Socioeconomic inequality in catastrophic health expenditure in Brazil.
Boing, Alexandra Crispim; Bertoldi, Andréa Dâmaso; Barros, Aluísio Jardim Dornellas de; Posenato, Leila Garcia; Peres, Karen Glazer
2014-08-01
To analyze the evolution of catastrophic health expenditure and the inequalities in such expenses, according to the socioeconomic characteristics of Brazilian families. Data from the National Household Budget 2002-2003 (48,470 households) and 2008-2009 (55,970 households) were analyzed. Catastrophic health expenditure was defined as excess expenditure, considering different methods of calculation: 10.0% and 20.0% of total consumption and 40.0% of the family's capacity to pay. The National Economic Indicator and schooling were considered as socioeconomic characteristics. Inequality measures utilized were the relative difference between rates, the rates ratio, and concentration index. The catastrophic health expenditure varied between 0.7% and 21.0%, depending on the calculation method. The lowest prevalences were noted in relation to the capacity to pay, while the highest, in relation to total consumption. The prevalence of catastrophic health expenditure increased by 25.0% from 2002-2003 to 2008-2009 when the cutoff point of 20.0% relating to the total consumption was considered and by 100% when 40.0% or more of the capacity to pay was applied as the cut-off point. Socioeconomic inequalities in the catastrophic health expenditure in Brazil between 2002-2003 and 2008-2009 increased significantly, becoming 5.20 times higher among the poorest and 4.17 times higher among the least educated. There was an increase in catastrophic health expenditure among Brazilian families, principally among the poorest and those headed by the least-educated individuals, contributing to an increase in social inequality.
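The catastrophic-expenditure definitions used in this study (health spending exceeding 10% or 20% of total consumption, or 40% of capacity to pay) reduce to a simple prevalence computation over households. A sketch with invented household figures (not the survey data):

```python
def catastrophic_share(households, threshold=0.10, denominator="total"):
    """Fraction of households whose health spending exceeds `threshold`
    of total consumption (or of the family's capacity to pay)."""
    def ratio(h):
        denom = h["total"] if denominator == "total" else h["capacity_to_pay"]
        return h["health"] / denom
    flagged = [h for h in households if ratio(h) > threshold]
    return len(flagged) / len(households)

# Hypothetical monthly figures, same currency units throughout
households = [
    {"health": 50,  "total": 1000, "capacity_to_pay": 600},  # 5% of total, 8.3% of capacity
    {"health": 250, "total": 1000, "capacity_to_pay": 500},  # 25% of total, 50% of capacity
    {"health": 300, "total": 2000, "capacity_to_pay": 800},  # 15% of total, 37.5% of capacity
]
print(catastrophic_share(households, 0.10))                     # 2 of 3 households
print(catastrophic_share(households, 0.40, "capacity_to_pay"))  # 1 of 3 households
```

As the abstract's spread of 0.7% to 21.0% illustrates, the choice of threshold and denominator drives the measured prevalence far more than sampling noise does.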
Model uncertainties in top-quark physics
Seidel, Markus
2014-01-01
The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.
Repeated checking induces uncertainty about future threat
Giele, C.L.|info:eu-repo/dai/nl/318754460; Engelhard, I.M.|info:eu-repo/dai/nl/239681533; van den Hout, M.A.|info:eu-repo/dai/nl/070445354; Dek, E.C.P.|info:eu-repo/dai/nl/313959552; Damstra, Marianne; Douma, Ellen
2015-01-01
Studies have shown that obsessive-compulsive (OC) -like repeated checking paradoxically increases memory uncertainty. This study tested if checking also induces uncertainty about future threat by impairing the distinction between danger and safety cues. Participants (n = 54) engaged in a simulated
Predictive uncertainty in auditory sequence processing
DEFF Research Database (Denmark)
Hansen, Niels Chr.; Pearce, Marcus T
2014-01-01
in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models...
Clinical Pain Catastrophizing in Women With Migraine and Obesity.
Bond, Dale S; Buse, Dawn C; Lipton, Richard B; Thomas, J Graham; Rathier, Lucille; Roth, Julie; Pavlovic, Jelena M; Evans, E Whitney; Wing, Rena R
2015-01-01
Obesity is related to migraine. Maladaptive pain coping strategies (eg, pain catastrophizing) may provide insight into this relationship. In women with migraine and obesity, we cross-sectionally assessed: (1) prevalence of clinical catastrophizing; (2) characteristics of those with and without clinical catastrophizing; and (3) associations of catastrophizing with headache features. Obese women migraineurs seeking weight loss treatment (n = 105) recorded daily migraine activity for 1 month via smartphone and completed the Pain Catastrophizing Scale (PCS). Clinical catastrophizing was defined as total PCS score ≥30. The six-item Headache Impact Test (HIT-6), 12-item Allodynia Symptom Checklist (ASC-12), Headache Management Self-Efficacy Scale (HMSE), and assessments for depression (Centers for Epidemiologic Studies Depression Scale) and anxiety (seven-item Generalized Anxiety Disorder Scale) were also administered. Using PCS scores and body mass index (BMI) as predictors in linear regression, we modeled a series of headache features (ie, headache days, HIT-6, etc) as outcomes. One quarter (25.7%; 95% confidence interval [CI] = 17.2-34.1%) of participants met criteria for clinical catastrophizing: they had higher BMI (37.9 ± 7.5 vs 34.4 ± 5.7 kg/m^2, P = .035), longer migraine attack duration (160.8 ± 145.0 vs 97.5 ± 75.2 hours/month, P = .038), higher HIT-6 scores (68.7 ± 4.6 vs 64.5 ± 3.9), higher pain sensitivity, greater headache impact, and lower headache management self-efficacy. In all participants, PCS scores were related to several migraine characteristics, above and beyond the effects of obesity. Prospective studies are needed to determine the sequence and mechanisms of the relationships between catastrophizing, obesity, and migraine. © 2015 American Headache Society.
Numerical Experiments Based on the Catastrophe Model of Solar Eruptions
Xie, X. Y.; Ziegler, U.; Mei, Z. X.; Wu, N.; Lin, J.
2017-11-01
On the basis of the catastrophe model developed by Isenberg et al., we use the NIRVANA code to perform magnetohydrodynamics (MHD) numerical experiments to look into various behaviors of a coronal magnetic configuration that includes a current-carrying flux rope used to model the prominence levitating in the corona. These behaviors include the evolution of the equilibrium heights of the flux rope versus the change in the background magnetic field, the corresponding internal equilibrium of the flux rope, dynamic properties of the flux rope after the system loses equilibrium, as well as the impact of the referential radius on the equilibrium heights of the flux rope. In our calculations, an empirical model of the coronal density distribution given by Sittler & Guhathakurta is used, and the physical diffusion is included. Our experiments show that a deviation of the simulated equilibrium heights from the theoretical results exists, but is not apparent, and the evolutionary features of the two results are similar. If the flux rope is initially located at the stable branch of the theoretical equilibrium curve, the flux rope will quickly reach the equilibrium position in the simulation after several rounds of oscillations as a result of the self-adjustment of the system; the flux rope loses equilibrium if its initial location is set at the critical point on the theoretical equilibrium curve. Correspondingly, the internal equilibrium of the flux rope can be reached as well, and the deviation from the theoretical results is somewhat apparent since the approximation of the small radius of the flux rope is lifted in our experiments, but such deviation does not affect the global equilibrium in the system. The impact of the referential radius on the equilibrium heights of the flux rope is consistent with the prediction of the theory. Our calculations indicate that the motion of the flux rope after the loss of equilibrium is consistent with which
Uncertainty in social dilemmas
Kwaadsteniet, Erik Willem de
2007-01-01
This dissertation focuses on social dilemmas, and more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked themselves the question as to how such uncertainty influences people’s choice behavior. These researchers have repeatedly concluded that uncertainty...
Pain catastrophizing predicts verbal expression among children with chronic pain and their mothers
Directory of Open Access Journals (Sweden)
Shelby L Langer
2016-03-01
Full Text Available This study examined intra- and inter-personal associations between pain catastrophizing and verbal expression in 70 children with recurrent abdominal pain and their mothers. Participants independently completed the Pain Catastrophizing Scale. Mothers and children then talked about the child’s pain. Speech was categorized using a linguistic analysis program. Catastrophizing was positively associated with the use of negative emotion words by both mothers and children. In addition, mothers’ catastrophizing was positively associated with both mothers’ and children’s anger word usage, whereas children’s catastrophizing was inversely associated with mothers’ anger word usage. Findings extend the literature on behavioral and interpersonal aspects of catastrophizing.
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2012-01-01
in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter......There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here...
Three Solvable Matrix Models of a Quantum Catastrophe
Czech Academy of Sciences Publication Activity Database
Levai, G.; Růžička, František; Znojil, Miloslav
2014-01-01
Roč. 53, č. 9 (2014), s. 2875-2890 ISSN 0020-7748 Institutional support: RVO:61389005 Keywords : quantum theory * PT symmetry * Finite-dimensional non-Hermitian Hamiltonians * exceptional-point localization * quantum theory of catastrophes * methods of computer algebra Subject RIV: BE - Theoretical Physics Impact factor: 1.184, year: 2014
Catastrophe Theory: A Unified Model for Educational Change.
Cryer, Patricia; Elton, Lewis
1990-01-01
Catastrophe Theory and Herzberg's theory of motivation at work were used to create a model of change that unifies and extends Lewin's two separate stage and force field models. This new model is used to analyze the behavior of academics as they adapt to the changing university environment. (Author/MLW)
The application of catastrophe theory to medical image analysis
Kuijper, A.; Florack, L.M.J.
2001-01-01
In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of annihilations and
The Application of Catastrophe Theory to Medical Image Analysis
Kuijper, Arjan; Florack, L.M.J.
2001-01-01
In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of
The application of catastrophe theory to image analysis
Kuijper, A.; Florack, L.M.J.
2001-01-01
In order to investigate the deep structure of Gaussian scale space images, one needs to understand the behaviour of critical points under the influence of blurring. We show how the mathematical framework of catastrophe theory can be used to describe the various different types of annihilations and the
Vaginismus : Heightened Harm Avoidance and Pain Catastrophizing Cognitions
Borg, Charmaine; Peters, Madelon L.; Schultz, Willibrord Weijmar; de Jong, Peter J.
Introduction. Catastrophic appraisal of experienced pain may promote hypervigilance and intense pain, while the personality trait of harm avoidance (HA) might prevent the occurrence of correcting such experiences. Women inflicted with vaginismus may enter a self-perpetuating downward spiral of
Concrete ensemble Kalman filters with rigorous catastrophic filter divergence.
Kelly, David; Majda, Andrew J; Tong, Xin T
2015-08-25
The ensemble Kalman filter and ensemble square root filters are data assimilation methods used to combine high-dimensional, nonlinear dynamical models with observed data. Ensemble methods are indispensable tools in science and engineering and have enjoyed great success in geophysical sciences, because they allow for computationally cheap low-ensemble-state approximation for extremely high-dimensional turbulent forecast models. From a theoretical perspective, the dynamical properties of these methods are poorly understood. One of the central mysteries is the numerical phenomenon known as catastrophic filter divergence, whereby ensemble-state estimates explode to machine infinity, despite the true state remaining in a bounded region. In this article we provide a breakthrough insight into the phenomenon, by introducing a simple and natural forecast model that transparently exhibits catastrophic filter divergence under all ensemble methods and a large set of initializations. For this model, catastrophic filter divergence is not an artifact of numerical instability, but rather a true dynamical property of the filter. The divergence is not only validated numerically but also proven rigorously. The model cleanly illustrates mechanisms that give rise to catastrophic divergence and confirms intuitive accounts of the phenomena given in past literature.
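For readers unfamiliar with the method whose divergence is analyzed above, a minimal stochastic ensemble Kalman filter analysis step for a directly observed scalar state can be sketched as follows. This is a generic textbook sketch with illustrative numbers, not the divergent forecast model constructed in the article:

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(ensemble, obs, obs_noise_std):
    """Stochastic EnKF analysis step for a directly observed scalar state (H = 1):
    each member is nudged toward an independently perturbed copy of the observation."""
    x = np.asarray(ensemble, dtype=float)
    P = np.var(x, ddof=1)                      # forecast ensemble variance
    K = P / (P + obs_noise_std**2)             # scalar Kalman gain
    perturbed = obs + obs_noise_std * rng.standard_normal(x.size)
    return x + K * (perturbed - x)             # analysis ensemble

ensemble = rng.normal(0.0, 2.0, size=200)      # forecast ensemble, spread 2
analysis = enkf_update(ensemble, obs=1.0, obs_noise_std=0.5)
print(np.mean(analysis), np.std(analysis))     # mean pulled toward the observation, spread reduced
```

Catastrophic filter divergence refers to this update, cycled through a nonlinear forecast model, producing ensemble states that blow up to machine infinity even while the true state stays bounded; the article's contribution is a model where that blow-up is provable rather than a numerical artifact.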
The Chernobyl Catastrophe. Consequences on Human Health
Energy Technology Data Exchange (ETDEWEB)
Yablokov, A.; Labunska, I.; Blokov, I. (eds.)
2006-04-15
Estimates of excess mortality resulting from the Chernobyl accident span an extremely wide range depending upon precisely what is taken into account. The most recent epidemiological evidence, published under the auspices of the Russian Academy of Sciences, suggests that the scale of the problems could be very much greater than predicted by studies published to date. For example, the 2005 IAEA report predicted that 4000 additional deaths would result from the Chernobyl accident. The most recently published figures indicate that in Belarus, Russia and the Ukraine alone the accident resulted in an estimated 200,000 additional deaths between 1990 and 2004. Overall, the available data reveal a considerable range in estimated excess mortalities resulting from the Chernobyl accident, serving to underline the huge uncertainties attached to knowledge of the full impacts of the Chernobyl accident. This report includes some data, which have not been published before in the international arena. In combination with the extensive body of literature which has been published to date, these data indicate that 'official' figures (e.g. the IAEA 2005 evaluation) for morbidity (incidence of disease) and death arising as a direct result of the radioactive contamination released from Chernobyl may grossly underestimate both the local and international impact of the incident. Four population groups appear to have experienced the most severe health effects: (1) accident clean-up workers, or 'liquidators', including civilian and the military personnel drafted to carry out clean-up activities and construct the protective cover for the reactor; (2) evacuees from dangerously contaminated territories inside the 30-km zone around the power plant; (3) residents of the less (but still dangerously) contaminated territories; and (4) children born into the families from all of the above three groups.
Uncertainties in Safety Analysis. A literature review
International Nuclear Information System (INIS)
Ekberg, C.
1995-05-01
The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs
Uncertainties in Safety Analysis. A literature review
Energy Technology Data Exchange (ETDEWEB)
Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry
1995-05-01
The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.
Community resilience and decision theory challenges for catastrophic events.
Cox, Louis Anthony
2012-11-01
Extreme and catastrophic events pose challenges for normative models of risk management decision making. They invite development of new methods and principles to complement existing normative decision and risk analysis. Because such events are rare, it is difficult to learn about them from experience. They can prompt both too little concern before the fact, and too much after. Emotionally charged and vivid outcomes promote probability neglect and distort risk perceptions. Aversion to acting on uncertain probabilities saps precautionary action; moral hazard distorts incentives to take care; imperfect learning and social adaptation (e.g., herd-following, group-think) complicate forecasting and coordination of individual behaviors and undermine prediction, preparation, and insurance of catastrophic events. Such difficulties raise substantial challenges for normative decision theories prescribing how catastrophe risks should be managed. This article summarizes challenges for catastrophic hazards with uncertain or unpredictable frequencies and severities, hard-to-envision and incompletely described decision alternatives and consequences, and individual responses that influence each other. Conceptual models and examples clarify where and why new methods are needed to complement traditional normative decision theories for individuals and groups. For example, prospective and retrospective preferences for risk management alternatives may conflict; procedures for combining individual beliefs or preferences can produce collective decisions that no one favors; and individual choices or behaviors in preparing for possible disasters may have no equilibrium. Recent ideas for building "disaster-resilient" communities can complement traditional normative decision theories, helping to meet the practical need for better ways to manage risks of extreme and catastrophic events. © 2012 Society for Risk Analysis.
Uncertainty quantification theory, implementation, and applications
Smith, Ralph C
2014-01-01
The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...
Uncertainty Quantification in Numerical Aerodynamics
Litvinenko, Alexander
2017-05-16
We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D.Liu et al '17]. For modeling we used the TAU code, developed at DLR, Germany.
Catastrophic Events Caused by Cosmic Objects
Adushkin, Vitaly
2008-01-01
Many times all of us could hear from mass media that an asteroid approached and swept past the Earth. Such an asteroid or comet will inevitably strike the planet some day. This volume considers hazards due to collisions with cosmic objects, particularly in light of recent investigations of impacts by the authors. Each chapter written by an expert contains an overview of an aspect and new findings in the field. The main hazardous effects – cratering, shock, aerial and seismic waves, fires, ejection of dust and soot, tsunami are described and numerically estimated. Numerical simulations of impacts and impact consequences have received much attention in the book. Fairly small impacting objects 50 -100 m in diameter pose a real threat to humanity and their influence on the atmosphere and ionosphere is emphasized. Especially vulnerable are industrially developed areas with dense population, almost all Europe is one of them. Special chapters are devoted to the famous 1908 Tunguska event and new results of its sim...
Welkenhuysen, Kris; Rupert, Jort; Compernolle, Tine; Ramirez, Andrea|info:eu-repo/dai/nl/284852414; Swennen, Rudy; Piessens, Kris
2017-01-01
The use of anthropogenic CO2 for enhancing oil recovery from mature oil fields in the North Sea has several potential benefits, and a number of assessments have been conducted. It remains, however, difficult to realistically simulate the economic circumstances and decisions, while including the
Miró, Jordi; Nieto, Rubén; Huguet, Anna
2008-05-01
The main aims of this work were to test the psychometric properties of the Catalan version of the Pain Catastrophizing Scale (PCS) and to assess the usefulness of the scale when used with whiplash patients. This article reports results from 2 complementary studies. In the first one, the PCS was administered to 280 students and 146 chronic pain patients to examine the psychometric properties of a new Catalan version of the instrument. A confirmatory factor analysis supported a second-order structure, in which 3 second-order factors (ie, rumination, helplessness, and magnification) load in a higher-order factor (ie, catastrophizing). The reliability of the Catalan version was supported by an acceptable internal consistency and test-retest values. Validity was supported by the correlations found among the PCS and pain intensity, pain interference, and depression. The objective of the second study was to evaluate the PCS when used with whiplash patients. In this second study, 141 patients with whiplash disorders participated. In general, the psychometric properties of the PCS were found appropriate, with factor analysis supporting the structure described in patients with chronic pain. Our data suggest that the PCS is a good instrument to assess catastrophic thinking in whiplash patients. The usefulness of the PCS in whiplash disorders has been explored in this study. Results of our work show that the PCS can be a very useful tool to assess catastrophic thinking about pain in whiplash patients.
Reusable launch vehicle model uncertainties impact analysis
Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng
2018-03-01
A reusable launch vehicle (RLV) has the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and its flight environment is highly complicated and intensely changeable. Its model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of the uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainty during modeling are then analyzed and summed up. After that, the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to show how strongly the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like the RLV).
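The worst-case bound described above, i.e. taking the maximum singular value (induced 2-norm) of an additive uncertainty matrix and ranking uncertainty sources by it, can be illustrated with toy matrices. All matrices and source names below are made-up placeholders, not the RLV model:

```python
import numpy as np

def uncertainty_bound(delta):
    """Largest singular value of an additive-uncertainty matrix: the induced
    2-norm, i.e. the worst-case gain of the model perturbation."""
    return np.linalg.norm(np.asarray(delta, dtype=float), ord=2)

# A hypothetical 2x2 additive perturbation
delta = np.array([[0.02, 0.00],
                  [0.01, 0.05]])
print(uncertainty_bound(delta))  # ≈ 0.0512, slightly above the largest entry

# Ranking hypothetical uncertainty sources by their worst-case gain
sources = {"inertia": np.diag([0.05, 0.04]),
           "aero":    np.diag([0.02, 0.01]),
           "thrust":  np.diag([0.015, 0.01])}
ranked = sorted(sources, key=lambda k: uncertainty_bound(sources[k]), reverse=True)
print(ranked)  # the source with the largest worst-case gain comes first
```

Using the induced 2-norm rather than an entry-wise maximum matters because off-diagonal coupling can push the worst-case gain above any individual entry, as the first example shows.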
Catastrophic antiphospholipid syndrome mimicking a malignant pancreatic tumour--a case report
van Wissen, S.; Bastiaansen, B. A. J.; Stroobants, A. K.; van den Dool, E. J.; Idu, M. M.; Levi, M. [=Marcel M.; Stroes, E. S. G.
2008-01-01
The catastrophic antiphospholipid syndrome is characterised by rapid onset thromboses, often resistant to conventional anticoagulant treatment, and resulting in life threatening multiple organ dysfunction. The diagnosis of catastrophic antiphospholipid syndrome may be difficult, predominantly due to
Verguet, Stéphane; Riumallo-Herl, Carlos; Gomez, Gabriela B.; Menzies, Nicolas A.; Houben, Rein M. G. J.; Sumner, Tom; Lalli, Marek; White, Richard G.; Salomon, Joshua A.; Cohen, Ted; Foster, Nicola; Chatterjee, Susmita; Sweeney, Sedona; Baena, Inés Garcia; Lönnroth, Knut; Weil, Diana E.; Vassall, Anna
2017-01-01
The economic burden on households affected by tuberculosis through costs to patients can be catastrophic. WHO's End TB Strategy recognises and aims to eliminate these potentially devastating economic effects. We assessed whether aggressive expansion of tuberculosis services might reduce catastrophic
The Biermann Catastrophe in Numerical Magnetohydrodynamics
Graziani, Carlo; Tzeferacos, Petros; Lee, Dongwook; Lamb, Donald Q.; Weide, Klaus; Fatenejad, Milad; Miller, Joshua
2015-03-01
The Biermann battery effect is frequently invoked in cosmic magnetogenesis and studied in high-energy density laboratory physics experiments. Generation of magnetic fields by the Biermann effect due to misaligned density and temperature gradients in smooth flow behind shocks is well known. We show that a Biermann-effect magnetic field is also generated within shocks. Direct implementation of the Biermann effect in MHD codes does not capture this physical process, and worse, it produces unphysical magnetic fields at shocks whose value does not converge with resolution. We show that this convergence breakdown is due to naive discretization, which fails to account for the fact that discretized irrotational vector fields have spurious solenoidal components that grow without bound near a discontinuity. We show that careful consideration of the kinetics of ion viscous shocks leads to a formulation of the Biermann effect that gives rise to a convergent algorithm. We note two novel physical effects: a resistive magnetic precursor, in which a Biermann-generated field in the shock “leaks” resistively upstream, and a thermal magnetic precursor, in which a field is generated by the Biermann effect ahead of the shock front owing to gradients created by the shock’s electron thermal conduction precursor. Both effects appear to be potentially observable in experiments at laser facilities. We reexamine published studies of magnetogenesis in galaxy cluster formation and conclude that the simulations in question had inadequate resolution to reliably estimate the field generation rate. Corrected estimates suggest primordial field values in the range B ~ 10^-22-10^-19 G by z = 3.
Uncertainty in hydrological change modelling
DEFF Research Database (Denmark)
Seaby, Lauren Paige
applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current......Hydrological change modelling methodologies generally use climate models outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta...
Gunardi, Setiawan, Ezra Putranda
2015-12-01
Indonesia is a country with a high risk of earthquakes, because of its position on the border of the earth's tectonic plates. An earthquake can cause a very large amount of damage, loss, and other economic impacts, so Indonesia needs a mechanism for transferring earthquake risk away from the government or the (reinsurance) company, so that enough money can be collected for implementing the rehabilitation and reconstruction program. One such mechanism is issuing a catastrophe bond, `act-of-God bond', or simply CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is joined with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investor is discounted or stopped, and the cash flow is paid to the sponsor company to compensate their loss from this catastrophe event. When we consider earthquakes only, the amount of discounted cash flow can be determined based on the earthquake's magnitude. A case study with Indonesian earthquake magnitude data shows that the probability of the maximum magnitude can be modeled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bond, namely zero-coupon bonds, `coupon only at risk' bonds, and `principal and coupon at risk' bonds. The relationship between the price of the catastrophe bond and the CIR model's parameters, the GEV's parameters, the percentage of coupon, and the discounted cash flow rule is then explored via Monte Carlo simulation.
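A schematic Monte Carlo pricing of the zero-coupon case along the lines described (GEV annual-maximum magnitudes, CIR discounting) might look like the following. The trigger rule, all parameter values, and the inverse-CDF sampler are illustrative assumptions, not the authors' formulas or calibration:

```python
import numpy as np

rng = np.random.default_rng(42)

def gev_rvs(xi, mu, beta, size):
    """Sample GEV(xi, mu, beta) by inverting F(x) = exp(-(1 + xi*(x-mu)/beta)^(-1/xi)),
    valid for xi != 0."""
    u = rng.uniform(size=size)
    return mu + beta * ((-np.log(u)) ** (-xi) - 1.0) / xi

def cir_integral(r0, kappa, theta, sigma, T, steps, n_paths):
    """Euler paths of the CIR short rate dr = kappa*(theta - r)dt + sigma*sqrt(r)dW;
    returns the integral of r over [0, T] for each path (used for discounting)."""
    dt = T / steps
    r = np.full(n_paths, float(r0))
    total = np.zeros(n_paths)
    for _ in range(steps):
        dW = np.sqrt(dt) * rng.standard_normal(n_paths)
        r = np.abs(r + kappa * (theta - r) * dt
                   + sigma * np.sqrt(np.maximum(r, 0.0)) * dW)
        total += r * dt
    return total

def zero_coupon_cat_bond(face, T_years, xi, mu, beta, trigger, n_paths=20000):
    """Price a zero-coupon CAT bond under a binary trigger: full principal is paid
    unless the yearly maximum magnitude exceeds `trigger` in any year before maturity."""
    maxima = gev_rvs(xi, mu, beta, size=(n_paths, T_years))
    payoff = np.where((maxima < trigger).all(axis=1), face, 0.0)
    discount = np.exp(-cir_integral(0.03, 0.5, 0.04, 0.1, float(T_years), 250, n_paths))
    return float(np.mean(discount * payoff))

price = zero_coupon_cat_bond(face=100.0, T_years=3, xi=0.1, mu=6.0, beta=0.4, trigger=7.5)
print(round(price, 2))  # strictly below the riskless discounted face value
```

The paper's `coupon only at risk' and `principal and coupon at risk' variants would add coupon cash flows to the payoff vector; the binary all-or-nothing trigger here is the simplest stand-in for a magnitude-dependent discounting rule.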
International Nuclear Information System (INIS)
Gunardi; Setiawan, Ezra Putranda
2015-01-01
Indonesia is a country with a high risk of earthquakes because of its position on the boundary of the Earth’s tectonic plates. An earthquake can cause a very large amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring earthquake risk from the government to the (re)insurance market, so that it can collect enough money to implement rehabilitation and reconstruction programs. One such mechanism is issuing a catastrophe bond, an ‘act-of-God bond’, or simply a CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is pooled with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investors is discounted or stopped, and that cash flow is paid to the sponsor company to compensate its loss from the catastrophic event. When we consider earthquakes only, the amount of the discounted cash flow can be determined based on the earthquake’s magnitude. A case study with Indonesian earthquake magnitude data shows that the maximum magnitude can be modelled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bonds, namely zero-coupon bonds, ‘coupon only at risk’ bonds, and ‘principal and coupon at risk’ bonds. The relationship between the price of the catastrophe bond and the CIR model’s parameters, the GEV parameters, the coupon percentage, and the discounted cash flow rule is then explored via Monte Carlo simulation
Energy Technology Data Exchange (ETDEWEB)
Gunardi; Setiawan, Ezra Putranda [Mathematics Department, Gadjah Mada University (Indonesia)
2015-12-22
Indonesia is a country with a high risk of earthquakes because of its position on the boundary of the Earth’s tectonic plates. An earthquake can cause a very large amount of damage, loss, and other economic impacts. Indonesia therefore needs a mechanism for transferring earthquake risk from the government to the (re)insurance market, so that it can collect enough money to implement rehabilitation and reconstruction programs. One such mechanism is issuing a catastrophe bond, an ‘act-of-God bond’, or simply a CAT bond. A catastrophe bond is issued by a special-purpose-vehicle (SPV) company and then sold to investors. The revenue from this transaction is pooled with the money (premium) from the sponsor company and then invested in other products. If a catastrophe happens before the time of maturity, the cash flow from the SPV to the investors is discounted or stopped, and that cash flow is paid to the sponsor company to compensate its loss from the catastrophic event. When we consider earthquakes only, the amount of the discounted cash flow can be determined based on the earthquake’s magnitude. A case study with Indonesian earthquake magnitude data shows that the maximum magnitude can be modelled by a generalized extreme value (GEV) distribution. In pricing this catastrophe bond, we assume a stochastic interest rate following the Cox-Ingersoll-Ross (CIR) interest rate model. We develop formulas for pricing three types of catastrophe bonds, namely zero-coupon bonds, ‘coupon only at risk’ bonds, and ‘principal and coupon at risk’ bonds. The relationship between the price of the catastrophe bond and the CIR model’s parameters, the GEV parameters, the coupon percentage, and the discounted cash flow rule is then explored via Monte Carlo simulation.
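As a rough illustration of the pricing procedure this abstract describes, the sketch below prices a simplified zero-coupon CAT bond by Monte Carlo: yearly maximum magnitudes are drawn from a GEV distribution via inverse-CDF sampling, the short rate follows a CIR process discretized with an Euler scheme, and the entire principal is lost if any yearly maximum exceeds a trigger level. All parameter values (kappa, theta, sigma_r, mu, sigma_m, xi, threshold) are hypothetical placeholders, not taken from the paper, and the all-or-nothing payoff is a simplification of the ‘principal and coupon at risk’ structure developed there.

```python
import math
import random

random.seed(1)

# --- Hypothetical parameters, for illustration only (not from the paper) ---
kappa, theta, sigma_r, r0 = 0.5, 0.04, 0.10, 0.03  # CIR: dr = kappa(theta - r)dt + sigma_r*sqrt(r)dW
mu, sigma_m, xi = 6.0, 0.4, 0.1                    # GEV for the yearly maximum magnitude
threshold = 7.0                                    # magnitude that triggers total principal loss
face, T = 100.0, 3                                 # zero-coupon bond: face value, maturity in years
n_paths, steps_per_year = 20_000, 12
dt = 1.0 / steps_per_year

def gev_sample():
    """Inverse-CDF draw from GEV(mu, sigma_m, xi) for a yearly maximum magnitude."""
    u = random.random()
    return mu + sigma_m / xi * ((-math.log(u)) ** (-xi) - 1.0)

def cir_discount_factor():
    """Euler scheme for the CIR short rate; returns exp(-integral of r dt) over [0, T]."""
    r, integral = r0, 0.0
    for _ in range(T * steps_per_year):
        integral += r * dt
        dw = random.gauss(0.0, math.sqrt(dt))
        r = max(r + kappa * (theta - r) * dt + sigma_r * math.sqrt(r) * dw, 0.0)
    return math.exp(-integral)

payoffs = []
for _ in range(n_paths):
    # Full face value is paid unless some yearly maximum reaches the trigger.
    triggered = any(gev_sample() >= threshold for _ in range(T))
    payoff = 0.0 if triggered else face
    payoffs.append(payoff * cir_discount_factor())

price = sum(payoffs) / n_paths
print(round(price, 2))
```

With these placeholder numbers the trigger fires in roughly 10% of years, so the price lands well below the discounted face value; the paper’s coupon-bearing structures would add coupon cash flows to the same simulation loop.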
Instrument uncertainty predictions
International Nuclear Information System (INIS)
Coutts, D.A.
1991-07-01