WorldWideScience

Sample records for extreme uncertainty simulating

  1. Parameter uncertainty in simulations of extreme precipitation and attribution studies.

    Science.gov (United States)

    Timmermans, B.; Collins, W. D.; O'Brien, T. A.; Risser, M. D.

    2017-12-01

    The attribution of extreme weather events, such as heavy rainfall, to anthropogenic influence involves the analysis of their probability in simulations of climate. However, the climate models used, such as the Community Atmosphere Model (CAM), employ approximate physics, which gives rise to "parameter uncertainty"—uncertainty about the most accurate or optimal values of numerical parameters within the model. In particular, approximate parameterisations for convective processes are well known to be influential in the simulation of precipitation extremes. Towards examining the impact of this source of uncertainty on attribution studies, we investigate the importance of components—through their associated tuning parameters—of parameterisations relating to deep and shallow convection, and cloud and aerosol microphysics in CAM. We hypothesise that, as numerical resolution is increased, the change in the proportion of variance induced by perturbed parameters associated with the respective components is consistent with the decreasing applicability of the underlying hydrostatic assumptions; for example, the relative influence of deep convection should diminish as the resolution approaches that at which convection can be resolved numerically (~10 km). We quantify the relationship between the relative proportion of variance induced and numerical resolution by conducting computer experiments that examine precipitation extremes over the contiguous U.S. In order to mitigate the enormous computational burden of running ensembles of long climate simulations, we use variable-resolution CAM and employ both extreme value theory and surrogate modelling techniques ("emulators"). We discuss the implications of the relationship between parameterised convective processes and resolution both in the context of attribution studies and progression towards models that fully resolve convection.
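
    The extreme value theory step mentioned above typically amounts to fitting a generalized extreme value (GEV) distribution to block maxima and reading off return levels. A minimal sketch in Python, with synthetic data standing in for simulated annual precipitation maxima (the numbers are illustrative, not from the study):

        # GEV fit to annual precipitation maxima and a 20-year return level.
        # Synthetic data stand in for climate model output.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        annual_maxima = rng.gumbel(loc=40.0, scale=12.0, size=50)  # mm/day

        # scipy's genextreme shape c equals -xi in the usual GEV convention.
        c, loc, scale = stats.genextreme.fit(annual_maxima)

        # Return level exceeded on average once every 20 years.
        rl20 = stats.genextreme.ppf(1.0 - 1.0 / 20.0, c, loc=loc, scale=scale)
        print(f"20-year return level: {rl20:.1f} mm/day")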

  2. Future Simulated Intensification of Precipitation Extremes, CMIP5 Model Uncertainties and Dependencies

    Science.gov (United States)

    Bador, M.; Donat, M.; Geoffroy, O.; Alexander, L. V.

    2017-12-01

    Precipitation intensity during extreme events is expected to increase with climate change. Throughout the 21st century, CMIP5 climate models project a general increase in annual extreme precipitation in most regions. We investigate how robust this future increase is across different models, regions and seasons. We find strong similarity in extreme precipitation changes between models that share atmospheric physics, reducing the ensemble of 27 models to 14 independent projections. We find that future simulated extreme precipitation increases in most models in the majority of land grid cells located in the dry, intermediate and wet regions according to each model's precipitation climatology. These increases significantly exceed the range of natural variability estimated from long equilibrium control runs. The intensification of extreme precipitation across the entire spectrum of dry to wet regions is particularly robust in the extra-tropics in both the wet and dry seasons, whereas uncertainties are larger in the tropics. The CMIP5 ensemble therefore indicates robust future intensification of annual extreme rainfall, particularly in extra-tropical regions. Generally, the CMIP5 robustness is higher during the dry season than during the wet season and at the annual scale, but inter-model uncertainties in the tropics remain important.

  3. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Biros, George [Univ. of Texas, Austin, TX (United States)

    2018-01-12

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and the parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for the construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a

  4. Probabilistic Approach to Enable Extreme-Scale Simulations under Uncertainty and System Faults. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-05-05

    The current project develops a novel approach that uses a probabilistic description to capture the current state of knowledge about the computational solution. To effectively spread the computational effort over multiple nodes, the global computational domain is split into many subdomains. Computational uncertainty in the solution translates into uncertain boundary conditions for the equation system to be solved on those subdomains, and many independent, concurrent subdomain simulations are used to account for this boundary condition uncertainty. By relying on the fact that solutions on neighboring subdomains must agree with each other, a more accurate estimate for the global solution can be achieved. Statistical approaches in this update process make it possible to account for the effect of system faults in the probabilistic description of the computational solution, and the associated uncertainty is reduced through successive iterations. By combining all of these elements, the probabilistic reformulation allows splitting the computational work over very many independent tasks for good scalability, while being robust to system faults.
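
    The abstract's idea can be illustrated with a toy one-dimensional problem: interface values between subdomains are treated as an ensemble, each member is propagated by an independent subdomain solve, and agreement between neighbours plus robust statistics absorbs injected faults. This sketch is our own construction under those assumptions, not the project's actual algorithm; it solves u'' = 0 on [0, 1] with u(0) = 0, u(1) = 1 on two overlapping subdomains:

        # Ensemble of uncertain interface values; independent subdomain solves;
        # median-based rejection of faulty results. Exact solution is u(x) = x.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 200                              # concurrent subdomain tasks
        g_left = rng.normal(0.5, 0.2, n)     # uncertain u(0.4), used by right subdomain
        g_right = rng.normal(0.5, 0.2, n)    # uncertain u(0.6), used by left subdomain

        for it in range(6):
            # Left subdomain [0, 0.6]: u(0)=0, u(0.6)=g_right; linear solution at x=0.4.
            new_g_left = g_right * (0.4 / 0.6)
            # Right subdomain [0.4, 1]: u(0.4)=g_left, u(1)=1; linear solution at x=0.6.
            new_g_right = g_left + (1.0 - g_left) * (0.6 - 0.4) / (1.0 - 0.4)

            # Inject simulated system faults: a few tasks return garbage...
            faults = rng.random(n) < 0.05
            new_g_left[faults] = rng.normal(0.0, 10.0, faults.sum())
            # ...which the statistical update absorbs by median replacement.
            bad = np.abs(new_g_left - np.median(new_g_left)) > 3.0 * g_left.std()
            new_g_left[bad] = np.median(new_g_left[~bad])

            g_left, g_right = new_g_left, new_g_right
            print(f"iter {it}: u(0.4) = {g_left.mean():.3f} +/- {g_left.std():.3f}")

    Each cross-update contracts the ensemble towards the exact interface values u(0.4) = 0.4 and u(0.6) = 0.6, which is the "uncertainty is reduced through successive iterations" behaviour the abstract describes.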

  5. Sketching Uncertainty into Simulations.

    Science.gov (United States)

    Ribicic, H; Waser, J; Gurbat, R; Sadransky, B; Groller, M E

    2012-12-01

    In a variety of application areas, the use of simulation steering in decision making is limited at best. Research focusing on this problem suggests that most user interfaces are too complex for the end user. Our goal is to let users create and investigate multiple, alternative scenarios without the need for special simulation expertise. To simplify the specification of parameters, we move from a traditional manipulation of numbers to a sketch-based input approach. Users steer both numeric parameters and parameters with a spatial correspondence by sketching a change onto the rendering. Special visualizations provide immediate visual feedback on how the sketches are transformed into boundary conditions of the simulation models. Since uncertainty with respect to many intertwined parameters plays an important role in planning, we also allow the user to intuitively set up complete value ranges, which are then automatically transformed into ensemble simulations. The interface and the underlying system were developed in collaboration with experts in the field of flood management. The real-world data they have provided have allowed us to construct scenarios used to evaluate the system. These were presented to a variety of flood response personnel, and their feedback is discussed in detail in the paper. The interface was found to be intuitive and relevant, although a certain amount of training might be necessary.

  6. Climate change impacts on extreme events in the United States: an uncertainty analysis

    Science.gov (United States)

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  7. Uncertainty and simulation

    International Nuclear Information System (INIS)

    Depres, B.; Dossantos-Uzarralde, P.

    2009-01-01

    More than 150 researchers and engineers from universities and the industrial world met to discuss new methodologies developed for assessing uncertainty. About 20 papers were presented, and the main topics were: methods to study the propagation of uncertainties, sensitivity analysis, nuclear data covariances, and multi-parameter optimisation. This report gathers the contributions of CEA researchers and engineers.

  8. Bayesian analysis applied to statistical uncertainties of extreme response distributions of offshore wind turbines

    NARCIS (Netherlands)

    Cheng, P.W.; Kuik, van G.A.M.; Bussel, van G.J.W.; Vrouwenvelder, A.C.W.M.

    2002-01-01

    Extreme response is an important design variable for wind turbines. The statistical uncertainties concerning the extreme response distribution are simulated here using physical characteristics obtained from measurements. The extreme responses are the flap moment at the blade root and

  9. Uncertainties in extreme precipitation under climate change conditions

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia

    of adaptation strategies, but these changes are subject to uncertainties. The focus of this PhD thesis is the quantification of uncertainties in changes in extreme precipitation. It addresses two of the main sources of uncertainty in climate change impact studies: regional climate models (RCMs) and statistical...... downscaling methods (SDMs). RCMs provide information on climate change at the regional scale. SDMs are used to bias-correct and downscale the outputs of the RCMs to the local scale of interest in adaptation strategies. In the first part of the study, a multi-model ensemble of RCMs from the European ENSEMBLES...... project was used to quantify the uncertainty in RCM projections over Denmark. Three aspects of the RCMs relevant for the uncertainty quantification were first identified and investigated. These are: the interdependency of the RCMs; the performance in current climate; and the change in the performance...

  10. Uncertainty related to Environmental Data and Estimated Extreme Events

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    The design loads on rubble mound breakwaters are almost entirely determined by the environmental conditions, i.e. sea state, water levels, sea bed characteristics, etc. It is the objective of sub-group B to identify the most important environmental parameters and evaluate the related uncertainties...... including those corresponding to extreme estimates typically used for design purposes. Basically a design condition is made up of a set of parameter values stemming from several environmental parameters. To be able to evaluate the uncertainty related to design states one must know the corresponding joint....... Consequently this report deals mainly with each parameter separately. Multi parameter problems are briefly discussed in section 9. It is important to notice that the quantified uncertainties reported in section 7.7 represent what might be regarded as typical figures to be used only when no more qualified...

  11. Uncertainty Quantification in Aerodynamics Simulations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of the proposed work (Phases I and II) is to develop uncertainty quantification methodologies and software suitable for use in CFD simulations of...

  12. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation, especially events with multiday durations, is responsible for recent urban floods and associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimation of the frequency of occurrence of such extreme precipitation events is thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to 1-day, 2-day and 5-day annual maximum precipitation series due to its advantages over at-site estimation. The regional frequency approach pools the information from climatologically similar sites to make reliable estimates of quantiles, given that the pooling group is homogeneous and of reasonable size. We have used the region of influence (ROI) approach along with a homogeneity measure based on L-moments to identify the homogeneous pooling groups for each site. Five 3-parameter distributions (i.e., Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as in exploring spatio-temporal variations of extreme precipitation and associated risk.
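
    The L-moment machinery referred to above is compact enough to sketch. The following is a hedged, at-site-only illustration of a GEV fit by the method of L-moments using Hosking's approximations; the regional ROI pooling, homogeneity testing, Bayesian Model Averaging and balanced bootstrap of the study are not reproduced:

        # At-site GEV fit via sample L-moments (Hosking 1990 approximations).
        import numpy as np
        from math import gamma, log

        def sample_l_moments(x):
            # First three L-moments from probability-weighted moments b0, b1, b2.
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            j = np.arange(1, n + 1)
            b0 = x.mean()
            b1 = np.sum((j - 1) / (n - 1) * x) / n
            b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
            l1, l2, l3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
            return l1, l2, l3 / l2          # l3 / l2 is the L-skewness tau3

        def gev_from_l_moments(l1, l2, t3):
            # GEV location xi, scale alpha, shape k in Hosking's parameterisation.
            c = 2.0 / (3.0 + t3) - log(2) / log(3)
            k = 7.8590 * c + 2.9554 * c ** 2
            alpha = l2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))
            xi = l1 - alpha * (1 - gamma(1 + k)) / k
            return xi, alpha, k

        rng = np.random.default_rng(2)
        amax = rng.gumbel(90.0, 30.0, 50)   # stand-in 1-day annual maxima, mm
        xi, alpha, k = gev_from_l_moments(*sample_l_moments(amax))
        q100 = xi + alpha * (1 - (-log(1 - 1 / 100)) ** k) / k   # 100-yr quantile
        print(f"100-year 1-day rainfall: {q100:.0f} mm")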

  13. Quantification of uncertainties of modeling and simulation

    International Nuclear Information System (INIS)

    Ma Zhibo; Yin Jianwei

    2012-01-01

    The principles of Modeling and Simulation (M and S) are interpreted through a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. According to the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, aimed at building a framework to quantify the uncertainties of M and S. (authors)

  14. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in the single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results.

  15. Uncertainties Related to Extreme Event Statistics of Sewer System Surcharge and Overflow

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Kjeld; Johansen, C.; Thorndahl, Søren Liedtke

    2005-01-01

    Today it is common practice - in the major part of Europe - to base the design of sewer systems in urban areas on recommended minimum values of flooding frequencies related to either pipe top level, basement level in buildings or level of road surfaces. Thus storm water runoff in sewer systems is only...... proceeding in an acceptable manner if flooding of these levels has an average return period greater than a predefined value. This practice is also often used in functional analysis of existing sewer systems. Whether a sewer system can fulfil recommended flooding frequencies or not can only be verified...... by performing long term simulations - using a sewer flow simulation model - and drawing up extreme event statistics from the model simulations. In this context it is important to realize that uncertainties related to the input parameters of rainfall runoff models will give rise to uncertainties related...

  16. Quantification of Uncertainty in Thermal Building Simulation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Haghighat, F.; Frier, Christian

    In order to quantify uncertainty in thermal building simulation, stochastic modelling is applied to a building model. An application of stochastic differential equations is presented in Part 1, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine...

  17. Extreme Events in China under Climate Change: Uncertainty and related impacts (CSSP-FOREX)

    Science.gov (United States)

    Leckebusch, Gregor C.; Befort, Daniel J.; Hodges, Kevin I.

    2016-04-01

    Suitable adaptation strategies or the timely initiation of related mitigation efforts in East Asia will strongly depend on robust and comprehensive information about future near-term as well as long-term potential changes in the climate system. Therefore, understanding the driving mechanisms associated with the East Asian climate is of major importance. The FOREX project (Fostering Regional Decision Making by the Assessment of Uncertainties of Future Regional Extremes and their Linkage to Global Climate System Variability for China and East Asia) focuses on the investigation of extreme wind and rainfall related events over Eastern Asia and their possible future changes. Here, analyses focus on the link between local extreme events and their driving weather systems. This includes the coupling between local rainfall extremes and tropical cyclones, the Meiyu frontal system, extra-tropical teleconnections and monsoonal activity. Furthermore, the relation between these driving weather systems and large-scale variability modes, e.g. NAO, PDO, ENSO, is analysed. Thus, besides analysing future changes of local extreme events, the temporal variability of their driving weather systems and related large-scale variability modes will be assessed in current CMIP5 global model simulations to obtain more robust results. Beyond an overview of FOREX itself, first results regarding the link between local extremes and their steering weather systems based on observational and reanalysis data are shown. Special focus is placed on the contribution of monsoonal activity, tropical cyclones and the Meiyu frontal system to the inter-annual variability of East Asian summer rainfall.

  18. Wind simulation for extreme and fatigue loads

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M.; Larsen, G.C.; Mann, J.; Ott, S.; Hansen, K.S.; Pedersen, B.J.

    2004-01-01

    Measurements of atmospheric turbulence have been studied and found to deviate from a Gaussian process, in particular regarding the velocity increments over small time steps, where the tails of the pdf are exponential rather than Gaussian. Principles for extreme event counting and the occurrence of cascading events are presented. Empirical extreme statistics agree with Rice's exceedance theory when it is assumed that the velocity and its time derivative are independent. Prediction based on the assumption that the velocity is a Gaussian process underpredicts the rate of occurrence of extreme events by many orders of magnitude, mainly because the measured pdf is non-Gaussian. Methods for simulation of turbulent signals have been developed and their computational efficiency is considered. The methods are applicable to multiple processes with individual spectra and probability distributions. Non-Gaussian processes are simulated by the correlation-distortion method. Non-stationary processes are obtained by Bezier interpolation between a set of stationary simulations with identical random seeds. Simulation of systems with some signals available is enabled by conditional statistics. A versatile method for simulation of extreme events has been developed. This will generate gusts, velocity jumps, extreme velocity shears, and sudden changes of wind direction. Gusts may be prescribed with a specified ensemble average shape, and it is possible to detect the critical gust shape for a given construction. The problem is formulated as the variational problem of finding the most probable adjustment of a standard simulation of a stationary Gaussian process subject to relevant event conditions, which are formulated as linear combinations of points in the realization. The method is generalized for multiple correlated series, multiple simultaneous conditions, and 3D fields of all velocity components. Generalizations are presented for a single non-Gaussian process subject to relatively
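
    The Rice exceedance theory invoked here gives the mean rate of upcrossings of a level u by a stationary Gaussian process with mean mu, under exactly the independence assumption the abstract states (standard result, our notation):

        \nu^{+}(u) \;=\; \frac{1}{2\pi}\,\frac{\sigma_{\dot{u}}}{\sigma_{u}}\,
        \exp\!\left(-\frac{(u-\mu)^{2}}{2\sigma_{u}^{2}}\right)

    where sigma_u and sigma_udot are the standard deviations of the velocity and of its time derivative. The "many orders of magnitude" underprediction follows directly: replacing the Gaussian factor with a density whose tails are exponential raises the predicted exceedance rate for large u enormously.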

  19. Structural Uncertainty in Antarctic sea ice simulations

    Science.gov (United States)

    Schneider, D. P.

    2016-12-01

    The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out of sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) have a strong dependence on the mean state of and changes in the Antarctic sea ice cover. This problem is not unique to

  20. Regional climate change trends and uncertainty analysis using extreme indices: A case study of Hamilton, Canada

    OpenAIRE

    Razavi, Tara; Switzman, Harris; Arain, Altaf; Coulibaly, Paulin

    2016-01-01

    This study aims to provide a deeper understanding of the level of uncertainty associated with the development of extreme weather frequency and intensity indices at the local scale. Several different global climate models, downscaling methods, and emission scenarios were used to develop extreme temperature and precipitation indices at the local scale in the Hamilton region, Ontario, Canada. Uncertainty associated with historical and future trends in extreme indices and future climate projectio...

  1. Simulations of nearly extremal binary black holes

    Science.gov (United States)

    Giesler, Matthew; Scheel, Mark; Hemberger, Daniel; Lovelace, Geoffrey; Kuper, Kevin; Boyle, Michael; Szilagyi, Bela; Kidder, Lawrence; SXS Collaboration

    2015-04-01

    Astrophysical black holes could have nearly extremal spins; therefore, nearly extremal black holes could be among the binaries that current and future gravitational-wave observatories will detect. Predicting the gravitational waves emitted by merging black holes requires numerical-relativity simulations, but these simulations are especially challenging when one or both holes have mass m and spin S exceeding the Bowen-York limit of S/m^2 = 0.93. Using improved methods we simulate an unequal-mass, precessing binary black hole coalescence, where the larger black hole has S/m^2 = 0.99. We also use these methods to simulate a nearly extremal non-precessing binary black hole coalescence, where both black holes have S/m^2 = 0.994, nearly reaching the Novikov-Thorne upper bound for holes spun up by thin accretion disks. We demonstrate numerical convergence and estimate the numerical errors of the waveforms; we compare numerical waveforms from our simulations with post-Newtonian and effective-one-body waveforms; and we compare the evolution of the black-hole masses and spins with analytic predictions.
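
    The spin measure quoted above is the dimensionless Kerr parameter, written here in geometric units (G = c = 1):

        \chi \;\equiv\; \frac{S}{m^{2}} \;<\; 1,

    which is bounded by 1 for a Kerr black hole; the values 0.93 (the Bowen-York limit), 0.99 and 0.994 in the abstract should be read against that extremal bound.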

  2. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Conrad, Patrick [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Bigoni, Daniele [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Parno, Matthew [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2017-06-09

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT

  3. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  4. Are needs to manage uncertainty and threat associated with political conservatism or ideological extremity?

    Science.gov (United States)

    Jost, John T; Napier, Jaime L; Thorisdottir, Hulda; Gosling, Samuel D; Palfai, Tibor P; Ostafin, Brian

    2007-07-01

    Three studies are conducted to assess the uncertainty-threat model of political conservatism, which posits that psychological needs to manage uncertainty and threat are associated with political orientation. Results from structural equation models provide consistent support for the hypothesis that uncertainty avoidance (e.g., need for order, intolerance of ambiguity, and lack of openness to experience) and threat management (e.g., death anxiety, system threat, and perceptions of a dangerous world) each contribute independently to conservatism (vs. liberalism). No support is obtained for alternative models, which predict that uncertainty and threat management are associated with ideological extremism or extreme forms of conservatism only. Study 3 also reveals that resistance to change fully mediates the association between uncertainty avoidance and conservatism, whereas opposition to equality partially mediates the association between threat and conservatism. Implications for understanding the epistemic and existential bases of political orientation are discussed.

  5. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Ghanem, Roger [Univ. of Southern California, Los Angeles, CA (United States)

    2017-04-18

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  6. Uncertainties in hydrological extremes projections and its effects on decision-making processes in an Amazonian sub-basin.

    Science.gov (United States)

    Andres Rodriguez, Daniel; Garofolo, Lucas; Lazaro Siqueira Junior, Jose

    2013-04-01

    Uncertainties in climate change projections include irreducible components due to limitations of knowledge, the chaotic nature of the climate system and human decision-making processes. Such uncertainties affect impact studies, complicating the decision-making processes aimed at mitigation and adaptation. However, these uncertainties also open the possibility of developing exploratory analyses of a system's vulnerability to different scenarios. Through these kinds of analyses it is possible to identify critical issues, which must then be studied more deeply. For this study we used several future projections from General Circulation Models to feed a hydrological model, applied to the Amazonian sub-basin of Ji-Paraná. Hydrological model integrations are performed for the historical period (1970-1990) and for the future period (2010-2100). Extreme value analyses are performed on each simulated time series, and results are compared with extreme events in the present climate. A simple approach to identifying potential vulnerabilities consists of evaluating the hydrologic system response to climate variability and extreme events observed in the past, comparing them with the conditions projected for the future. Thus it is possible to identify critical issues that need attention and more detailed studies. For this work, we used socio-economic data from the Brazilian Institute of Geography and Statistics, the Operator of the National Electric System, the Brazilian National Water Agency, and scientific and press published information. This information is used to characterize impacts associated with extreme hydrological events in the basin during the historical period and to evaluate potential future impacts in the face of the different hydrological projections. Results show that inter-model variability produces broad dispersion in projected extreme values. The impact of this dispersion differs across aspects of the socio-economic and natural systems and must be carefully

  7. Fusing Simulation Results From Multifidelity Aero-servo-elastic Simulators - Application To Extreme Loads On Wind Turbine

    DEFF Research Database (Denmark)

    Abdallah, Imad; Sudret, Bruno; Lataniotis, Christos

    2015-01-01

    Fusing predictions from multiple simulators in the early stages of the conceptual design of a wind turbine reduces model uncertainty and mitigates risk. Aero-servo-elastic is a term that refers to the coupling of wind inflow, aerodynamics, structural dynamics and controls. Fusing...... the response data from multiple aero-servo-elastic simulators could provide better predictive ability than using any single simulator. The co-Kriging approach to fuse information from multifidelity aero-servo-elastic simulators is presented. We illustrate the co-Kriging approach to fuse the extreme flapwise...... bending moment at the blade root of a large wind turbine as a function of wind speed, turbulence and shear exponent in the presence of model uncertainty and non-stationary noise in the output. The extreme responses are obtained by two widely accepted numerical aero-servo-elastic simulators, FAST...

  8. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly in the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the Inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences
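
    For reference, the two-sided power distribution mentioned above has, in the van Dorp-Kotz parameterisation with lower bound a, most likely value m, upper bound b and shape n (our notation, not necessarily the paper's), the density

        f(x) \;=\;
        \begin{cases}
        \dfrac{n}{b-a}\left(\dfrac{x-a}{m-a}\right)^{n-1}, & a \le x \le m,\\[1ex]
        \dfrac{n}{b-a}\left(\dfrac{b-x}{b-m}\right)^{n-1}, & m \le x \le b,
        \end{cases}

    which reduces to the triangular distribution for n = 2 and the uniform for n = 1; its bounded support makes it convenient for encoding expert-elicited minimum, most likely and maximum consequences.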

  9. Systematic uncertainties on Monte Carlo simulation of lead based ADS

    International Nuclear Information System (INIS)

    Embid, M.; Fernandez, R.; Garcia-Sanz, J.M.; Gonzalez, E.

    1999-01-01

    Computer simulations of the neutronic behaviour of ADS systems foreseen for actinide and fission product transmutation are affected by many sources of systematic uncertainty, both from the nuclear data and from the methodology selected when applying the codes. Several actual ADS Monte Carlo simulations are presented, comparing different options for both the data and the methodology and evaluating the relevance of the different uncertainties. (author)

  10. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

    of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models...... than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi...

  11. Uncertainties in the simulation of groundwater recharge at different scales

    Directory of Open Access Journals (Sweden)

    H. Bogena

    2005-01-01

    Digital spatial data always imply some kind of uncertainty. The source of this uncertainty can be found in their compilation as well as in the conceptual design, which causes a more or less exact abstraction of the real world, depending on the scale under consideration. Within the framework of hydrological modelling, in which numerous data sets from diverse sources of uneven quality are combined, the various uncertainties accumulate. In this study, the GROWA model is taken as an example to examine the effects of different types of uncertainty on the calculated groundwater recharge. Distributed input errors are determined for the parameters slope and aspect using a Monte Carlo approach. Land-cover classification uncertainties are analysed using the conditional probabilities of a remote sensing classification procedure. The uncertainties of data ensembles at different scales and study areas are discussed. The present uncertainty analysis showed that the Gaussian error propagation method is a useful technique for analysing the influence of input data on the simulated groundwater recharge. The uncertainties involved in the land use classification procedure and the digital elevation model can be significant in some parts of the study area. However, for the specific model used in this study it was shown that the precipitation uncertainties have the greatest impact on the total groundwater recharge error.
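
    The Gaussian error propagation the abstract credits is the first-order formula sigma_y^2 = sum_i (df/dx_i)^2 sigma_i^2. A toy comparison against Monte Carlo sampling, with a hypothetical recharge function standing in for the GROWA model:

        # First-order Gaussian error propagation vs. Monte Carlo sampling.
        import numpy as np

        def recharge(precip, runoff_coeff, et):
            # Stand-in recharge [mm]: rainfall minus evapotranspiration,
            # minus the fraction lost to runoff.
            return (precip - et) * (1.0 - runoff_coeff)

        mu = dict(precip=800.0, runoff_coeff=0.3, et=500.0)    # input means
        sig = dict(precip=60.0, runoff_coeff=0.05, et=40.0)    # input std devs

        # Partial derivatives of recharge at the mean point.
        dfdp = 1.0 - mu["runoff_coeff"]
        dfdrc = -(mu["precip"] - mu["et"])
        dfdet = -(1.0 - mu["runoff_coeff"])
        sigma_lin = np.sqrt((dfdp * sig["precip"]) ** 2
                            + (dfdrc * sig["runoff_coeff"]) ** 2
                            + (dfdet * sig["et"]) ** 2)

        rng = np.random.default_rng(3)
        mc = recharge(rng.normal(mu["precip"], sig["precip"], 100_000),
                      rng.normal(mu["runoff_coeff"], sig["runoff_coeff"], 100_000),
                      rng.normal(mu["et"], sig["et"], 100_000))
        print(f"linearised sigma: {sigma_lin:.1f} mm, Monte Carlo sigma: {mc.std():.1f} mm")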

  12. Uncertainty in Simulating Wheat Yields Under Climate Change

    Science.gov (United States)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.; hide

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain [1]. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate [2]. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models [1,3] are difficult [4]. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.

  13. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  14. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Science.gov (United States)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in the hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes. They therefore give increased importance to climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, there is a need to assess these uncertainties. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose, which combines a set of statistical downscaling methods including weather typing, weather generator, transfer function and advanced perturbation based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has been demonstrated so far on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily
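
    The "perturbation based approaches" in the tool's toolbox include, in their simplest form, the delta-change method: historical observations are rescaled by the relative change a climate model simulates between control and future periods. A minimal sketch (the change factor is invented for illustration):

        # Delta-change perturbation of an observed precipitation series.
        import numpy as np

        rng = np.random.default_rng(4)
        obs = rng.gamma(0.5, 6.0, 10 * 365)   # stand-in historical daily precip, mm

        ctrl_mean, fut_mean = 6.2, 7.1        # model control/future means (illustrative)
        delta = fut_mean / ctrl_mean          # multiplicative change factor

        future = obs * delta                  # perturbed series for impact modelling
        print(f"imposed mean change: {future.mean() / obs.mean() - 1:.1%}")

    More advanced perturbation approaches vary the factor by quantile so that extremes can change more strongly than the mean, which matters for the precipitation extremes this work targets.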

  15. The use of sequential indicator simulation to characterize geostatistical uncertainty

    International Nuclear Information System (INIS)

    Hansen, K.M.

    1992-10-01

    Sequential indicator simulation (SIS) is a geostatistical technique designed to aid in the characterization of uncertainty about the structure or behavior of natural systems. This report discusses a simulation experiment designed to study the quality of uncertainty bounds generated using SIS. The results indicate that, while SIS may produce reasonable uncertainty bounds in many situations, factors like the number and location of available sample data, the quality of variogram models produced by the user, and the characteristics of the geologic region to be modeled, can all have substantial effects on the accuracy and precision of estimated confidence limits. It is recommended that users of SIS conduct validation studies for the technique on their particular regions of interest before accepting the output uncertainty bounds

  16. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bhat, Kabekode Ghanasham [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  17. Uncertainty in Simulating Wheat Yields Under Climate Change

    Energy Technology Data Exchange (ETDEWEB)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O' Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.
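
    The crop-model-versus-climate-model comparison is, in essence, a variance decomposition over a models-by-GCMs impact matrix. A hedged sketch with synthetic numbers (the effect sizes are invented, chosen only so that crop-model spread dominates, as the abstract reports):

        # ANOVA-style main-effect variance shares for a synthetic impact matrix.
        import numpy as np

        rng = np.random.default_rng(5)
        n_crop, n_gcm = 27, 16
        crop_effect = rng.normal(0.0, 8.0, n_crop)   # % yield change per crop model
        gcm_effect = rng.normal(0.0, 3.0, n_gcm)     # % yield change per GCM
        impact = (crop_effect[:, None] + gcm_effect[None, :]
                  + rng.normal(0.0, 1.0, (n_crop, n_gcm)))

        var_crop = impact.mean(axis=1).var()   # spread of per-crop-model means
        var_gcm = impact.mean(axis=0).var()    # spread of per-GCM means
        total = impact.var()
        print(f"crop-model share: {var_crop / total:.0%}, GCM share: {var_gcm / total:.0%}")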

  18. Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data

    Science.gov (United States)

    Liu, N.; Liu, C.

    2017-12-01

    Extremely high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as their geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to the large seasonal and regional variations, the rare extreme precipitation rates often make a larger contribution to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in the retrieval of extreme precipitation might stem from the attenuation correction and from large uncertainties in the Z-R relationships that convert near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined by using collocated ground-based radar reflectivity and precipitation retrievals.
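
    The Z-R relationships in question are power laws linking the radar reflectivity factor Z (mm^6 m^-3) to the rain rate R (mm h^-1):

        Z \;=\; a\,R^{b}
        \qquad\Longleftrightarrow\qquad
        R \;=\; \left(\frac{Z}{a}\right)^{1/b},

    where, as a classic textbook example (not GPM's operational coefficients), the Marshall-Palmer values are a ≈ 200 and b ≈ 1.6. Because R depends on Z through the exponent 1/b, uncertainty in a and b is amplified precisely at the high reflectivities characteristic of extreme rain, which is the retrieval difficulty the abstract points to.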

  19. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
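
    The bootstrap step for the single hydrological parameter amounts to recalibrating on resampled data and reading the spread of the recalibrated values. A minimal sketch with a hypothetical one-parameter runoff model (not SHYREG itself):

        # Bootstrap confidence interval for a calibrated runoff coefficient.
        import numpy as np

        rng = np.random.default_rng(6)
        rain = rng.gamma(2.0, 10.0, 300)                  # event rainfall, mm
        flow = 0.65 * rain + rng.normal(0.0, 5.0, 300)    # "observed" runoff, mm

        def calibrate(r, q):
            # Least-squares runoff coefficient k in the model q = k * r.
            return float(np.sum(r * q) / np.sum(r * r))

        boot = []
        for _ in range(2000):
            idx = rng.integers(0, len(rain), len(rain))   # resample with replacement
            boot.append(calibrate(rain[idx], flow[idx]))
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(f"k = {calibrate(rain, flow):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")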

  20. Predicting the Extreme Loads on a Wind Turbine Considering Uncertainty in Airfoil Data

    DEFF Research Database (Denmark)

    Abdallah, Imad; Natarajan, Anand; Sørensen, John Dalsgaard

    2014-01-01

    The sources contributing to uncertainty in wind turbine blade static airfoil data include wind tunnel testing, CFD calculations, 3D rotational corrections based on CFD or empirical models, surface roughness corrections, Reynolds number corrections, expansion to the full 360-degree angle of attack...... range, validation by full scale measurements, and geometric distortions of the blade during manufacturing and under loading. In this paper a stochastic model of the static airfoil data is proposed to supplement the prediction of extreme load effects for large wind turbines. It is shown...... that the uncertainty in airfoil data can have a significant impact on the prediction of extreme load effects depending on the component, and the correlation along the span of the blade....

  1. Assessment of Observational Uncertainty in Extreme Precipitation Events over the Continental United States

    Science.gov (United States)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.

    2017-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and an intuitive means of monitoring extreme precipitation over time are critical. Towards this end, we apply an event-based indicator, developed as part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher-resolution datasets were found to resemble station data best and to capture a greater frequency of high-end extreme events relative to lower-resolution datasets. The degree of dataset agreement varies regionally; however, all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. While the event category threshold is fixed
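
    The event-based indicator described above reduces to computing running 3-day totals and binning them into categories. A sketch with invented thresholds (the NCA indicator's actual category bounds are not reproduced here):

        # Categorise 3-day precipitation totals; thresholds are hypothetical.
        import numpy as np

        rng = np.random.default_rng(7)
        daily = rng.gamma(0.5, 12.0, 365)                         # daily totals, mm
        three_day = np.convolve(daily, np.ones(3), mode="valid")  # running 3-day sums

        thresholds = [50.0, 75.0, 100.0, 150.0]   # mm per 3 days (invented)
        cat = np.digitize(three_day, thresholds)  # 0 = below lowest category
        for level in range(1, 5):
            # Overlapping windows are counted here; a real indicator would merge
            # contiguous exceedances into single events.
            print(f"category {level}+ windows: {int((cat >= level).sum())}")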

  2. Statistical uncertainty of extreme wind storms over Europe derived from a probabilistic clustering technique

    Science.gov (United States)

    Walz, Michael; Leckebusch, Gregor C.

    2016-04-01

    Extratropical wind storms pose one of the most dangerous and loss-intensive natural hazards for Europe. However, with only 50 years of high quality observational data, it is difficult to assess the statistical uncertainty of these sparse events based on observations alone. Over the last decade seasonal ensemble forecasts have become indispensable in quantifying the uncertainty of weather prediction on seasonal timescales. In this study seasonal forecasts are used in a climatological context: by making use of the up to 51 ensemble members, a broad and physically consistent statistical base can be created. This base can then be used to assess the statistical uncertainty of extreme wind storm occurrence more accurately. In order to determine the statistical uncertainty of storms with different paths of progression, a probabilistic clustering approach using regression mixture models is used to objectively assign storm tracks (based either on core pressure or on extreme wind speeds) to different clusters. The advantage of this technique is that the entire lifetime of a storm is considered by the clustering algorithm. Quadratic curves are found to describe the storm tracks most accurately. Three main clusters (diagonal, horizontal or vertical progression of the storm track) can be identified, each of which has its own particular features. Basic storm features like average velocity and duration are calculated and compared for each cluster. The main benefit of this clustering technique, however, is to evaluate whether the clusters show different degrees of uncertainty, e.g. more (less) spread for tracks approaching Europe horizontally (diagonally). This statistical uncertainty is compared for different seasonal forecast products.
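    The clustering idea can be caricatured in a few lines: fit a quadratic curve to each (synthetic) track, then group tracks by their fitted coefficients. The paper uses regression mixture models over whole storm lifetimes; plain k-means on polynomial coefficients, as below, is only a crude stand-in.

        # Crude stand-in for the probabilistic clustering described above: quadratic
        # least-squares fit per synthetic track, then k-means on the coefficients.
        import numpy as np
        from scipy.cluster.vq import kmeans2

        rng = np.random.default_rng(1)
        tracks = []
        for _ in range(60):
            lon = np.linspace(-40, 20, 25)  # west-to-east progression
            a, b, c = rng.normal(0, 0.01), rng.normal(0.3, 0.2), rng.normal(50, 5)
            lat = a * lon**2 + b * lon + c + rng.normal(0, 0.5, lon.size)
            tracks.append((lon, lat))

        coeffs = np.array([np.polyfit(lon, lat, deg=2) for lon, lat in tracks])
        _, labels = kmeans2(coeffs, 3, minit="++")
        print("tracks per cluster:", np.bincount(labels))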

  3. Wave Energy Converter Annual Energy Production Uncertainty Using Simulations

    Directory of Open Access Journals (Sweden)

    Clayton E. Hiles

    2016-09-01

    Full Text Available Critical to evaluating the economic viability of a wave energy project are: (1) a robust estimate of the electricity production throughout the project lifetime and (2) an understanding of the uncertainty associated with said estimate. Standardization efforts have established mean annual energy production (MAEP) as the metric for quantification of wave energy converter (WEC) electricity production and the performance matrix approach as the appropriate method for calculation. General acceptance of a method for calculating the MAEP uncertainty has not yet been achieved. Several authors have proposed methods based on the standard engineering approach to error propagation; however, a lack of available WEC deployment data has restricted testing of these methods. In this work the magnitude and sensitivity of MAEP uncertainty is investigated. The analysis is driven by data from simulated deployments of two WECs of different operating principles at four different locations. A Monte Carlo simulation approach is proposed for calculating the variability of MAEP estimates and is used to explore the sensitivity of the calculation. The uncertainty of MAEP ranged from 2%–20% of the mean value. Of the contributing uncertainties studied, the variability in the wave climate was found responsible for most of the uncertainty in MAEP. Uncertainty in MAEP differs considerably between WEC types and between deployment locations and is sensitive to the length of the input data-sets. This implies that if a certain maximum level of uncertainty in MAEP is targeted, the minimum required lengths of the input data-sets will be different for every WEC-location combination.
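    A minimal sketch of the Monte Carlo idea, assuming a toy power curve and a synthetic wave climate (neither corresponds to a real WEC or site): annual energy is summed from hourly power, and bootstrapping years yields the variability of the MAEP estimate.

        # Toy Monte Carlo sketch of MAEP variability; all values are invented.
        import numpy as np

        rng = np.random.default_rng(7)
        HOURS_PER_YEAR = 8766

        def wec_power(hs, te):
            """Hypothetical WEC power curve (kW) -- a placeholder, not a real device."""
            return np.clip(30.0 * hs**2 * te / 10.0, 0.0, 300.0)

        # Synthetic multi-year hourly wave climate
        n_years = 10
        n_hours = n_years * HOURS_PER_YEAR
        hs = rng.lognormal(mean=0.5, sigma=0.4, size=n_hours)         # wave height, m
        te = rng.normal(8.0, 1.5, size=n_hours).clip(3.0, 15.0)      # energy period, s
        years = np.repeat(np.arange(n_years), HOURS_PER_YEAR)

        annual_energy = np.array([wec_power(hs[years == y], te[years == y]).sum() / 1e3
                                  for y in range(n_years)])          # MWh per year

        # Bootstrap years to estimate the variability of the MAEP estimate
        maep_samples = [rng.choice(annual_energy, n_years, replace=True).mean()
                        for _ in range(5000)]
        print(f"MAEP = {annual_energy.mean():.0f} MWh, "
              f"std/mean = {100 * np.std(maep_samples) / annual_energy.mean():.1f}%")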

  4. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  5. Measurement, simulation and uncertainty assessment of implant heating during MRI

    International Nuclear Information System (INIS)

    Neufeld, E; Kuehn, S; Kuster, N; Szekely, G

    2009-01-01

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  7. Effect of monthly areal rainfall uncertainty on streamflow simulation

    Science.gov (United States)

    Ndiritu, J. G.; Mkhize, N.

    2017-08-01

    Areal rainfall is mostly obtained from point rainfall measurements that are sparsely located, and several studies have shown that this results in large areal rainfall uncertainties at the daily time step. However, water resources assessment is often carried out at a monthly time step, and streamflow simulation is usually an essential component of this assessment. This study set out to quantify monthly areal rainfall uncertainties and assess their effect on streamflow simulation. This was achieved by: (i) quantifying areal rainfall uncertainties and using these to generate stochastic monthly areal rainfalls, and (ii) finding out how the quality of monthly streamflow simulation and streamflow variability change if stochastic areal rainfalls are used instead of historic areal rainfalls. Tests on monthly rainfall uncertainty were carried out using data from two South African catchments, while streamflow simulation was confined to one of them. A non-parametric model that had been applied at a daily time step was used for stochastic areal rainfall generation, and the Pitman catchment model calibrated using the SCE-UA optimizer was used for streamflow simulation. 100 randomly-initialised calibration-validation runs using 100 stochastic areal rainfalls were compared with 100 runs obtained using the single historic areal rainfall series. By using 4 rain gauges alternately to obtain areal rainfall, the resulting differences in areal rainfall averaged 20% of the mean monthly areal rainfall, so rainfall uncertainty was highly significant. Pitman model simulations obtained coefficients of efficiency averaging 0.66 and 0.64 in calibration and validation using historic rainfalls, while the respective values using stochastic areal rainfalls were 0.59 and 0.57. Average bias was less than 5% in all cases. The streamflow ranges using historic rainfalls averaged 29% of the mean naturalised flow in calibration and validation and the respective average ranges using stochastic

  8. Uncertainty of input data for room acoustic simulations

    DEFF Research Database (Denmark)

    Jeong, Cheol-Ho; Marbjerg, Gerd; Brunskog, Jonas

    2016-01-01

    Although many room acoustic simulation models have been well established, simulation results will never be accurate with inaccurate and uncertain input data. This study addresses inappropriateness and uncertainty of input data for room acoustic simulations. Firstly, the random incidence absorption...... and scattering coefficients are insufficient when simulating highly non-diffuse rooms. More detailed information, such as the phase and angle dependence, can greatly improve the simulation results of pressure-based geometrical and wave-based models at frequencies well below the Schroeder frequency. Phase...... summarizes potential advanced absorption measurement techniques that can improve the quality of input data for room acoustic simulations. Lastly, plenty of uncertain input data are copied from unreliable sources. Software developers and users should be careful when spreading such uncertain input data. More...

  9. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    Science.gov (United States)

    He, F.

    2015-12-01

    This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent the convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six different TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT ensemble run and the LHS ensemble run shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effects. Lastly, we find that the intensity uncertainty caused by physical parameters is comparable in degree to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
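    For readers unfamiliar with the sampling step, the sketch below implements a basic Latin-Hypercube design in a few lines of numpy; the three parameter names and ranges are invented for illustration and are not the eight CAM parameters used in the study.

        # Hedged sketch of Latin-Hypercube Sampling over a few physics parameters.
        import numpy as np

        rng = np.random.default_rng(3)

        def latin_hypercube(n_samples, bounds):
            """One stratified sample per row; each column is an independent permutation."""
            d = len(bounds)
            u = (rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
                 + rng.uniform(size=(n_samples, d))) / n_samples
            lo = np.array([b[0] for b in bounds])
            hi = np.array([b[1] for b in bounds])
            return lo + u * (hi - lo)

        bounds = [(0.5e-3, 2e-3),   # e.g. an entrainment rate (hypothetical range)
                  (0.1, 0.9),       # e.g. a precipitation efficiency (hypothetical)
                  (300.0, 1800.0)]  # e.g. a convective timescale in s (hypothetical)
        print(latin_hypercube(8, bounds))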

  10. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data are most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  11. Risk Formulations versus Comprehensive Uncertainty Characterizations for Climate Extremes and their Impacts

    Science.gov (United States)

    Parish, E. S.; Ganguly, A. R.

    2009-12-01

    Climate extremes—defined inclusively as extreme hydro-meteorological events and regional changes in climate patterns at decadal scales—and their impacts on natural, engineered or human systems represent some of the most significant knowledge gaps in climate prediction and integrated assessments in a post-AR4 world. Risks and uncertainties are related but distinct concepts. However, their relevance to decision-support tools in the context of climate change is indisputable. The opportunities and challenges are presented with case studies developed for stakeholders and policy makers.

  12. Decision strategies for handling the uncertainty of future extreme rainfall under the influence of climate change

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Arnbjerg-Nielsen, Karsten

    2012-01-01

    Several extraordinary rainfall events have occurred in Denmark within the last few years. For each event, problems occurred in urban areas as the capacity of the existing drainage systems was exceeded. Adaptation to climate change is necessary but also very challenging, as urban drainage systems...... are characterized by long technical lifetimes and high, unrecoverable construction costs. One of the most important barriers to the initiation and implementation of adaptation strategies is therefore the uncertainty in predicting the magnitude of extreme rainfall in the future. This challenge is explored...

  13. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan

    Regional Climate Models (RCMs) and General Circulation Models (GCMs). These multi-model ensembles provide the information needed to estimate probabilistic climate change projections. Several probabilistic methods have been suggested. One common assumption in most of these methods is that the climate models...... are independent. The effects of this assumption on the uncertainty quantification of extreme rainfall projections are addressed in this study. First, the interdependency of the 95% quantile of wet days in the ENSEMBLES RCMs is estimated. For this statistic and the region studied, the RCMs cannot be assumed...

  14. Wind Simulation for Extreme and Fatigue Loads

    DEFF Research Database (Denmark)

    Nielsen, Morten; Larsen, Gunner Chr.; Mann, Jakob

    2003-01-01

    Measurements of atmospheric turbulence have been studied and found to deviate from a Gaussian process, in particular regarding the velocity increments over small time steps, where the tails of the pdf are exponential rather than Gaussian. Principles for extreme event counting and the occurrence...... by many orders of magnitude, mainly because the measured pdf is non-Gaussian. Methods for simulation of turbulent signals have been developed and their computational efficiency is considered. The methods are applicable to multiple processes with individual spectra and probability distributions. Non...... is formulated as the variational problem of finding the most probable adjustment of a standard simulation of a stationary Gaussian process subject to relevant event conditions, which are formulated as linear combinations of points in the realization. The method is generalized for multiple correlated series......

  15. Observed and simulated temperature extremes during the recent warming hiatus

    International Nuclear Information System (INIS)

    Sillmann, Jana; Donat, Markus G; Fyfe, John C; Zwiers, Francis W

    2014-01-01

    The discrepancy between recent observed and simulated trends in global mean surface temperature has provoked a debate about possible causes and implications for future climate change projections. However, little has been said in this discussion about observed and simulated trends in global temperature extremes. Here we assess trend patterns in temperature extremes and evaluate the consistency between observed and simulated temperature extremes over the past four decades (1971–2010) in comparison to the recent 15 years (1996–2010). We consider the coldest night and warmest day in a year in the observational dataset HadEX2 and in the current generation of global climate models (CMIP5). In general, the observed trends fall within the simulated range of trends, with better consistency for the longer period. Spatial trend patterns differ for the warm and cold extremes, with the warm extremes showing continuous positive trends across the globe and the cold extremes exhibiting a coherent cooling pattern across the Northern Hemisphere mid-latitudes that has emerged in the recent 15 years and is not reproduced by the models. This regional inconsistency between models and observations might be a key to understanding the recent hiatus in global mean temperature warming.
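    The two indices are easy to reproduce: TNn is the annual minimum of daily minimum temperature and TXx the annual maximum of daily maximum temperature. The sketch below computes both, plus a simple linear trend, from synthetic daily data.

        # Sketch of the TNn/TXx indices on synthetic daily temperatures.
        import numpy as np

        rng = np.random.default_rng(10)
        years = np.arange(1971, 2011)
        doy = np.arange(365)
        season = -10.0 * np.cos(2 * np.pi * doy / 365)  # idealized seasonal cycle, deg C

        tmax = season[None, :] + 8.0 + rng.normal(0, 3, (years.size, 365))
        tmin = season[None, :] - 2.0 + rng.normal(0, 3, (years.size, 365))

        txx = tmax.max(axis=1)  # warmest day each year
        tnn = tmin.min(axis=1)  # coldest night each year

        for name, series in [("TXx", txx), ("TNn", tnn)]:
            slope = np.polyfit(years, series, 1)[0]
            print(f"{name} trend: {10 * slope:+.2f} deg C per decade")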

  16. Uncertainty quantification in ion–solid interaction simulations

    Energy Technology Data Exchange (ETDEWEB)

    Preuss, R., E-mail: preuss@ipp.mpg.de; Toussaint, U. von

    2017-02-15

    Within the framework of Bayesian uncertainty quantification we propose a non-intrusive reduced-order spectral approach (polynomial chaos expansion) to the simulation of ion–solid interactions. The method not only reduces the number of function evaluations but simultaneously provides a quantitative measure of which combinations of inputs have the most important impact on the result. It is applied to SDTRIM simulations (Möller et al., 1988) with several uncertain and Gaussian distributed input parameters (i.e. angle, projectile energy, surface binding energy, target composition), and the results are compared to full-grid based approaches and sampling based methods with respect to reliability, efficiency and scalability.
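    A minimal non-intrusive PCE sketch for a single standardized Gaussian input, with a toy function standing in for the SDTRIM code: sample the model, fit probabilists' Hermite polynomials by least squares, and read the mean and variance off the coefficients (orthogonality gives Var as the sum of c_k^2 * k! over k >= 1).

        # Non-intrusive polynomial chaos sketch; the 'model' is a toy stand-in.
        import numpy as np
        from numpy.polynomial.hermite_e import hermevander
        from math import factorial

        rng = np.random.default_rng(5)

        def model(x):  # placeholder for an expensive simulation
            return np.sin(0.5 * x) + 0.1 * x**2

        degree, n_train = 6, 200
        xi = rng.standard_normal(n_train)  # standardized Gaussian input
        V = hermevander(xi, degree)        # design matrix He_0..He_6
        coef, *_ = np.linalg.lstsq(V, model(xi), rcond=None)

        mean = coef[0]
        var = sum(coef[k]**2 * factorial(k) for k in range(1, degree + 1))
        mc = model(rng.standard_normal(200000))
        print(f"PCE mean {mean:.4f} vs MC {mc.mean():.4f}; "
              f"PCE var {var:.4f} vs MC {mc.var():.4f}")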

  17. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
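    Steps (3) and (4) of the outline above reduce to a regression exercise; the hedged sketch below, on synthetic data, fits a linear bias model against gold-standard values and models precision as the spread of the residuals.

        # Sketch of the regression-based bias/precision estimation; data are synthetic.
        import numpy as np

        rng = np.random.default_rng(11)
        true_mass = rng.uniform(1.0, 50.0, size=30)                    # gold-standard values, g
        measured = 1.08 * true_mass + rng.normal(0, 0.05 * true_mass)  # biased, heteroscedastic

        # Bias model: measured = a + b * true
        b, a = np.polyfit(true_mass, measured, deg=1)
        residuals = measured - (a + b * true_mass)

        # Precision model: residual spread as a function of the true value
        s, s0 = np.polyfit(true_mass, np.abs(residuals), deg=1)
        print(f"bias: measured = {a:.2f} + {b:.3f} * true; "
              f"|residual| = {s0:.2f} + {s:.3f} * true")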

  18. Optimizing Grippers for Compensating Pose Uncertainties by Dynamic Simulation

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Kramberger, Aljaž; Gams, Andrej

    2017-01-01

    Gripper design process is one of the interesting challenges in the context of grasping within industry. Typically, simple parallel-finger grippers, which are easy to install and maintain, are used in platforms for robotic grasping. The context switches in these platforms require frequent exchange......, we have presented a method to automatically compute the optimal finger shapes for defined task contexts in simulation. In this paper, we show the performance of our method in an industrial grasping scenario. We first analyze the uncertainties of the used vision system, which are the major source...

  19. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for
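    One of the air-temperature-based diagnostics mentioned above can be illustrated with the frost number F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)), where DDF and DDT are freezing and thawing degree-day sums; permafrost is commonly diagnosed where F exceeds 0.5. The monthly temperatures below are invented, and the exact diagnostic used by each model group may differ.

        # Hedged sketch of a frost-index permafrost diagnostic from monthly air temperature.
        import numpy as np

        monthly_t = np.array([-18, -15, -9, -1, 4, 9, 12, 11, 6, -2, -10, -16], float)  # deg C
        days = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31], float)

        ddf = np.sum(np.where(monthly_t < 0, -monthly_t * days, 0.0))  # freezing degree-days
        ddt = np.sum(np.where(monthly_t > 0, monthly_t * days, 0.0))   # thawing degree-days
        frost_index = np.sqrt(ddf) / (np.sqrt(ddf) + np.sqrt(ddt))
        print(f"F = {frost_index:.2f} -> "
              f"{'permafrost' if frost_index > 0.5 else 'no permafrost'}")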

  20. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  1. A Network Contention Model for the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2015-01-01

    The Extreme-scale Simulator (xSim) is a performance investigation toolkit for high-performance computing (HPC) hardware/software co-design. It permits running an HPC application with millions of concurrent execution threads, while observing its performance in a simulated extreme-scale system. This paper details a newly developed network modeling feature for xSim, eliminating the shortcomings of the existing network modeling capabilities. The approach takes a different path for implementing network contention and bandwidth capacity modeling, using a less synchronous, yet sufficiently accurate, model design. With the new network modeling feature, xSim is able to simulate on-chip and on-node networks with reasonable accuracy and overheads.

  2. An Indirect Simulation-Optimization Model for Determining Optimal TMDL Allocation under Uncertainty

    Directory of Open Access Journals (Sweden)

    Feng Zhou

    2015-11-01

    Full Text Available An indirect simulation-optimization model framework with enhanced computational efficiency and risk-based decision-making capability was developed to determine optimal total maximum daily load (TMDL) allocation under uncertainty. To convert the traditional direct simulation-optimization model into our indirect equivalent model framework, we proposed a two-step strategy: (1) application of interval regression equations derived by a Bayesian recursive regression tree (BRRT v2) algorithm, which approximates the original hydrodynamic and water-quality simulation models and accurately quantifies the inherent nonlinear relationship between nutrient load reductions and the credible interval of algal biomass with a given confidence interval; and (2) incorporation of the calibrated interval regression equations into an uncertain optimization framework, which is further converted to our indirect equivalent framework by the enhanced-interval linear programming (EILP) method and provides approximate-optimal solutions at various risk levels. The proposed strategy was applied to the Swift Creek Reservoir's nutrient TMDL allocation (Chesterfield County, VA) to identify the minimum nutrient load allocations required from eight sub-watersheds to ensure compliance with user-specified chlorophyll criteria. Our results indicated that the BRRT-EILP model could identify critical sub-watersheds faster than the traditional approach and requires lower reductions in nutrient loadings compared to traditional stochastic simulation and trial-and-error (TAE) approaches. This suggests that our proposed framework performs better in optimal TMDL development compared to the traditional simulation-optimization models and provides extreme and non-extreme tradeoff analysis under uncertainty for risk-based decision making.

  3. Propagation of radar rainfall uncertainty in urban flood simulations

    Science.gov (United States)

    Liguori, Sara; Rico-Ramirez, Miguel

    2013-04-01

    A hydrodynamic sewer network model implemented in the Infoworks software was used to model the rainfall-runoff process in the urban area. The software calculates the flow through the sewer conduits of the urban model using rainfall as the primary input. The sewer network is covered by 25 radar pixels with a spatial resolution of 1 km². The majority of the sewer system is combined, carrying both urban rainfall runoff as well as domestic and trade waste water [11]. The urban model was configured to receive the probabilistic radar rainfall fields. The results showed that the radar rainfall ensembles provide additional information about the uncertainty in the radar rainfall measurements that can be propagated in urban flood modelling. The peaks of the measured flow hydrographs are often bounded within the uncertainty area produced by using the radar rainfall ensembles. This is in fact one of the benefits of using radar rainfall ensembles in urban flood modelling. More work needs to be done in improving the urban models, but this is out of the scope of this research. The rainfall uncertainty cannot explain the whole uncertainty shown in the flow simulations, and additional sources of uncertainty will come from the structure of the urban models as well as the large number of parameters required by these models. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and the UK Environment Agency for providing the various data sets. We also thank Yorkshire Water Services Ltd for providing the urban model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1. References [1] Browning KA, 1978. Meteorological applications of radar. Reports on Progress in Physics 41 761, doi:10.1088/0034-4885/41/5/003. [2] Rico-Ramirez MA, Cluckie ID, Shepherd G, Pallot A, 2007. A high-resolution radar experiment on the island of Jersey. Meteorological Applications 14: 117-129. [3] Villarini G, Krajewski WF

  4. An Uncertainty Structure Matrix for Models and Simulations

    Science.gov (United States)

    Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.

    2008-01-01

    Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort are closely correlated with the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.

  5. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  6. Simulation of Extreme Arctic Cyclones in IPCC AR5 Experiments

    Science.gov (United States)

    Vavrus, S. J.

    2012-12-01

    Although impending Arctic climate change is widely recognized, a wild card in its expression is how extreme weather events in this region will respond to greenhouse warming. Intense polar cyclones represent one type of high-latitude phenomenon falling into this category, including very deep synoptic-scale cyclones and mesoscale polar lows. These systems inflict damage through high winds, heavy precipitation, and wave action along coastlines, and their impact is expected to expand in the future, when reduced sea ice cover allows enhanced wave energy. The loss of a buffering ice pack could greatly increase the rate of coastal erosion, which has already been increasing in the Arctic. These and related threats may be amplified if extreme Arctic cyclones become more frequent and/or intense in a warming climate with much more open water to fuel them. This possibility has merit on the basis of GCM experiments, which project that greenhouse forcing causes lower mean sea level pressure (SLP) in the Arctic and a strengthening of the deepest storms over boreal high latitudes. In this study, the latest Coupled Model Intercomparison Project (CMIP5) climate model output is used to investigate the following questions: (1) What are the spatial and seasonal characteristics of extreme Arctic cyclones? (2) How well do GCMs simulate these phenomena? (3) Are Arctic cyclones already showing the expected response to greenhouse warming in climate models? To address these questions, a retrospective analysis is conducted of the transient 20th century simulations among the CMIP5 GCMs (spanning years 1850-2005). The results demonstrate that GCMs are able to reasonably represent extreme Arctic cyclones and that the simulated characteristics do not depend significantly on model resolution. Consistent with observational evidence, climate models generate these storms primarily during winter and within the climatological Aleutian and Icelandic Low regions. Occasionally the cyclones remain very intense

  8. Improving the Performance of the Extreme-scale Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Engelmann, Christian [ORNL; Naughton III, Thomas J [ORNL

    2014-01-01

    Investigating the performance of parallel applications at scale on future high-performance computing (HPC) architectures and the performance impact of different architecture choices is an important component of HPC hardware/software co-design. The Extreme-scale Simulator (xSim) is a simulation-based toolkit for investigating the performance of parallel applications at scale. xSim scales to millions of simulated Message Passing Interface (MPI) processes. The overhead introduced by a simulation tool is an important performance and productivity aspect. This paper documents two improvements to xSim: (1) a new deadlock resolution protocol to reduce the parallel discrete event simulation management overhead and (2) a new simulated MPI message matching algorithm to reduce the oversubscription management overhead. The results clearly show a significant performance improvement, such as reducing the simulation overhead for running the NAS Parallel Benchmark suite inside the simulator from 1,020% to 238% for the conjugate gradient (CG) benchmark and from 102% to 0% for the embarrassingly parallel (EP) benchmark, as well as from 37,511% to 13,808% for CG and from 3,332% to 204% for EP with accurate process failure simulation.

  9. Evaluation of uncertainty in capturing the spatial variability and magnitudes of extreme hydrological events for the uMngeni catchment, South Africa

    Science.gov (United States)

    Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer

    2018-02-01

    Output from downscaled General Circulation Models (GCMs) is used to forecast climate change and provides information used as input for hydrological modelling. Given that our understanding of climate change points towards changes in the frequency, timing and intensity of extreme hydrological events, there is a need to assess the ability of downscaled GCMs to capture these extreme hydrological events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of streamflow simulated using downscaled GCMs (dGCMs) to capture extreme river dynamics (high and low flows), as compared to streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used to simulate streamflow for the 13 water management units of the uMngeni Catchment, South Africa. Statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town were used as input for the ACRU model. Results indicated that high flows and extreme high flows (one in ten year high flows/large flood events) were poorly represented in terms of timing, frequency and magnitude. Simulated streamflow using dGCM data also captures more low flows and extreme low flows (one in ten year lowest flows) than are captured in streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly when simulating extreme high and low flows. Streamflow simulation from dGCMs must thus be used with caution in hydrological applications, particularly for design hydrology, as extreme high and low flows are still poorly represented. This arguably calls for further improvement of downscaling techniques in order to generate climate data more relevant and
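    The high/low-flow part of the IHA comparison boils down to extracting annual extremes from daily series. A hedged sketch with synthetic stand-ins for the historical-climate and dGCM-driven simulations:

        # IHA-style annual 1-day max/min flows for two synthetic daily flow series.
        import numpy as np

        rng = np.random.default_rng(2)
        n_years = 30
        q_hist = rng.lognormal(2.0, 0.8, size=(n_years, 365))  # 'historical-climate' flows
        q_gcm = rng.lognormal(2.0, 0.6, size=(n_years, 365))   # 'dGCM-driven' flows (less variable)

        for name, q in [("historical", q_hist), ("dGCM", q_gcm)]:
            print(f"{name}: mean annual 1-day max = {q.max(axis=1).mean():6.1f}, "
                  f"mean annual 1-day min = {q.min(axis=1).mean():5.2f}")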

  10. Uncertainties of the 50-year wind from short time series using generalized extreme value distribution and generalized Pareto distribution

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Mann, Jakob; Rathmann, Ole

    2015-01-01

    This study examines the various sources of uncertainty in the application of two widely used extreme value distribution functions, the generalized extreme value distribution (GEVD) and the generalized Pareto distribution (GPD). The study is done through the analysis of measurements from...... as a guideline for applying GEVD and GPD to wind time series of limited length. The data analysis shows that, with a reasonable choice of the relevant parameters, GEVD and GPD give consistent estimates of the return winds. For GEVD, the base period should be chosen in accordance with the occurrence of the extreme wind...
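    For the GEVD branch, the sketch below fits synthetic annual-maximum winds with scipy.stats.genextreme, computes the 50-year return wind as the (1 - 1/50) quantile, and uses a small parametric bootstrap to expose the sampling uncertainty of a short record; all numbers are invented.

        # GEV fit and 50-year return wind on synthetic annual maxima.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(9)
        annual_max_wind = genextreme.rvs(c=0.1, loc=24.0, scale=3.0, size=25,
                                         random_state=rng)  # 25 'years', m/s

        shape, loc, scale = genextreme.fit(annual_max_wind)
        u50 = genextreme.ppf(1.0 - 1.0 / 50.0, shape, loc=loc, scale=scale)
        print(f"fitted shape={shape:.2f}, U50 = {u50:.1f} m/s")

        # Parametric bootstrap: refit resampled records to see the spread of U50
        boot = [genextreme.ppf(1 - 1 / 50, *genextreme.fit(
                    genextreme.rvs(shape, loc=loc, scale=scale, size=25, random_state=rng)))
                for _ in range(200)]
        print(f"U50 spread (std) from 25-year records: {np.std(boot):.1f} m/s")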

  11. Simulating merging binary black holes with nearly extremal spins

    International Nuclear Information System (INIS)

    Lovelace, Geoffrey; Scheel, Mark A.; Szilagyi, Bela

    2011-01-01

    Astrophysically realistic black holes may have spins that are nearly extremal (i.e., close to 1 in dimensionless units). Numerical simulations of binary black holes are important tools both for calibrating analytical templates for gravitational-wave detection and for exploring the nonlinear dynamics of curved spacetime. However, all previous simulations of binary-black-hole inspiral, merger, and ringdown have been limited by an apparently insurmountable barrier: the merging holes' spins could not exceed 0.93, which is still a long way from the maximum possible value in terms of the physical effects of the spin. In this paper, we surpass this limit for the first time, opening the way to explore numerically the behavior of merging, nearly extremal black holes. Specifically, using an improved initial-data method suitable for binary black holes with nearly extremal spins, we simulate the inspiral (through 12.5 orbits), merger and ringdown of two equal-mass black holes with equal spins of magnitude 0.95 antialigned with the orbital angular momentum.

  12. Improving plot- and regional-scale crop models for simulating impacts of climate variability and extremes

    Science.gov (United States)

    Tao, F.; Rötter, R.

    2013-12-01

    Many studies on global climate report that climate variability is increasing with more frequent and intense extreme events [1]. There are quite large uncertainties from both the plot- and regional-scale models in simulating impacts of climate variability and extremes on crop development, growth and productivity [2,3]. One key to reducing the uncertainties is better exploitation of experimental data to eliminate crop model deficiencies and develop better algorithms that more adequately capture the impacts of extreme events, such as high temperature and drought, on crop performance [4,5]. In the present study, in a first step, the inter-annual variability in wheat yield and climate from 1971 to 2012 in Finland was investigated. Using statistical approaches the impacts of climate variability and extremes on wheat growth and productivity were quantified. In a second step, a plot-scale model, WOFOST [6], and a regional-scale crop model, MCWLA [7], were calibrated and validated, and applied to simulate wheat growth and yield variability from 1971 to 2012. Next, the estimated impacts of high temperature stress, cold damage, and drought stress on crop growth and productivity based on the statistical approaches and on the crop simulation models WOFOST and MCWLA were compared. Then, the impact mechanisms of climate extremes on crop growth and productivity in the WOFOST model and MCWLA model were identified, and subsequently the various algorithms and impact functions were fitted against the long-term crop trial data. Finally, the impact mechanisms, algorithms and functions in the WOFOST model and MCWLA model were improved to better simulate the impacts of climate variability and extremes, particularly high temperature stress, cold damage and drought stress, for location-specific and large area climate impact assessments. Our studies provide a good example of how to improve, in parallel, the plot- and regional-scale models for simulating impacts of climate variability and extremes, as needed for

  13. Extreme climate in China. Facts, simulation and projection

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Hui-Jun; Sun, Jian-Qi; Chen, Huo-Po; Zhu, Ya-Li; Zhang, Ying; Jiang, Da-Bang; Lang, Xian-Mei; Fan, Ke; Yu, En-Tao [Chinese Academy of Sciences, Beijing (China). Inst. of Atmospheric Physics; Yang, Song [NOAA Climate Prediction Center, Camp Springs, MD (United States)

    2012-06-15

    In this paper, studies on extreme climate in China, including extreme temperature and precipitation, dust weather activity, tropical cyclone activity, intense snowfall and cold surge activity, floods, and droughts, are reviewed based on peer-reviewed publications of recent decades. The review focuses first on the climatological features, variability, and trends over the past half century and then on simulations and projections based on global and regional climate models. As the annual mean surface air temperature (SAT) increased throughout China, heat wave intensity and frequency overall increased in the past half century, with a larger rate after the 1980s. The daily or yearly minimum SAT increased more significantly than the mean or maximum SAT. The long-term change in precipitation is predominantly characterized by the so-called southern flood and northern drought pattern in eastern China and by the overall increase over Northwest China. The interdecadal variation of the monsoon, represented by the monsoon weakening at the end of the 1970s, is largely responsible for this change in mean precipitation. Precipitation-related extreme events (e.g., heavy rainfall and intense snowfall) have generally become more frequent and intense over China in recent years, with large spatial variability. Dust weather activity, however, has become less frequent over northern China in recent years, as a result of weakened cold surge activity, reinforced precipitation, and improved vegetation condition. State-of-the-art climate models are capable of reproducing some features of the mean climate and extreme climate events. However, discrepancies among models in simulating and projecting the mean and extreme climate are also demonstrated by many recent studies. Regional models with higher resolutions often perform better than global models. To predict and project climate variations and extremes, many new approaches and schemes based on dynamical models, statistical methods, or their

  14. Evaluation of global fine-resolution precipitation products and their uncertainty quantification in ensemble discharge simulations

    Science.gov (United States)

    Qi, W.; Zhang, C.; Fu, G.; Sweetapple, C.; Zhou, H.

    2016-02-01

    The applicability of six fine-resolution precipitation products, including precipitation radar, infrared, microwave and gauge-based products using different precipitation computation recipes, is evaluated using statistical and hydrological methods in northeastern China. In addition, a framework quantifying the uncertainty contributions of precipitation products, hydrological models, and their interactions to uncertainties in ensemble discharges is proposed. The investigated precipitation products are the Tropical Rainfall Measuring Mission (TRMM) products (TRMM3B42 and TRMM3B42RT), Global Land Data Assimilation System (GLDAS)/Noah, Asian Precipitation - Highly-Resolved Observational Data Integration Towards Evaluation of Water Resources (APHRODITE), Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), and a Global Satellite Mapping of Precipitation (GSMAP-MVK+) product. Two hydrological models of different complexities, i.e. a water and energy budget-based distributed hydrological model and a physically based semi-distributed hydrological model, are employed to investigate the influence of hydrological models on simulated discharges. Results show APHRODITE has high accuracy at a monthly scale compared with the other products, and GSMAP-MVK+ shows a clear advantage over TRMM3B42 in relative bias (RB), Nash-Sutcliffe coefficient of efficiency (NSE), root mean square error (RMSE), correlation coefficient (CC), false alarm ratio, and critical success index. These findings could be very useful for the validation, refinement, and future development of satellite-based products (e.g. NASA Global Precipitation Measurement). Although large uncertainty exists in heavy precipitation, hydrological models contribute most of the uncertainty in extreme discharges. Interactions between precipitation products and hydrological models can contribute a similar magnitude of discharge uncertainty as the hydrological models themselves. A
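    The four headline scores are one-liners; the sketch below computes RB, NSE, RMSE and CC for a synthetic product-versus-gauge pair (the data are invented, not any of the six products).

        # Evaluation statistics (RB, NSE, RMSE, CC) on synthetic monthly data.
        import numpy as np

        rng = np.random.default_rng(4)
        obs = rng.gamma(2.0, 3.0, size=120)                     # monthly gauge precipitation
        sim = obs * 1.1 + rng.normal(0.0, 2.0, size=obs.size)   # a biased, noisy product

        rb = (sim.sum() - obs.sum()) / obs.sum()                            # relative bias
        nse = 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)  # Nash-Sutcliffe
        rmse = np.sqrt(np.mean((sim - obs)**2))
        cc = np.corrcoef(obs, sim)[0, 1]
        print(f"RB={rb:+.1%}  NSE={nse:.2f}  RMSE={rmse:.2f}  CC={cc:.2f}")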

  15. Derivative-free optimization under uncertainty applied to costly simulators

    International Nuclear Information System (INIS)

    Pauwels, Benoit

    2016-01-01

    The modeling of complex phenomena encountered in industrial issues can lead to the study of numerical simulation codes. These simulators may require extensive execution time (from hours to days), involve uncertain parameters and even be intrinsically stochastic. Importantly, within the context of simulation-based optimization, the derivatives of the outputs with respect to the inputs may be nonexistent, inaccessible or too costly to approximate reasonably. This thesis is organized in four chapters. The first chapter discusses the state of the art in derivative-free optimization and uncertainty modeling. The next three chapters introduce three independent - although connected - contributions to the field of derivative-free optimization in the presence of uncertainty. The second chapter addresses the emulation of costly stochastic simulation codes - stochastic in the sense that simulations run with the same input parameters may lead to distinct outputs. This was the subject of the CODESTOCH project carried out at the Summer mathematical research center on scientific computing and its applications (CEMRACS) during the summer of 2013, together with two Ph.D. students from Electricity of France (EDF) and the Atomic Energy and Alternative Energies Commission (CEA). We designed four methods to build emulators for functions whose values are probability density functions. These methods were tested on two toy functions and applied to industrial simulation codes concerned with three complex phenomena: the spatial distribution of molecules in a hydrocarbon system (IFPEN), the life cycle of large electric transformers (EDF) and the repercussions of a hypothetical accident in a nuclear plant (CEA). Emulation was a preliminary step towards optimization in the first two cases. In the third chapter we consider the influence of inaccurate objective function evaluations on direct search - a classical derivative-free optimization method. In real settings inaccuracy may never vanish

  16. Lower extremity finite element model for crash simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schauer, D.A.; Perfect, S.A.

    1996-03-01

    A lower extremity model has been developed to study occupant injury mechanisms of the major bones and ligamentous soft tissues resulting from vehicle collisions. The model is based on anatomically correct digitized bone surfaces of the pelvis, femur, patella and the tibia. Many muscles, tendons and ligaments were incrementally added to the basic bone model. We have simulated two types of occupant loading that occur in a crash environment using a non-linear large deformation finite element code. The modeling approach assumed that the leg was passive during its response to the excitation, that is, no active muscular contraction and therefore no active change in limb stiffness. The approach recognized that the most important contributions of the muscles to the lower extremity response are their ability to define and modify the impedance of the limb. When nonlinear material behavior in a component of the leg model was deemed important to response, a nonlinear constitutive model was incorporated. The accuracy of these assumptions can be verified only through a review of analysis results and careful comparison with test data. As currently defined, the model meets the objective for which it was created. Much work remains to be done, both from modeling and analysis perspectives, before the model can be considered complete. The model implements a modeling philosophy that can accurately capture both kinematic and kinetic response of the lower limb. We have demonstrated that the lower extremity model is a valuable tool for understanding the injury processes and mechanisms. We are now in a position to extend the computer simulation to investigate the clinical fracture patterns observed in actual crashes. Additional experience with this model will enable us to make a statement on what measures are needed to significantly reduce lower extremity injuries in vehicle crashes. 6 refs.

  17. Communicating Climate Uncertainties: Challenges and Opportunities Related to Spatial Scales, Extreme Events, and the Warming 'Hiatus'

    Science.gov (United States)

    Casola, J. H.; Huber, D.

    2013-12-01

    Many media, academic, government, and advocacy organizations have achieved sophistication in developing effective messages based on scientific information, and can quickly translate salient aspects of emerging climate research and evolving observations. However, there are several ways in which valid messages can be misconstrued by decision makers, leading them to inaccurate conclusions about the risks associated with climate impacts. Three cases will be discussed: 1) Issues of spatial scale in interpreting climate observations: Local climate observations may contradict summary statements about the effects of climate change on larger regional or global spatial scales. Effectively addressing these differences often requires communicators to understand local and regional climate drivers, and the distinction between a 'signal' associated with climate change and local climate 'noise.' Hydrological statistics in Missouri and California are shown to illustrate this case. 2) Issues of complexity related to extreme events: Climate change is typically invoked following a wide range of damaging meteorological events (e.g., heat waves, landfalling hurricanes, tornadoes), regardless of the strength of the relationship between anthropogenic climate change and the frequency or severity of that type of event. Examples are drawn from media coverage of several recent events, contrasting useful and potentially confusing word choices and frames. 3) Issues revolving around climate sensitivity: The so-called 'pause' or 'hiatus' in global warming has reverberated strongly through political and business discussions of climate change. Addressing the recent slowdown in warming yields an important opportunity to raise climate literacy in these communities. Attempts to use recent observations as a wedge between climate 'believers' and 'deniers' are likely to be counterproductive. Examples are drawn from Congressional testimony and media stories. All three cases illustrate ways that decision

  18. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate, but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite great improvements in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low-Fidelity (LF) and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.
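
    The control-variate idea behind these multi-level/multi-fidelity estimators fits in a few lines of code. The sketch below is a toy illustration under invented assumptions: model_hf and model_lf are hypothetical stand-ins for the expensive and cheap solvers, and the sampling budgets are arbitrary; it is not the PSAAP II code.

```python
import numpy as np

rng = np.random.default_rng(0)

def model_hf(x):
    # stand-in for an expensive high-fidelity simulation
    return np.sin(3.0 * x) + 0.05 * x**2

def model_lf(x):
    # cheap low-fidelity surrogate, correlated with the HF model
    return np.sin(3.0 * x)

# small budget of HF runs, large budget of additional LF runs
x_paired = rng.normal(size=50)      # inputs where both models are evaluated
x_extra = rng.normal(size=5000)     # inputs for the LF model only

y_hf = model_hf(x_paired)
y_lf = model_lf(x_paired)
y_lf_extra = model_lf(x_extra)

# control-variate coefficient estimated from the paired runs
c = np.cov(y_hf, y_lf)
alpha = c[0, 1] / c[1, 1]

# multi-fidelity estimator: HF mean corrected by the LF discrepancy
mf_mean = y_hf.mean() + alpha * (y_lf_extra.mean() - y_lf.mean())
print(f"plain MC (50 HF runs):   {y_hf.mean():.4f}")
print(f"multi-fidelity estimate: {mf_mean:.4f}")
```

    Because the low-fidelity term enters only as a correction, the estimator remains essentially unbiased for the high-fidelity mean while inheriting much of the variance reduction of the cheap samples.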

  19. An evaluation of the uncertainty of extreme events statistics at the WMO/CIMO Lead Centre on precipitation intensity

    Science.gov (United States)

    Colli, M.; Lanza, L. G.; La Barbera, P.

    2012-12-01

    the uncertainty budget of modern rain gauges is also shown. The analysis proceeds with the laboratory simulation of the annual maximum rainfall events recorded for different durations at the Villa Cambiaso meteo-station (University of Genova) over the last two decades. Results are reported and discussed in a comparative form involving the derived extreme events statistics. REFERENCES: La Barbera P., Lanza L.G. and Stagi L. (2002). Influence of systematic mechanical errors of tipping-bucket rain gauges on the statistics of rainfall extremes. Water Sci. Techn., 45(2), 1-9. Colli M., Lanza L.G. and Chan P.W. (2011). Co-located tipping-bucket and optical drop counter RI measurements and a simulated correction algorithm. Atmos. Res., doi:10.1016/j.atmosres.2011.07.018. Colli M., Lanza L.G. and La Barbera P. (2012). Weighing gauges measurement errors and the design rainfall for urban scale applications. 9th International Workshop on Precipitation in Urban Areas, St. Moritz, Switzerland, 6-9 December 2012. Lanza L.G. and Vuerich E. (2009). The WMO Field Intercomparison of Rain Intensity Gauges. Atmos. Res., 94, 534-543.

  20. Uncertainty in Measurement: A Review of Monte Carlo Simulation Using Microsoft Excel for the Calculation of Uncertainties Through Functional Relationships, Including Uncertainties in Empirically Derived Constants

    Science.gov (United States)

    Farrance, Ian; Frenkel, Robert

    2014-01-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more ‘constants’, each of which has an empirically derived numerical value. Such empirically derived ‘constants’ must also have associated uncertainties which propagate through the functional relationship.
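
    The same recipe is easy to reproduce outside a spreadsheet. The sketch below propagates IQC-style standard uncertainties through a hypothetical measurand y = k·c/v, where k is an empirically derived 'constant' with its own uncertainty; the formula and all numerical values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # number of simulated "spreadsheet rows"

# hypothetical functional relationship y = k * c / v, with illustrative
# means and IQC-derived standard uncertainties
k = rng.normal(2.47, 0.03, n)   # empirically derived 'constant' and its uncertainty
c = rng.normal(5.10, 0.08, n)   # measured input quantity 1
v = rng.normal(1.98, 0.02, n)   # measured input quantity 2

y = k * c / v                    # propagate through the relationship

print(f"y = {y.mean():.3f}, standard uncertainty u(y) = {y.std(ddof=1):.3f}")
print("95% coverage interval:", np.percentile(y, [2.5, 97.5]).round(3))
```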

  2. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    Science.gov (United States)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid-structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and we combine these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.

  3. ARIANE. Analytical uncertainties. Simulation of influential factors in the final isotopic inventory

    International Nuclear Information System (INIS)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-01-01

    Analysis of the analytical uncertainties of the simulation methodology used to obtain the final isotopic inventory of spent fuel; the ARIANE experiment explores the burnup-simulation part of this methodology.

  4. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    Science.gov (United States)

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  5. Uncertainty

    International Nuclear Information System (INIS)

    Silva, T.A. da

    1988-01-01

    The comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is shown for the calibration of clinical dosimeters in the Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt

  6. Comparative Performance of Four Single Extreme Outlier Discordancy Tests from Monte Carlo Simulations

    Directory of Open Access Journals (Sweden)

    Surendra P. Verma

    2014-01-01

    Full Text Available Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15 for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ=0 and ε=±1, were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimations of power functions of four popular discordancy tests and, therefore, should not be considered as a simple simulation exercise unrelated to probability and statistics. From both criteria of the Power of Test proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests could be summarized as N2≅N15>N14>N8.
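
    A scaled-down version of such a power study is straightforward to script. The sketch below treats only the Grubbs test N2, uses far fewer replications than the 20,000,000 above, and estimates the critical value empirically rather than from tables; the sample size and slippage values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def grubbs_stat(x):
    # Grubbs N2 statistic: largest absolute deviation from the mean, in units of s
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

n = 10                            # sample size
null_reps, power_reps = 50_000, 20_000

# empirical 5% critical value under the null hypothesis (no outlier)
null = np.array([grubbs_stat(rng.normal(size=n)) for _ in range(null_reps)])
crit = np.quantile(null, 0.95)

# power function: one observation slipped in location by delta sigma
for delta in (2.0, 4.0, 8.0):
    hits = 0
    for _ in range(power_reps):
        x = rng.normal(size=n)
        x[0] += delta
        hits += grubbs_stat(x) > crit
    print(f"delta = {delta:4.1f}: empirical power ~ {hits / power_reps:.3f}")
```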

  7. Uncertainty analysis for the assembly and core simulation of BEAVRS at the HZP conditions

    International Nuclear Information System (INIS)

    Wan, Chenghui; Cao, Liangzhi; Wu, Hongchun; Shen, Wei

    2017-01-01

    Highlights: • Uncertainty analysis has been completed based on the “two-step” scheme. • Uncertainty analysis has been performed for BEAVRS at HZP. • For lattice calculations, the few-group constants' uncertainties were quantified. • For core simulation, uncertainties of k_eff and power distributions were quantified. - Abstract: Based on the “two-step” scheme for the reactor-physics calculations, the capability of uncertainty analysis for the core simulations has been implemented in the UNICORN code, an in-house code for the sensitivity and uncertainty analysis of the reactor-physics calculations. Applying the statistical sampling method, the nuclear-data uncertainties can be propagated to the important predictions of the core simulations. The uncertainties of the few-group constants introduced by the uncertainties of the multigroup microscopic cross sections are quantified first for the lattice calculations; the uncertainties of the few-group constants are then propagated to the core multiplication factor and core power distributions for the core simulations. Up to now, our in-house lattice code NECP-CACTI and the neutron-diffusion solver NECP-VIOLET have been implemented in UNICORN for the steady-state core simulations based on the “two-step” scheme. With NECP-CACTI and NECP-VIOLET, the modeling and simulation of the steady-state BEAVRS benchmark problem at the HZP conditions was performed, and the results were compared with those obtained by CASMO-4E. Based on the modeling and simulation, the UNICORN code has been applied to perform the uncertainty analysis for BEAVRS at HZP. The uncertainty results of the eigenvalues and two-group constants for the lattice calculations and the multiplication factor and the power distributions for the steady-state core simulations are obtained and analyzed in detail.
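
    The statistical-sampling scheme is independent of the particular codes involved: perturb the nuclear data according to an assumed covariance, push each sample through the lattice step and then the core step, and read the spread of the outputs. The sketch below uses invented stand-in functions and an assumed 2% diagonal covariance; it is not NECP-CACTI or NECP-VIOLET.

```python
import numpy as np

rng = np.random.default_rng(3)

def lattice_calc(xs):
    # toy stand-in for the lattice step: collapses sampled multigroup
    # cross sections into two "few-group constants"
    return np.array([xs[:2].mean(), xs[2:].mean()])

def core_calc(fg):
    # toy stand-in for the core diffusion step: returns an eigenvalue
    return 1.0 + 0.4 * fg[0] - 0.25 * fg[1]

mean_xs = np.array([0.30, 0.32, 0.11, 0.09])   # nominal multigroup data
cov_xs = np.diag((0.02 * mean_xs) ** 2)        # assumed 2% diagonal covariance

samples = rng.multivariate_normal(mean_xs, cov_xs, size=2000)
keff = np.array([core_calc(lattice_calc(xs)) for xs in samples])

print(f"k_eff = {keff.mean():.5f} +/- {keff.std(ddof=1):.5f}")
```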

  9. Uncertainty in simulating wheat yields under climate change : Letter

    NARCIS (Netherlands)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Supit, I.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain1. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate2. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic

  11. Dynamically adaptive data-driven simulation of extreme hydrological flows

    Science.gov (United States)

    Kumar Jain, Pushkar; Mandli, Kyle; Hoteit, Ibrahim; Knio, Omar; Dawson, Clint

    2018-02-01

    Hydrological hazards such as storm surges, tsunamis, and rainfall-induced flooding are physically complex events that are costly in loss of human life and economic productivity. Many such disasters could be mitigated through improved emergency evacuation in real-time and through the development of resilient infrastructure based on knowledge of how systems respond to extreme events. Data-driven computational modeling is a critical technology underpinning these efforts. This investigation focuses on the novel combination of methodologies in forward simulation and data assimilation. The forward geophysical model utilizes adaptive mesh refinement (AMR), a process by which a computational mesh can adapt in time and space based on the current state of a simulation. The forward solution is combined with ensemble based data assimilation methods, whereby observations from an event are assimilated into the forward simulation to improve the veracity of the solution, or used to invert for uncertain physical parameters. The novelty in our approach is the tight two-way coupling of AMR and ensemble filtering techniques. The technology is tested using actual data from the Chile tsunami event of February 27, 2010. These advances offer the promise of significantly transforming data-driven, real-time modeling of hydrological hazards, with potentially broader applications in other science domains.
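
    The assimilation half of such a framework is typically an ensemble Kalman filter update. The following is a generic stochastic EnKF analysis step on a toy three-variable state, not the authors' AMR-coupled code; the observation operator and error variances are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

def enkf_update(ensemble, H, y_obs, obs_var):
    """Stochastic EnKF analysis step; ensemble has shape (n_members, n_state)."""
    n_members = ensemble.shape[0]
    Hx = ensemble @ H.T                               # forecast observations
    A = ensemble - ensemble.mean(axis=0)              # state anomalies
    HA = Hx - Hx.mean(axis=0)                         # observation anomalies
    P_hh = HA.T @ HA / (n_members - 1) + np.diag(obs_var)
    P_xh = A.T @ HA / (n_members - 1)
    K = P_xh @ np.linalg.inv(P_hh)                    # Kalman gain
    # perturbed observations keep the analysis spread statistically consistent
    y_pert = y_obs + rng.normal(0.0, np.sqrt(obs_var), size=(n_members, len(y_obs)))
    return ensemble + (y_pert - Hx) @ K.T

# toy setting: 3-variable state observed at its first component
ens = rng.normal(0.0, 1.0, size=(100, 3))
H = np.array([[1.0, 0.0, 0.0]])
analysis = enkf_update(ens, H, y_obs=np.array([0.8]), obs_var=np.array([0.1]))
print("analysis mean:", analysis.mean(axis=0).round(3))
```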

  13. Multiobjective generalized extremal optimization algorithm for simulation of daylight illuminants

    Science.gov (United States)

    Kumar, Srividya Ravindra; Kurian, Ciji Pearl; Gomes-Borges, Marcos Eduardo

    2017-10-01

    Daylight illuminants are widely used as references for color quality testing and optical vision testing applications. Presently used daylight simulators make use of fluorescent bulbs that are not tunable and occupy more space inside the quality testing chambers. By designing a spectrally tunable LED light source with an optimal number of LEDs, cost, space, and energy can be saved. This paper describes an application of the generalized extremal optimization (GEO) algorithm for selection of the appropriate quantity and quality of LEDs that compose the light source. The multiobjective approach of this algorithm seeks the best spectral simulation with minimum fitness error toward the target spectrum, a correlated color temperature (CCT) equal to that of the target spectrum, a high color rendering index (CRI), and the luminous flux required for testing applications. GEO is a global search algorithm based on phenomena of natural evolution and is especially designed to be used in complex optimization problems. Several simulations have been conducted to validate the performance of the algorithm. The methodology applied to model the LEDs, together with the theoretical basis for CCT and CRI calculation, is presented in this paper. A comparative analysis of the M-GEO evolutionary algorithm and the conventional deterministic Levenberg-Marquardt algorithm is also presented.

  14. Modeling, design, and simulation of systems with uncertainties

    CERN Document Server

    Rauh, Andreas

    2011-01-01

    This three-fold contribution to the field covers theory and current research in algorithmic approaches to uncertainty handling, real-life applications such as robotics and biomedical engineering, and fresh approaches to reliably implementing software.

  15. Precipitation intensity-duration-frequency curves for central Belgium with an ensemble of EURO-CORDEX simulations, and associated uncertainties

    Science.gov (United States)

    Hosseinzadehtalaei, Parisa; Tabari, Hossein; Willems, Patrick

    2018-02-01

    An ensemble of 88 regional climate model (RCM) simulations at 0.11° and 0.44° spatial resolutions from the EURO-CORDEX project is analyzed for central Belgium to investigate the projected impact of climate change on precipitation intensity-duration-frequency (IDF) relationships and extreme precipitation quantiles typically used in water engineering designs. The contribution to uncertainty arising from the choice of RCM, driving GCM, and representative concentration pathway (RCP4.5 and RCP8.5) is quantified using a variance decomposition technique after reconstruction of missing data in GCM × RCM combinations. A comparative analysis between the historical simulations of the EURO-CORDEX 0.11° and 0.44° RCMs shows higher precipitation intensities in the finer-resolution runs, leading to a larger overestimation of the observations-based IDFs by the 0.11° runs. The results reveal that making a temporal stationarity assumption for the climate system may lead to underestimation of precipitation quantiles by up to 70% by the end of this century. This projected increase is generally larger for the 0.11° RCMs compared with the 0.44° RCMs. The relative changes in extreme precipitation depend on return period and duration, indicating an amplification for larger return periods and for smaller durations. The variance decomposition approach generally identifies the RCM as the most dominant component of uncertainty in changes of more extreme precipitation (return period of 10 years) for both 0.11° and 0.44° resolutions, followed by the GCM and the RCP scenario. The uncertainties associated with cross-contributions of RCMs, GCMs, and RCPs play a non-negligible role in the associated uncertainties of the changes.
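
    At its core, the variance decomposition is a balanced ANOVA over the GCM × RCM × RCP cube of projected changes. The sketch below shows the bookkeeping on an invented, fully filled 5 × 4 × 2 ensemble of changes; in the study the cube first has to be completed by reconstructing the missing GCM × RCM combinations.

```python
import numpy as np

rng = np.random.default_rng(5)

# hypothetical changes (%) in a precipitation quantile for a filled
# GCM x RCM x RCP matrix (5 GCMs, 4 RCMs, 2 RCPs); values are illustrative
change = rng.normal(15.0, 5.0, size=(5, 4, 2))

grand = change.mean()
ss_total = ((change - grand) ** 2).sum()

# main-effect sums of squares for each factor (balanced design)
ss_gcm = change.shape[1] * change.shape[2] * ((change.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_rcm = change.shape[0] * change.shape[2] * ((change.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_rcp = change.shape[0] * change.shape[1] * ((change.mean(axis=(0, 1)) - grand) ** 2).sum()
ss_inter = ss_total - ss_gcm - ss_rcm - ss_rcp   # all cross-contributions

for name, ss in [("GCM", ss_gcm), ("RCM", ss_rcm),
                 ("RCP", ss_rcp), ("interactions", ss_inter)]:
    print(f"{name:>12}: {100 * ss / ss_total:5.1f}% of variance")
```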

  16. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two-degree-of-freedom nonlinear spring-mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  17. Uncertainty in prediction and simulation of flow in sewer systems

    DEFF Research Database (Denmark)

    Breinholt, Anders

    the uncertainty in the state variables. Additionally the observation noise is accounted for by a separate observation noise term. This approach is also referred to as stochastic grey-box modelling. A state dependent diffusion term was developed using a Lamperti transformation of the states, and implemented...... performance beyond the one-step. The reliability was satisfied for the one-step prediction but were increasingly biased as the prediction horizon was expanded, particularly in rainy periods. GLUE was applied for estimating uncertainty in such a way that the selection of behavioral parameter sets continued....... Conversely the parameter estimates of the stochastic approach are physically meaningful. This thesis has contributed to developing simplified rainfall-runoff models that are suitable for model predictive control of urban drainage systems that takes uncertainty into account....

  18. Numerical Simulation of Floating Bodies in Extreme Free Surface Waves

    Science.gov (United States)

    Hu, Zheng Zheng; Causon, Derek; Mingham, Clive; Qiang, Ling

    2010-05-01

    and efficient. Firstly, extreme design wave conditions are generated in an empty NWT and compared with physical experiments as a precursor to calculations to investigate the survivability of the Bobber device operating in a challenging wave climate. Secondly, we consider a benchmark test case involving a first-order regular wave maker acting on a fixed cylinder and on Pelamis. Finally, a floating Bobber has been simulated under extreme wave conditions. These results will be reported at the meeting. Causon D.M., Ingram D.M., Mingham C.G., Yang G. and Pearson R.V. (2000). Calculation of shallow water flows using a Cartesian cut cell approach. Advances in Water Resources, 23: 545-562. Causon D.M., Ingram D.M. and Mingham C.G. (2000). A Cartesian cut cell method for shallow water flows with moving boundaries. Advances in Water Resources, 24: 899-911. Dalzell J.F. (1999). A note on finite depth second-order wave-wave interactions. Appl. Ocean Res., 21, 105-111. Ning D.Z., Zang J., Liu S.X., Eatock Taylor R., Teng B. and Taylor P.H. (2009). Free surface and wave kinematics for nonlinear focused wave groups. J. Ocean Engineering. Accepted. Hu Z.Z., Causon D.M., Mingham C.G. and Qian L. (2009). Numerical wave tank study of a wave energy converter in heave. Proceedings 19th ISOPE Conference, Osaka, Japan. Qian L., Causon D.M., Mingham C.G. and Ingram D.M. (2006). A free-surface capturing method for two fluid flows with moving bodies. Proc. Roy. Soc. London, Vol. A 462, 21-42.

  19. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    NARCIS (Netherlands)

    Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.

    2008-01-01

    By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was
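
    The Monte Carlo logic itself is compact, as in the sketch below. Note that the stand-in model here is a crude degree-day scheme, not the intermediate-complexity energy balance model of the study, and the forcing series and parameter distributions are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)
days = 400

# synthetic daily forcing at a single point on the glacier tongue
t = np.arange(days)
temp = 5.0 + 8.0 * np.sin(2.0 * np.pi * (t - 200) / 365.0) + rng.normal(0.0, 2.0, days)
precip = rng.gamma(0.5, 4.0, days) / 1000.0     # m w.e. per day

def cumulative_balance(ddf, precip_corr):
    melt = ddf * np.maximum(temp, 0.0)            # degree-day melt, m w.e. per day
    accum = precip_corr * precip * (temp < 1.5)   # only solid precipitation accumulates
    return float((accum - melt).sum())

# Monte Carlo over uncertain parameters (distributions are illustrative)
b = np.array([cumulative_balance(ddf=rng.normal(0.005, 0.001),
                                 precip_corr=rng.normal(1.3, 0.2))
              for _ in range(5000)])
print(f"400-day balance: {b.mean():.2f} +/- {b.std(ddof=1):.2f} m w.e.")
```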

  20. Simulating space-time uncertainty in continental-scale gridded precipitation fields for agrometeorological modelling

    NARCIS (Netherlands)

    Wit, de A.J.W.; Bruin, de S.

    2006-01-01

    Previous analyses of the effects of uncertainty in precipitation fields on the output of EU Crop Growth Monitoring System (CGMS) demonstrated that the influence on simulated crop yield was limited at national scale, but considerable at local and regional scales. We aim to propagate uncertainty due

  1. The validation of evacuation simulation models through the analysis of behavioural uncertainty

    International Nuclear Information System (INIS)

    Lovreglio, Ruggiero; Ronchi, Enrico; Borri, Dino

    2014-01-01

    Both experimental and simulation data on fire evacuation are influenced by a component of uncertainty caused by the impact of the unexplained variance in human behaviour, namely behavioural uncertainty (BU). Evacuation model validation studies should include the study of this type of uncertainty during the comparison of experiments and simulation results. An evacuation model validation procedure is introduced in this paper to study the impact of BU. This methodology is presented through a case study for the comparison between repeated experimental data and simulation results produced by FDS+Evac, an evacuation model for the simulation of human behaviour in fire, which makes use of distribution laws. - Highlights: • Validation of evacuation models is investigated. • Quantitative evaluation of behavioural uncertainty is performed. • A validation procedure is presented through an evacuation case study

  2. Estimation of balance uncertainty using Direct Monte Carlo Simulation (DSMC) on a CPU-GPU architecture

    CSIR Research Space (South Africa)

    Bidgood, Peter M

    2017-01-01

    Full Text Available The estimation of balance uncertainty using conventional statistical and error propagation methods has been found to be both approximate and laborious to the point of being untenable. Direct Simulation by Monte Carlo (DSMC) has been shown...

  3. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    Science.gov (United States)

    2001-12-01

    … nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167 (uncertainty analysis done with SERDP software). Gertner, G., … D. Goolsby, 2001. Relating N inputs to the Mississippi River Basin and nitrate flux in the Lower Mississippi River: a comparison of approaches. … International Journal of Remote Sensing, 25(4): 367-380. Wu, J., D.E. Jelinski, M. Luck, and P.T. Tueller, 2000. Multiscale analysis of landscape heterogeneity: scale …

  4. Climate Variability and Weather Extremes: Model-Simulated and Historical Data. Chapter 9

    Science.gov (United States)

    Schubert, Siegfried D.; Lim, Young-Kwon

    2012-01-01

    basic mechanisms by which extremes vary is incomplete. As noted in IPCC (2007), “Incomplete global data sets and remaining model uncertainties still restrict understanding of changes in extremes and attribution of changes to causes, although understanding of changes in the intensity, frequency and risk of extremes has improved.” Separating decadal and other shorter-term variability from climate change impacts on extremes requires a better understanding of the processes responsible for the changes. In particular, the physical processes linking sea surface temperature changes to regional climate changes, and a basic understanding of the inherent variability in weather extremes and how that is impacted by atmospheric circulation changes at subseasonal to decadal and longer time scales, are still inadequately understood. Given the fundamental limitations in the time span and quality of global observations, substantial progress on these issues will rely increasingly on improvements in models, with observations continuing to play a critical role, though less as a detection tool, and more as a tool for addressing physical processes, and to ensure the quality of the climate models and the verisimilitude of the simulations (CCSP SAP 1.3, 2008).

  6. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer, M. A.; Rosbjerg, Dan; Arnbjerg-Nielsen, Karsten

    2017-01-01

    are independent. This study investigates the validity of this assumption and its effects on the estimated probabilistic projections of the changes in the 95% quantile of wet days. The methodology is divided in two main parts. First, the interdependency of the ENSEMBLES RCMs is estimated using the methodology...... developed by Pennell and Reichler (2011). The results show that the projections from the ENSEMBLES RCMs cannot be assumed independent. This result is then used to estimate the uncertainty in climate model projections. A Bayesian approach has been developed using the procedure suggested by Tebaldi et al...

  7. Simulated trends of extreme climate indices for the Carpathian basin using outputs of different regional climate models

    Science.gov (United States)

    Pongracz, R.; Bartholy, J.; Szabo, P.; Pieczka, I.; Torma, C. S.

    2009-04-01

    Regional climatological effects of global warming may be recognized not only in shifts of mean temperature and precipitation, but in the frequency or intensity changes of different climate extremes. Several climate extreme indices are analyzed and compared for the Carpathian basin (located in Central/Eastern Europe) following the guidelines suggested by the joint WMO-CCl/CLIVAR Working Group on climate change detection. Our statistical trend analysis includes the evaluation of several extreme temperature and precipitation indices, e.g., the numbers of severe cold days, winter days, frost days, cold days, warm days, summer days, hot days, extremely hot days, cold nights, warm nights, the intra-annual extreme temperature range, the heat wave duration, the growing season length, the number of wet days (using several threshold values defining extremes), the maximum number of consecutive dry days, the highest 1-day precipitation amount, the greatest 5-day rainfall total, the annual fraction due to extreme precipitation events, etc. In order to evaluate the future trends (2071-2100) in the Carpathian basin, daily values of meteorological variables are obtained from the outputs of various regional climate model (RCM) experiments accomplished in the frame of the completed EU-project PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects). Horizontal resolution of the applied RCMs is 50 km. Both scenarios A2 and B2 are used to compare past and future trends of the extreme climate indices for the Carpathian basin. Furthermore, fine-resolution climate experiments of two additional RCMs adapted and run at the Department of Meteorology, Eotvos Lorand University are used to extend the trend analysis of climate extremes for the Carpathian basin. (1) Model PRECIS (run at 25 km horizontal resolution) was developed at the UK Met Office, Hadley Centre, and it uses the boundary conditions from the HadCM3 GCM. (2) Model Reg

  8. Effects of Boron and Graphite Uncertainty in Fuel for TREAT Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Vaughn, Kyle; Mausolff, Zander; Gonzalez, Esteban; DeHart, Mark; Goluoglu, Sedat

    2017-03-01

    Advanced modeling techniques and current computational capacity make full-core TREAT simulations possible; the goal of such simulations is to understand the pre-test core and minimize the number of required calibrations. But in order to simulate TREAT with a high degree of precision, the reactor materials and geometry must also be modeled with a high degree of precision. This paper examines how uncertainty in the reported values of boron and graphite has an effect on simulations of TREAT.

  9. ASSESSING ASTROPHYSICAL UNCERTAINTIES IN DIRECT DETECTION WITH GALAXY SIMULATIONS

    International Nuclear Information System (INIS)

    Sloane, Jonathan D.; Buckley, Matthew R.; Brooks, Alyson M.; Governato, Fabio

    2016-01-01

    We study the local dark matter velocity distribution in simulated Milky Way-mass galaxies, generated at high resolution with both dark matter and baryons. We find that the dark matter in the solar neighborhood is influenced appreciably by the inclusion of baryons, increasing the speed of dark matter particles compared to dark matter-only simulations. The gravitational potential due to the presence of a baryonic disk increases the amount of high velocity dark matter, resulting in velocity distributions that are more similar to the Maxwellian Standard Halo Model than predicted from dark matter-only simulations. Furthermore, the velocity structures present in baryonic simulations possess a greater diversity than expected from dark matter-only simulations. We show the impact of our simulated velocity distributions on the direct detection experiments LUX, DAMA/Libra, and CoGeNT, and explore how resolution and halo mass within the Milky Way’s estimated mass range impact the results. A Maxwellian fit to the velocity distribution tends to overpredict the amount of dark matter in the high velocity tail, even with baryons, and thus leads to overly optimistic direct detection bounds on models that are dependent on this region of phase space for an experimental signal. Our work further demonstrates that it is critical to transform simulated velocity distributions to the lab frame of reference, because velocity structure in the solar neighborhood appears when baryons are included; even then, the importance of this structure is less apparent in the Galactic frame of reference than in the Earth frame.

  11. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S_NH) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S_NO) uncertainty, increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (μ_A) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. η_g (anoxic growth rate correction factor) and η_h (anoxic hydrolysis rate correction factor), becomes less important when an S_NO controller manipulating an external carbon source addition is implemented.
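
    The SRC method amounts to regressing standardized Monte Carlo outputs on standardized inputs. The sketch below applies it to a toy linear response standing in for the EQI, with three invented parameter distributions loosely named after the ASM1 factors discussed above; none of the numbers come from BSM1.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# hypothetical input factors (stand-ins for ASM1 parameters)
mu_A = rng.normal(0.8, 0.08, n)    # autotrophic max growth rate
eta_g = rng.normal(0.8, 0.08, n)   # anoxic growth correction factor
eta_h = rng.normal(0.8, 0.08, n)   # anoxic hydrolysis correction factor

# toy model response standing in for the effluent quality index
eqi = 6000 - 2500 * mu_A + 400 * eta_g + 150 * eta_h + rng.normal(0, 50, n)

X = np.column_stack([mu_A, eta_g, eta_h])
# standardize inputs and output, then fit a linear regression:
# the fitted coefficients are the SRCs
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (eqi - eqi.mean()) / eqi.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

for name, s in zip(["mu_A", "eta_g", "eta_h"], src):
    print(f"SRC({name}) = {s:+.3f}")
print(f"sum of SRC^2 (linearity check) = {np.sum(src**2):.3f}")
```

    When the sum of squared SRCs is close to 1, the linear regression explains most of the output variance, which is the usual validity check for the method.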

  12. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes

    International Nuclear Information System (INIS)

    Garcia J, T.; Cardenas V, J.

    2015-09-01

    A methodology for uncertainty analysis in scenario simulations with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes was implemented; these codes are used to perform safety analyses at the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The chosen methodology is a probabilistic method that propagates the uncertainty of the input parameters to the output parameters. It therefore begins with the selection of the input parameters that are considered uncertain and of high importance in the scenario because of their direct effect on the output variable of interest. These parameters were randomly sampled according to variation intervals or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, using order statistics and the Wilks formula, it was determined that the minimum number of code runs required to obtain uncertainty bands covering 95% of the population at a 95% confidence level is 93; it is important to mention that with this method the number of runs does not depend on the number of selected input parameters. Routines were implemented in Fortran 90 to automate the uncertainty analysis of transients with the RELAP/SCDAP code. For the MELCOR severe accident code, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated the closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; in the second analysis, a station blackout (SBO) accident was simulated with the MELCOR code, obtaining the uncertainty band for the
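
    The figure of 93 runs follows from the first-order two-sided Wilks formula; the corresponding one-sided 95%/95% statement needs only 59 runs. Both counts can be reproduced directly:

```python
from math import ceil, log

def wilks_runs_one_sided(gamma=0.95, beta=0.95):
    # first-order one-sided tolerance limit: 1 - gamma**n >= beta
    return ceil(log(1.0 - beta) / log(gamma))

def wilks_runs_two_sided(gamma=0.95, beta=0.95):
    # first-order two-sided limit: 1 - g**n - n*(1-g)*g**(n-1) >= beta
    n = 2
    while 1.0 - gamma**n - n * (1.0 - gamma) * gamma ** (n - 1) < beta:
        n += 1
    return n

print(wilks_runs_one_sided())   # -> 59
print(wilks_runs_two_sided())   # -> 93, the number of runs quoted above
```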

  13. Multimodel Uncertainty Changes in Simulated River Flows Induced by Human Impact Parameterizations

    Science.gov (United States)

    Liu, Xingcai; Tang, Qiuhong; Cui, Huijuan; Mu, Mengfei; Gerten, Dieter; Gosling, Simon; Masaki, Yoshimitsu; Satoh, Yusuke; Wada, Yoshihide

    2017-01-01

    Human impacts increasingly affect the global hydrological cycle and indeed dominate hydrological changes in some regions. Hydrologists have sought to identify the human-impact-induced hydrological variations via parameterizing anthropogenic water uses in global hydrological models (GHMs). The consequently increased model complexity is likely to introduce additional uncertainty among GHMs. Here, using four GHMs, between-model uncertainties are quantified in terms of the ratio of signal to noise (SNR) for average river flow during 1971-2000 simulated in two experiments, with representation of human impacts (VARSOC) and without (NOSOC). It is the first quantitative investigation of between-model uncertainty resulting from the inclusion of human impact parameterizations. Results show that the between-model uncertainties in terms of SNRs in the VARSOC annual flow are larger (about 2 for global and varied magnitude for different basins) than those in the NOSOC, which are particularly significant in most areas of Asia and northern areas to the Mediterranean Sea. The SNR differences are mostly negative (-20 to 5, indicating higher uncertainty) for basin-averaged annual flow. The VARSOC high flow shows slightly lower uncertainties than NOSOC simulations, with SNR differences mostly ranging from -20 to 20. The uncertainty differences between the two experiments are significantly related to the fraction of irrigated area in a basin. The large additional uncertainties in VARSOC simulations introduced by the inclusion of parameterizations of human impacts point to an urgent need for GHM development based on a better understanding of human impacts. Differences in the parameterizations of irrigation, reservoir regulation and water withdrawals are discussed towards potential directions of improvements for future GHM development. We also discuss the advantages of statistical approaches to reduce the between-model uncertainties, and the importance of calibration of GHMs for not only

  14. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    Science.gov (United States)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency in using water to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, and this accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to reduced growing seasons. The uncertainties in simulated crop WU, particularly those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperatures and with high temperatures in combination with elevated atmospheric [CO2] concentrations. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  15. Event-by-event simulation of single-neutron experiments to test uncertainty relations

    International Nuclear Information System (INIS)

    Raedt, H De; Michielsen, K

    2014-01-01

    Results from a discrete-event simulation of a recent single-neutron experiment that tests Ozawa's generalization of Heisenberg's uncertainty relation are presented. The event-based simulation algorithm reproduces the results of the quantum theoretical description of the experiment but does not require the knowledge of the solution of a wave equation, nor does it rely on detailed concepts of quantum theory. In particular, the data from these non-quantum simulations satisfy uncertainty relations derived in the context of quantum theory. (paper)

  17. Imperfect pitch: Gabor's uncertainty principle and the pitch of extremely brief sounds.

    Science.gov (United States)

    Hsieh, I-Hui; Saberi, Kourosh

    2016-02-01

    How brief must a sound be before its pitch is no longer perceived? The uncertainty tradeoff between temporal and spectral resolution (Gabor's principle) limits the minimum duration required for accurate pitch identification or discrimination. Prior studies have reported that pitch can be extracted from sinusoidal pulses as brief as half a cycle. This finding has been used in a number of classic papers to develop models of pitch encoding. We have found that phase randomization, which eliminates timbre confounds, degrades this ability to chance, raising serious concerns over the foundation on which classic pitch models have been built. The current study investigated whether subthreshold pitch cues may still exist in partial-cycle pulses revealed through statistical integration in a time series containing multiple pulses. To this end, we measured frequency-discrimination thresholds in a two-interval forced-choice task for trains of partial-cycle random-phase tone pulses. We found that residual pitch cues exist in these pulses but discriminating them requires an order of magnitude (ten times) larger frequency difference than that reported previously, necessitating a re-evaluation of pitch models built on earlier findings. We also found that as pulse duration is decreased to less than two cycles its pitch becomes biased toward higher frequencies, consistent with predictions of an auto-correlation model of pitch extraction.
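
    For a rough feel for the limit involved, the Gabor bound for Gaussian envelopes, sigma_t * sigma_f >= 1/(4*pi), can be evaluated for pulses of a given number of cycles. Treating the full pulse duration as the effective temporal spread, as done below, is a loose approximation made only for illustration.

```python
from math import pi

def min_freq_spread_hz(duration_s):
    # Gabor limit for Gaussian envelopes: sigma_t * sigma_f >= 1 / (4*pi)
    return 1.0 / (4.0 * pi * duration_s)

for freq_hz in (250.0, 1000.0, 4000.0):
    for cycles in (0.5, 2.0, 10.0):
        dur = cycles / freq_hz
        print(f"{freq_hz:6.0f} Hz, {cycles:4.1f} cycles ({1e3 * dur:7.2f} ms): "
              f"frequency spread >= {min_freq_spread_hz(dur):8.1f} Hz")
```

    The shorter the pulse, the larger the irreducible spectral spread, which is why discrimination thresholds must grow as pulse duration shrinks below a few cycles.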

  18. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M. [Duke Univ., Durham, NC (United States). Dept. of Mechanical Engineering and Materials Science

    2017-06-06

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  19. Evaluation of uncertainties in regional climate change simulations

    DEFF Research Database (Denmark)

    Pan, Z.; Christensen, J. H.; Arritt, R. W.

    2001-01-01

    , an atmosphere-ocean coupled general circulation model (GCM) current climate, and a future scenario of transient climate change. Common precipitation climatology features simulated by both models included realistic orographic precipitation, east-west transcontinental gradients, and reasonable annual cycles over...... to different subgrid scale processes in individual models. The ratio of climate change to biases, which we use as one measure of confidence in projected climate changes, is substantially larger than 1 in several seasons and regions while the ratios are always less than 1 in summer. The largest ratios among all...... regions are in California. Spatial correlation coefficients of precipitation were computed between simulation pairs in the 2x3 set. The climate change correlation is highest and the RCM performance correlation is lowest while boundary forcing and intermodel correlations are intermediate. The high spatial...

  20. Neural network stochastic simulation applied for quantifying uncertainties

    Directory of Open Access Journals (Sweden)

    N Foudil-Bey

    2016-09-01

    Full Text Available Geostatistical simulation methods are generally used to generate several realizations of physical properties in the subsurface; these methods are based on variogram analysis and are limited to measuring correlation between variables at two locations only. In this paper, we propose a simulation of properties based on a supervised neural network trained on an existing drilling data set. The major advantage is that this method does not require a preliminary geostatistical study and can take several points into account, so geological information and diverse geophysical data can be combined easily. To do this, we used a neural network with a feed-forward multi-layer perceptron architecture, trained with the back-propagation algorithm and a conjugate-gradient technique to minimize the error of the network output. The learning process creates links between the different variables; this relationship can be used either to interpolate the properties or to generate several possible distributions of physical properties by changing, for each realization, a random value of the input neurons that had been kept constant during the learning period. This method was tested on real data to simulate multiple realizations of density and magnetic susceptibility in three dimensions at the Val d'Or mining camp, Québec (Canada).
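
    The perturbed-input trick in that last step is compact enough to sketch. Below is a hedged, minimal illustration (not the authors' code): a one-hidden-layer perceptron is fitted with SciPy's conjugate-gradient optimizer, one extra "seed" input is held constant during training, and alternative realizations are produced by re-drawing it; the toy coordinates and "density" response are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy "drilling" data: coordinates (x, y, z) and a stand-in density response,
# plus one extra seed input held constant (0.5) while the network is trained.
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]
Xa = np.hstack([X, np.full((X.shape[0], 1), 0.5)])

n_in, n_hid = Xa.shape[1], 10

def unpack(w):
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = w[n_in * n_hid:n_in * n_hid + n_hid]
    W2 = w[n_in * n_hid + n_hid:n_in * n_hid + 2 * n_hid]
    return W1, b1, W2, w[-1]

def forward(w, X):
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1 + b1) @ W2 + b2          # one hidden tanh layer

def loss(w):                                       # mean squared output error
    return 0.5 * np.mean((forward(w, Xa) - y) ** 2)

w0 = rng.normal(scale=0.1, size=n_in * n_hid + 2 * n_hid + 1)
fit = minimize(loss, w0, method="CG")              # conjugate-gradient training

# Stochastic simulation: re-draw the seed input to obtain several realizations
# of the property field at new locations.
pts = np.hstack([rng.uniform(-1, 1, (5, 3)), np.zeros((5, 1))])
for s in rng.uniform(0.0, 1.0, size=3):
    pts[:, 3] = s
    print(f"seed={s:.2f} ->", np.round(forward(fit.x, pts), 3))
```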

  1. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    Directory of Open Access Journals (Sweden)

    Artem Yankov

    2012-01-01

    Full Text Available For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
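
    To make the stochastic-sampling ingredient common to both methods concrete, here is a minimal sketch under stated assumptions (one-group constants, an invented covariance, and k-infinity as the output quantity); it illustrates the XSUSA-style approach, not the benchmark model itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed one-group constants [1/cm]: nu*Sigma_f and Sigma_a, with an invented
# relative covariance (2% and 1.5% standard deviations, correlation 0.5).
mean = np.array([0.0105, 0.0100])
rel_sd = np.array([0.020, 0.015])
corr = np.array([[1.0, 0.5],
                 [0.5, 1.0]])
cov = corr * np.outer(rel_sd * mean, rel_sd * mean)

xs = rng.multivariate_normal(mean, cov, size=10_000)   # sampled cross sections
k_inf = xs[:, 0] / xs[:, 1]                            # one-group k-infinity

print(f"k_inf = {k_inf.mean():.5f} +/- {k_inf.std(ddof=1):.5f} (1 sigma)")
```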

  2. Predicting Statistical Response and Extreme Events in Uncertainty Quantification through Reduced-Order Models

    Science.gov (United States)

    Qi, D.; Majda, A.

    2017-12-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty in statistical sensitivity and intermittency in principal model directions with largest variability in high-dimensional turbulent system and turbulent transport models. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve the optimal model performance. The idea in the reduced-order method is from a self-consistent mathematical framework for general systems with quadratic nonlinearity, where crucial high-order statistics are approximated by a systematic model calibration procedure. Model efficiency is improved through additional damping and noise corrections to replace the expensive energy-conserving nonlinear interactions. Model errors due to the imperfect nonlinear approximation are corrected by tuning the model parameters using linear response theory with an information metric in a training phase before prediction. A statistical energy principle is adopted to introduce a global scaling factor in characterizing the higher-order moments in a consistent way to improve model sensitivity. Stringent models of barotropic and baroclinic turbulence are used to display the feasibility of the reduced-order methods. Principal statistical responses in mean and variance can be captured by the reduced-order models with accuracy and efficiency. Besides, the reduced-order models are also used to capture crucial passive tracer field that is advected by the baroclinic turbulent flow. It is demonstrated that crucial principal statistical quantities like the tracer spectrum and fat-tails in the tracer probability density functions in the most important large scales can be captured efficiently with accuracy using the reduced-order tracer model in various dynamical regimes of the flow field with

  3. Monte Carlo Simulation of Influence of Input Parameters Uncertainty on Output Data

    International Nuclear Information System (INIS)

    Sobek, Lukas

    2010-01-01

    Input parameters of a complex system in a probabilistic simulation are treated by means of probability density functions (PDFs), so the result of the simulation also has a probabilistic character. Monte Carlo simulation is widely used to obtain predictions concerning the probability of risk. Here, the Monte Carlo method was used to calculate histograms of the PDF of the release rate, given the uncertainty in the distribution coefficients of the radionuclides 135Cs and 235U.
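
    A minimal sketch of that workflow, with an assumed toy release model rather than the report's code: the distribution coefficient Kd is sampled from a PDF and the resulting release-rate histogram is accumulated.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed log-normal PDF for the distribution coefficient Kd [m^3/kg]
kd = rng.lognormal(mean=np.log(0.5), sigma=0.7, size=n)

rho_b, theta = 1600.0, 0.30     # bulk density [kg/m^3] and porosity (assumed)
q, c0 = 1.0e-3, 1.0             # water flux and source concentration (assumed)

retardation = 1.0 + rho_b * kd / theta       # linear-sorption retardation
release_rate = q * c0 / retardation          # toy release-rate model

hist, edges = np.histogram(release_rate, bins=50, density=True)
print("release rate, 5/50/95th percentiles:",
      np.percentile(release_rate, [5, 50, 95]))
```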

  4. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    Science.gov (United States)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article, first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to the GUM (Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modeling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
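
    The two-step idea can be sketched in a few lines. The following is a hedged illustration, not the Metsähovi processing chain: GUM-style input PDFs (a Type A distance noise and a Type B scale error, both assumed) are propagated by Monte Carlo through a simple two-station intersection, and the resulting point cloud replaces classical variance propagation.

```python
import numpy as np

rng = np.random.default_rng(7)
A, B = np.array([0.0, 0.0]), np.array([100.0, 0.0])   # known stations [m]
P_true = np.array([40.0, 30.0])                       # unknown point [m]
dA, dB = np.linalg.norm(P_true - A), np.linalg.norm(P_true - B)

def intersect(rA, rB):
    # circle-circle intersection (upper solution) as the "adjustment" step
    x = (rA**2 - rB**2 + B[0]**2) / (2.0 * B[0])
    return np.array([x, np.sqrt(max(rA**2 - x**2, 0.0))])

n = 20_000
sigma_d = 0.003                                 # 3 mm distance noise (assumed)
scale = 1.0 + rng.normal(0.0, 2e-6, size=n)     # 2 ppm scale error (assumed)
cloud = np.array([intersect(scale[i] * (dA + rng.normal(0, sigma_d)),
                            scale[i] * (dB + rng.normal(0, sigma_d)))
                  for i in range(n)])

print("mean [m]:", cloud.mean(axis=0))
print("empirical covariance [m^2]:\n", np.cov(cloud.T))
```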

  5. Epistemic uncertainty in California-wide synthetic seismicity simulations

    Science.gov (United States)

    Pollitz, Fred F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ∼12 square kilometers in size, has been rediscretized into smaller patches, each of ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs on input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.

  6. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  7. Uncertainty in simulated groundwater-quality trends in transient flow

    Science.gov (United States)

    Starn, J. Jeffrey; Bagtzoglou, Amvrossios; Robbins, Gary A.

    2013-01-01

    In numerical modeling of groundwater flow, the result of a given solution method is affected by the way in which transient flow conditions and geologic heterogeneity are simulated. An algorithm is demonstrated that simulates breakthrough curves at a pumping well by convolution-based particle tracking in a transient flow field for several synthetic basin-scale aquifers. In comparison to grid-based (Eulerian) methods, the particle (Lagrangian) method is better able to capture multimodal breakthrough caused by changes in pumping at the well, although the particle method may be apparently nonlinear because of the discrete nature of particle arrival times. Trial-and-error choice of number of particles and release times can perhaps overcome the apparent nonlinearity. Heterogeneous aquifer properties tend to smooth the effects of transient pumping, making it difficult to separate their effects in parameter estimation. Porosity, a new parameter added for advective transport, can be accurately estimated using both grid-based and particle-based methods, but predictions can be highly uncertain, even in the simple, nonreactive case.

  8. Assessing Fatigue and Ultimate Load Uncertainty in Floating Offshore Wind Turbines Due to Varying Simulation Length

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, G.; Lackner, M.; Haid, L.; Matha, D.; Jonkman, J.; Robertson, A.

    2013-07-01

    With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.

  9. Uncertainty-based simulation-optimization using Gaussian process emulation: Application to coastal groundwater management

    Science.gov (United States)

    Rajabi, Mohammad Mahdi; Ketabchi, Hamed

    2017-12-01

    Combined simulation-optimization (S/O) schemes have long been recognized as a valuable tool in coastal groundwater management (CGM). However, previous applications have mostly relied on deterministic seawater intrusion (SWI) simulations. This is a questionable simplification, knowing that SWI models are inevitably prone to epistemic and aleatory uncertainty, and hence a management strategy obtained through S/O without consideration of uncertainty may result in significantly different real-world outcomes than expected. However, two key issues have hindered the use of uncertainty-based S/O schemes in CGM, which are addressed in this paper. The first issue is how to solve the computational challenges resulting from the need to perform massive numbers of simulations. The second issue is how the management problem is formulated in presence of uncertainty. We propose the use of Gaussian process (GP) emulation as a valuable tool in solving the computational challenges of uncertainty-based S/O in CGM. We apply GP emulation to the case study of Kish Island (located in the Persian Gulf) using an uncertainty-based S/O algorithm which relies on continuous ant colony optimization and Monte Carlo simulation. In doing so, we show that GP emulation can provide an acceptable level of accuracy, with no bias and low statistical dispersion, while tremendously reducing the computational time. Moreover, five new formulations for uncertainty-based S/O are presented based on concepts such as energy distances, prediction intervals and probabilities of SWI occurrence. We analyze the proposed formulations with respect to their resulting optimized solutions, the sensitivity of the solutions to the intended reliability levels, and the variations resulting from repeated optimization runs.
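
    As a schematic of the emulation step (the "simulator" below is an invented one-dimensional stand-in, not the Kish Island SWI model): a Gaussian process is trained on a handful of expensive runs and then substituted for the simulator inside a Monte Carlo loop.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_simulator(x):
    # stand-in for one SWI model run (e.g. pumping rate -> intrusion length)
    return np.sin(3.0 * x) + 0.5 * x

rng = np.random.default_rng(3)
X_train = np.linspace(0.0, 2.0, 12)[:, None]        # 12 "expensive" runs
y_train = expensive_simulator(X_train).ravel()

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Monte Carlo on the emulator: tens of thousands of cheap evaluations
x_mc = rng.uniform(0.0, 2.0, size=(50_000, 1))
y_mc, y_sd = gp.predict(x_mc, return_std=True)
print(f"MC mean = {y_mc.mean():.3f}, MC std = {y_mc.std(ddof=1):.3f}, "
      f"mean emulator sigma = {y_sd.mean():.4f}")
```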

  10. Measurement uncertainty of dissolution test of acetaminophen immediate release tablets using Monte Carlo simulations

    Directory of Open Access Journals (Sweden)

    Daniel Cancelli Romero

    2017-10-01

    Full Text Available Analytical results are widely used to assess batch-by-batch conformity and pharmaceutical equivalence, as well as in the development of drug products. Despite this, few papers describing the estimation of the measurement uncertainty associated with these results were found in the literature. Here, we describe a simple procedure used for estimating the measurement uncertainty associated with the dissolution test of acetaminophen tablets. A fractional factorial design was used to define a mathematical model that explains the amount of acetaminophen dissolved (%) as a function of dissolution time (from 20 to 40 minutes), volume of dissolution medium (from 800 to 1000 mL), pH of dissolution medium (from 2.0 to 6.8), and rotation speed (from 40 to 60 rpm). Using Monte Carlo simulations, we estimated the measurement uncertainty for the dissolution test of acetaminophen tablets (95.2 ± 1.0%) at a 95% confidence level. Rotation speed was the most important source of uncertainty, contributing about 96.2% of the overall uncertainty. Finally, it is important to note that the uncertainty calculated in this paper reflects the expected uncertainty of the dissolution test and does not consider variations in the content of acetaminophen.
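
    A hedged sketch of such a Monte Carlo propagation is shown below; the response-surface coefficients and the input tolerances are invented for illustration (the paper's fitted model is not reproduced), but the mechanics match the description: sample the four factors, evaluate the model, and summarize the spread.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

t   = rng.uniform(29.0, 31.0, n)    # time [min], assumed +/- 1 min about 30
vol = rng.uniform(890., 910., n)    # medium volume [mL], assumed +/- 10 mL
ph  = rng.uniform(4.3, 4.5, n)      # medium pH, assumed +/- 0.1
rpm = rng.uniform(48.0, 52.0, n)    # rotation speed, assumed +/- 2 rpm

# invented linear response surface standing in for the factorial-design fit
dissolved = (95.0 + 0.10 * (t - 30.0) + 0.005 * (vol - 900.0)
             + 0.50 * (ph - 4.4) + 0.40 * (rpm - 50.0))

mean = dissolved.mean()
u95 = 1.96 * dissolved.std(ddof=1)          # ~95% coverage interval
print(f"dissolved = {mean:.1f} +/- {u95:.1f} %")
```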

  11. Attributing uncertainty in streamflow simulations due to variable inputs via the Quantile Flow Deviation metric

    Science.gov (United States)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2018-06-01

    Every model to characterise a real world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. By way of a recently developed attribution metric, this study is aimed at developing a method for analysing variability in model inputs together with model structure variability to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments is used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.

  12. Numerical simulation of floating bodies in extreme free surface waves

    Directory of Open Access Journals (Sweden)

    Z. Z. Hu

    2011-02-01

    Full Text Available In this paper, we use the in-house Computational Fluid Dynamics (CFD) flow code AMAZON-SC as a numerical wave tank (NWT) to study wave loading on a wave energy converter (WEC) device in heave motion. This is a surface-capturing method for two-fluid flows that treats the free surface as a contact surface in the density field, captured automatically without special provision. A time-accurate artificial compressibility method and a high-resolution Godunov-type scheme are employed in both fluid regions (air/water). The Cartesian cut cell method can provide a boundary-fitted mesh for a complex geometry with no requirement to re-mesh globally, or even locally, for moving geometry, requiring only changes to cut cell data at the body contour. Extreme wave boundary conditions are prescribed in an empty NWT and compared with physical experiments prior to calculations of extreme waves acting on a floating Bobber-type device. The validation work also includes the wave force on a fixed cylinder compared with theoretical and experimental data under regular waves. Results include free surface elevations, vertical displacement of the float, induced vertical velocity, and heave force for a typical Bobber geometry with a hemispherical base under extreme wave conditions.

  13. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    Science.gov (United States)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

    Uncertainties in structural properties can result in different responses in hybrid simulations. Quantifying the effect of these uncertainties would enable researchers to estimate the variance of structural responses observed in experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influence of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single-degree-of-freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points, and the method for calculating the coefficients. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially, respectively, with actuator delay. Sensitivity analysis through Sobol indices also indicates that the influence of a single random variable decreases while the coupling effect increases with increasing actuator delay.
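
    For readers unfamiliar with the machinery, a minimal one-variable PCE by regression looks like the sketch below; the response function, expansion order, and sample count are assumptions for illustration, not the paper's RTHS model.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(11)

def response(xi):
    # stand-in for "maximum displacement" as a function of one standard-normal
    # variable xi (e.g. normalized stiffness); purely illustrative
    return 1.0 / (1.0 + 0.3 * xi + 0.05 * xi ** 2)

order, n_samp = 4, 400
xi = rng.standard_normal(n_samp)
Psi = hermevander(xi, order)                    # probabilists' Hermite basis
coef, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

# For He_k, E[He_k(xi)^2] = k!, so mean and variance follow from coefficients.
norms = np.array([math.factorial(k) for k in range(order + 1)])
mean = coef[0]
var = np.sum(coef[1:] ** 2 * norms[1:])
print(f"PCE mean = {mean:.4f}, PCE variance = {var:.6f}")
```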

  14. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large as or larger than the mean climate change signal. As such, any prediction of future climate needs to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4 km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, mean that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble, as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  15. Uncertainty of simulated groundwater levels arising from stochastic transient climate change scenarios

    Science.gov (United States)

    Goderniaux, Pascal; Brouyère, Serge; Blenkinsop, Stephen; Burton, Aidan; Fowler, Hayley; Dassargues, Alain

    2010-05-01

    applied not only to the mean of climatic variables, but also across the statistical distributions of these variables. This is important as these distributions are expected to change in the future, with more extreme rainfall events separated by longer dry periods. (2) The novel approach used in this study can simulate transient climate change from 2010 to 2085, rather than time series representative of a stationary climate for the period 2071-2100. (3) The weather generator is used to generate a large number of equiprobable climate change scenarios for each RCM, representative of the natural variability of the weather. All of these scenarios are applied as input to the Geer basin model to assess the projected impact of climate change on groundwater levels, the uncertainty arising from different RCM projections, and the uncertainty linked to natural climatic variability. Using the output results from all scenarios, 95% confidence intervals are calculated for each year and month between 2010 and 2085. The climate change scenarios for the Geer basin model predict hotter and drier summers and warmer and wetter winters. Considering the results of this study, it is very likely that groundwater levels and surface flow rates in the Geer basin will decrease by the end of the century. This is of concern because it also means that the groundwater quantities available for abstraction will decrease. However, this study also shows that the uncertainty of these projections is relatively large compared to the projected changes, so it remains difficult to confidently determine the magnitude of the decrease. The combination of an integrated surface-subsurface model with stochastic climate change scenarios has not been used in previous climate change impact studies on groundwater resources; it constitutes an innovation and an important tool for helping water managers make decisions.

  16. The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2013-09-01

    Full Text Available Aerosol–cloud interaction effects are a major source of uncertainty in climate models so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN. Twenty-eight model parameters covering essentially all important aerosol processes, emissions and representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally between about ±30% over some marine regions to ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud-processing. The results lead to several recommendations for research that would result in improved modelling of cloud–active aerosol on a global scale.
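
    The variance-based step can be illustrated compactly. The sketch below (an invented three-parameter toy function standing in for the per-grid-cell emulator) estimates first-order Sobol indices with the Saltelli-style pick-and-freeze estimator; it shows the mechanics only, not the paper's 28-parameter analysis.

```python
import numpy as np

def emulator(X):
    # invented stand-in for a per-grid-cell CCN emulator of 3 scaled parameters
    return X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.25 * X[:, 0] * X[:, 2]

rng = np.random.default_rng(9)
n, d = 200_000, 3
A = rng.uniform(0.0, 1.0, (n, d))        # two independent sample matrices
B = rng.uniform(0.0, 1.0, (n, d))
fA, fB = emulator(A), emulator(B)
var = np.var(np.concatenate([fA, fB]), ddof=1)

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                  # "pick and freeze" column i
    S1 = np.mean(fB * (emulator(ABi) - fA)) / var   # Saltelli (2010) estimator
    print(f"first-order index S{i + 1} = {S1:.3f}")
```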

  17. GLOBAL SIMULATION OF AN EXTREME ULTRAVIOLET IMAGING TELESCOPE WAVE

    International Nuclear Information System (INIS)

    Schmidt, J. M.; Ofman, L.

    2010-01-01

    We use the observation of an Extreme Ultraviolet Imaging Telescope (EIT) wave in the lower solar corona, seen with the two Solar Terrestrial Relations Observatory (STEREO) spacecraft in extreme ultraviolet light on 2007 May 19, to model the same event with a three-dimensional (3D) time-dependent magnetohydrodynamic (MHD) code that includes solar coronal magnetic fields derived from Wilcox Solar Observatory magnetogram data, and a solar wind outflow accelerated with empirical heating functions. The model includes a coronal mass ejection (CME) of Gibson and Low flux rope type above the reconstructed active region, with parameters adapted from observations, to excite the EIT wave. We trace the EIT wave running as a circular velocity enhancement around the launching site of the CME in the direction tangential to the sphere produced by the wave front, and compute the phase velocities of the wave front. We find that the phase velocities are in good agreement with theoretical values for a fast magnetosonic wave, derived with the physical parameters of the model, and with observed phase speeds of an incident EIT wave reflected by a coronal hole and running at about the same location. We also reproduce in our 3D MHD model the observed reflection of the EIT wave at the coronal hole boundary, triggered by the magnetic pressure difference between the wave front hitting the hole and the boundary magnetic fields of the coronal hole, and the response of the coronal hole, which leads to the generation of secondary reflected EIT waves radiating away in different directions than the incident EIT wave. This is the first 3D MHD model of an EIT wave triggered by a CME that includes realistic solar magnetic field, with results comparing favorably to STEREO Extreme Ultraviolet Imager observations.

  18. Assessing the Uncertainty of Tropical Cyclone Simulations in NCAR's Community Atmosphere Model

    Directory of Open Access Journals (Sweden)

    Kevin A Reed

    2011-08-01

    Full Text Available The paper explores the impact of initial-data, parameter, and structural model uncertainty on the simulation of a tropical cyclone-like vortex in the National Center for Atmospheric Research's (NCAR) Community Atmosphere Model (CAM). An analytic technique is used to initialize the model with an idealized weak vortex that develops into a tropical cyclone over ten simulation days. A total of 78 ensemble simulations are performed at horizontal grid spacings of 1.0°, 0.5° and 0.25° using two recently released versions of the model, CAM 4 and CAM 5. The ensemble members represent simulations with random small-amplitude perturbations of the initial conditions, small shifts in the longitudinal position of the initial vortex, and runs with slightly altered model parameters. The main distinction between CAM 4 and CAM 5 lies within the physical parameterization suite, and the simulations with both CAM versions at the varying resolutions assess the structural model uncertainty. At all resolutions, storms are produced with many tropical cyclone-like characteristics. The CAM 5 simulations exhibit more intense storms than CAM 4 by day 10 at the 0.5° and 0.25° grid spacings, while the CAM 4 storm at 1.0° is stronger. There are also distinct differences in the shapes and vertical profiles of the storms in the two variants of CAM. The ensemble members show no distinction between the initial-data and parameter uncertainty simulations. At day 10 they produce ensemble root-mean-square deviations from an unperturbed control simulation on the order of 1-5 m/s for the maximum low-level wind speed and 2-10 hPa for the minimum surface pressure. However, there are large differences between the two CAM versions at identical horizontal resolutions, suggesting that the structural uncertainty is more dominant than the initial-data and parameter uncertainties in this study. The uncertainty among the ensemble members is assessed and quantified.

  19. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Energy Technology Data Exchange (ETDEWEB)

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions reduces dramatically the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern compared with the internal variability, except for the mean sea level pressure field, though its magnitude is larger all over the model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)

  20. Error correction in multi-fidelity molecular dynamics simulations using functional uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu

    2017-04-01

    We use functional (Fréchet) derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions, as opposed to their parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard-Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high-pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.

  1. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from National Institute of Health (R01 EB018302) is greatly appreciated.

  2. Advanced Approach to Consider Aleatory and Epistemic Uncertainties for Integral Accident Simulations

    International Nuclear Information System (INIS)

    Peschke, Joerg; Kloos, Martina

    2013-01-01

    The use of best-estimate codes together with realistic input data generally requires that all potentially important epistemic uncertainties which may affect the code prediction are considered in order to get an adequate quantification of the epistemic uncertainty of the prediction as an expression of the existing imprecise knowledge. To facilitate the performance of the required epistemic uncertainty analyses, methods and corresponding software tools are available like, for instance, the GRS-tool SUSA (Software for Uncertainty and Sensitivity Analysis). However, for risk-informed decision-making, the restriction on epistemic uncertainties alone is not enough. Transients and accident scenarios are also affected by aleatory uncertainties which are due to the unpredictable nature of phenomena. It is essential that aleatory uncertainties are taken into account as well, not only in a simplified and supposedly conservative way but as realistic as possible. The additional consideration of aleatory uncertainties, for instance, on the behavior of the technical system, the performance of plant operators, or on the behavior of the physical process provides a quantification of probabilistically significant accident sequences. Only if a safety analysis is able to account for both epistemic and aleatory uncertainties in a realistic manner, it can provide a well-founded risk-informed answer for decision-making. At GRS, an advanced probabilistic dynamics method was developed to address this problem and to provide a more realistic modeling and assessment of transients and accident scenarios. This method allows for an integral simulation of complex dynamic processes particularly taking into account interactions between the plant dynamics as simulated by a best-estimate code, the dynamics of operator actions and the influence of epistemic and aleatory uncertainties. In this paper, the GRS method MCDET (Monte Carlo Dynamic Event Tree) for probabilistic dynamics analysis is explained

  3. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Blaylock, Myra L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hewson, John C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kumar, Pritvi Raj [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ling, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ruiz, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stewart, Alessia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wagner, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost-versus-accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  4. Simulation of the 23 July 2012 Extreme Space Weather Event: What if This Extremely Rare CME Was Earth Directed?

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Mays, M. Leila; Kuznetsova, Maria M.; Galvin, A. B.; Simunac, Kristin; Baker, Daniel N.; Li, Xinlin; Zheng, Yihua; Glocer, Alex

    2013-01-01

    Extreme space weather events are known to cause adverse impacts on critical modern-day technological infrastructure such as high-voltage electric power transmission grids. On 23 July 2012, NASA's Solar Terrestrial Relations Observatory-Ahead (STEREO-A) spacecraft observed in situ an extremely fast coronal mass ejection (CME) that traveled 0.96 astronomical units (approx. 1 AU) in about 19 h. Here we use the Space Weather Modeling Framework (SWMF) to perform a simulation of this rare CME. We consider STEREO-A in situ observations to represent the upstream L1 solar wind boundary conditions. The goal of this study is to examine what would have happened if this rare CME had been Earth-bound. Global SWMF-generated ground geomagnetic field perturbations are used to compute the simulated induced geoelectric field at specific ground-based active INTERMAGNET magnetometer sites. Simulation results show that while the modeled global SYM-H index, a high-resolution equivalent of the Dst index, was comparable to previously observed severe geomagnetic storms such as the Halloween 2003 storm, the 23 July CME would have produced some of the largest geomagnetically induced electric fields, making it very geoeffective. These results have important practical applications for risk management of electrical power grids.

  5. Using sequential indicator simulation to assess the uncertainty of delineating heavy-metal contaminated soils

    International Nuclear Information System (INIS)

    Juang, Kai-Wei; Chen, Yue-Shin; Lee, Dar-Yuan

    2004-01-01

    Mapping the spatial distribution of soil pollutants is essential for delineating contaminated areas. Currently, geostatistical interpolation (kriging) is increasingly used to estimate pollutant concentrations in soils. The kriging-based approach, indicator kriging (IK), may be used to model the uncertainty of mapping. However, a smoothing effect is usually produced when using kriging in pollutant mapping, and the detailed spatial patterns of pollutants can therefore be lost. The local uncertainty of mapping pollutants derived by the IK technique is referred to as the conditional cumulative distribution function (ccdf) for one specific location (i.e. single-location uncertainty). The local uncertainty information obtained by IK is not sufficient, as the uncertainty of mapping at several locations simultaneously (i.e. multi-location uncertainty or spatial uncertainty) is required to assess the reliability of the delineation of contaminated areas. The simulation approach, sequential indicator simulation (SIS), which has the ability to model not only single-location but also multi-location uncertainties, was used in this study to assess the uncertainty of the delineation of heavy-metal contaminated soils. To illustrate this, a data set of Cu concentrations in soil from Taiwan was used. The results show that contour maps of Cu concentrations generated by the SIS realizations exhausted all the spatial patterns of Cu concentrations without the smoothing effect found when using the kriging method. Based on the SIS realizations, the local uncertainty of Cu concentrations at a specific location x' refers to the probability of the Cu concentration z(x') being higher than the defined threshold level of contamination (z_c). This can be written as Prob_SIS[z(x') > z_c], representing the probability of contamination. The probability map of Prob_SIS[z(x') > z_c] can then be used for delineating contaminated areas. In addition, the multi-location uncertainty of an area A
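
    Turning realizations into such a probability map is mechanical, as the hedged sketch below shows; the "realizations" are smoothed random fields generated for illustration (not actual SIS output), and the 80% delineation threshold is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
n_real, ny, nx = 200, 50, 50
zc = 120.0                          # contamination threshold, e.g. Cu [mg/kg]

def realization():
    # stand-in for one SIS realization: a smoothed, rescaled random field
    f = gaussian_filter(rng.standard_normal((ny, nx)), sigma=3.0)
    f = (f - f.mean()) / f.std()
    return 100.0 + 30.0 * f         # assumed mean 100, sd 30

fields = np.array([realization() for _ in range(n_real)])

prob_map = (fields > zc).mean(axis=0)     # Prob_SIS[z(x) > zc] per cell
delineated = prob_map > 0.8               # delineation at 80% probability
print(f"fraction of area delineated: {delineated.mean():.3f}")
```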

  6. Evaluation and uncertainties of global climate models as simulated in East Asia and China

    International Nuclear Information System (INIS)

    Zhao, Z.C.

    1994-01-01

    The assessments and uncertainties of general circulation models (GCMs) as simulated in East Asia and China (15-60 N, 70-140 E) have been investigated using seven GCMs. Four methods of assessment were chosen. The variables for the validation of the GCMs include the annual, seasonal and monthly mean temperatures and precipitation. The assessments indicated that: (1) the simulations of the seven GCMs for temperature are much better than those for precipitation; (2) the simulations in winter are much better than those in summer; (3) the simulations in eastern parts are much better than those in western parts for both temperature and precipitation; (4) the best GCM for simulated temperature is the GISS model, and the best GCM for simulated precipitation is the UKMO-H model. The seven-GCM means for both simulated temperature and precipitation provided good results. The range of uncertainties in East Asia and China due to human activities is presented. The differences between the GCMs for temperature and precipitation before the year 2050 are much smaller than those after the year 2050.

  7. Multibody model of the human upper extremity for fracture simulation

    OpenAIRE

    Milanowicz, Marcin; Kędzior, Krzysztof

    2016-01-01

    About 3.8 million people are injured in accidents at work in Europe every year. The resulting high costs are incurred by the victims themselves, their families, employers and society. We have used a numerical simulation to reconstruct accidents at work for several years. To reconstruct these accidents MADYMO R7.5 with a numerical human model (pedestrian model) is used. However, this model is dedicated to the analysis of car-to-pedestrian accidents and thus cannot be fully used for reconstruct...

  8. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    Science.gov (United States)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

    Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed or will continue to change in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates numerous topographic data sources typically found across a range of legacy data through to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed at anchor points in the digital terrain models are modeled using "states" in a stochastic estimator. Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor
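
    One way to picture the "states" idea is a scalar Kalman filter fusing successive surveys of a single anchor point; the sketch below is an illustrative assumption (invented survey values, standard errors, and process noise), not the authors' estimator.

```python
import numpy as np

# (elevation [m], sigma [m]) for three survey epochs, oldest first (assumed)
surveys = [(102.40, 0.50),    # legacy topographic map
           (102.15, 0.15),    # aerial photogrammetry
           (102.21, 0.03)]    # recent lidar

z, sigma0 = surveys[0]
P = sigma0 ** 2               # state variance initialized from oldest source
Q = 0.02 ** 2                 # process noise: real terrain change per epoch

for obs, sigma in surveys[1:]:
    P += Q                                    # predict: uncertainty grows
    K = P / (P + sigma ** 2)                  # Kalman gain
    z += K * (obs - z)                        # update with the newer survey
    P *= (1.0 - K)
    print(f"elevation = {z:.3f} m, sigma = {np.sqrt(P):.3f} m")
```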

  9. Hydrological simulation and uncertainty analysis using the improved TOPMODEL in the arid Manas River basin, China.

    Science.gov (United States)

    Xue, Lianqing; Yang, Fan; Yang, Changbing; Wei, Guanghui; Li, Wenqian; He, Xinlin

    2018-01-11

    Understanding the mechanisms of complicated hydrological processes is important for the sustainable management of water resources in arid areas. This paper carries out simulations of water movement for the Manas River Basin (MRB) using an improved semi-distributed topographic hydrologic model (TOPMODEL) with a snowmelt model and a topographic index algorithm. A new algorithm is proposed to calculate the curve of the topographic index using an internal tangent circle on a conical surface. Building on the traditional model, an improved temperature index that accounts for solar radiation is used to calculate the amount of snowmelt. The uncertainty of the parameters of TOPMODEL was analyzed using the generalized likelihood uncertainty estimation (GLUE) method. The proposed model shows that the distribution of the topographic index is concentrated in the high mountains, and the accuracy of the runoff simulation is somewhat enhanced by considering radiation. Our results reveal that the performance of the improved TOPMODEL for runoff simulation in the MRB is acceptable. The uncertainty of the simulations results from the parameters and structure of the model, as well as climatic and anthropogenic factors. This study is expected to serve as a valuable complement to the wide application of TOPMODEL and to help identify the mechanisms of hydrological processes in arid areas.
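
    The GLUE procedure itself is straightforward to sketch. Below, a linear reservoir stands in for TOPMODEL, behavioural parameter sets are those with Nash-Sutcliffe efficiency above 0.7, and likelihood-weighted 5-95% bounds are formed; all numbers are illustrative assumptions, not the MRB setup.

```python
import numpy as np

rng = np.random.default_rng(4)
rain = rng.gamma(0.5, 4.0, size=120)               # synthetic forcing [mm/day]

def model(k):
    # linear-reservoir stand-in for TOPMODEL: storage s, recession parameter k
    q, s = np.empty_like(rain), 10.0
    for i, p in enumerate(rain):
        s = s + p
        q[i] = k * s
        s = s - q[i]
    return q

q_obs = model(0.3) + rng.normal(0.0, 0.5, rain.size)   # "observed" flow, k=0.3

ks = rng.uniform(0.05, 0.8, size=5000)             # Monte Carlo parameter sets
sims = np.array([model(k) for k in ks])
nse = 1.0 - ((sims - q_obs) ** 2).sum(axis=1) / ((q_obs - q_obs.mean()) ** 2).sum()

keep = nse > 0.7                                   # behavioural threshold
w = nse[keep] / nse[keep].sum()                    # GLUE likelihood weights

def wquantile(vals, w, q):
    order = np.argsort(vals)
    cum = np.cumsum(w[order])
    return np.interp(q, cum / cum[-1], vals[order])

band = np.array([(wquantile(sims[keep, t], w, 0.05),
                  wquantile(sims[keep, t], w, 0.95)) for t in range(rain.size)])
print(f"{keep.sum()} behavioural sets; mean 5-95% band width = "
      f"{(band[:, 1] - band[:, 0]).mean():.2f}")
```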

  10. Uncertainty analysis in the simulation of an HPGe detector using the Monte Carlo Code MCNP5

    International Nuclear Information System (INIS)

    Gallardo, Sergio; Pozuelo, Fausto; Querol, Andrea; Verdu, Gumersindo; Rodenas, Jose; Ortiz, J.; Pereira, Claubia

    2013-01-01

    A gamma spectrometer including an HPGe detector is commonly used for environmental radioactivity measurements. Many works have focused on the simulation of the HPGe detector using Monte Carlo codes such as MCNP5. However, the simulation of this kind of detector presents important difficulties due to the lack of information from manufacturers and to the loss of intrinsic properties in aging detectors. Some parameters, such as the active volume or the Ge dead layer thickness, are often unknown and are estimated during simulations. In this work, a detailed model of an HPGe detector and a petri dish containing a certified gamma source has been developed. The certified gamma source contains nuclides covering the energy range between 50 and 1800 keV. As a result of the simulation, the Pulse Height Distribution (PHD) is obtained, and the efficiency curve can be calculated from net peak areas, taking into account the certified activity of the source. In order to avoid errors due to the net area calculation, the simulated PHD is treated using the GammaVision software. In addition, it is proposed to use the Noether-Wilks formula to perform an uncertainty analysis of the model, with the main goal of determining the efficiency curve of this detector and its associated uncertainty. The uncertainty analysis focused on the dead layer thickness at different positions of the crystal. Results confirm the important role of the dead layer thickness in the low-energy range of the efficiency curve. In the high-energy range (from 300 to 1800 keV), the main contribution to the absolute uncertainty is due to variations in the active volume. (author)

  12. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    Science.gov (United States)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic-modeling-oriented analysis of gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications for drought assessment. We utilized a recently developed 100-member ensemble of observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty in the observed datasets at the monthly timescale, with systematic differences in temperature records, mainly due to different lapse rates. This uncertainty leads to large disparities in drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations in the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a higher demand for anthropogenic water storage and irrigation systems.

  13. Multi-fidelity numerical simulations of shock/turbulent-boundary layer interaction with uncertainty quantification

    Science.gov (United States)

    Bermejo-Moreno, Ivan; Campo, Laura; Larsson, Johan; Emory, Mike; Bodart, Julien; Palacios, Francisco; Iaccarino, Gianluca; Eaton, John

    2013-11-01

    We study the interaction between an oblique shock wave and the turbulent boundary layers inside a nearly-square duct by combining wall-modeled LES and 2D and 3D RANS simulations, targeting the experiment of Campo, Helmer & Eaton, 2012 (nominal conditions: M = 2.05, Reθ = 6,500). A primary objective is to quantify the effect of aleatory and epistemic uncertainties on the STBLI. Aleatory uncertainties considered include the inflow conditions (Mach number of the incoming air stream and thickness of the boundary layers) and perturbations of the duct geometry upstream of the interaction. The epistemic uncertainty under consideration focuses on the RANS turbulence model form by injecting perturbations in the Reynolds stress anisotropy in regions of the flow where the model assumptions (in particular, the Boussinesq eddy-viscosity hypothesis) may be invalid. These perturbations are then propagated through the flow solver into the solution. The uncertainty quantification (UQ) analysis is done through 2D and 3D RANS simulations, assessing the importance of the three-dimensional effects imposed by the nearly-square duct geometry. Wall-modeled LES are used to verify elements of the UQ methodology and to explore the flow features and physics of the STBLI for multiple shock strengths. Financial support from the United States Department of Energy under the PSAAP program is gratefully acknowledged.

  14. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... be compensated by model parameters, e.g. when hydraulic heads are considered. However, geological structure is the primary source of uncertainty with respect to simulations of groundwater age and capture zone. Operational MPS-based software has been on stage for just around ten years; yet, issues regarding...... geological structures of these three sites provided appropriate conditions for testing the methods. Our study documented that MPS is an efficient approach for simulating geological heterogeneity, especially for non-stationary systems. The high resolution of geophysical data such as SkyTEM is valuable both......

  15. Assessing the impact of model and climate uncertainty in malaria simulations for the Kenyan Highlands.

    Science.gov (United States)

    Tompkins, A. M.; Thomson, M. C.

    2017-12-01

    Simulations of the impact of climate variations on a vector-borne disease such as malaria are subject to a number of sources of uncertainty. These include the model structure and parameter settings in addition to errors in the climate data and the neglect of their spatial heterogeneity, especially over complex terrain. We use a constrained genetic algorithm to confront these two sources of uncertainty for malaria transmission in the highlands of Kenya. The technique calibrates the parameter settings of a process-based, mathematical model of malaria transmission to vary within their assessed level of uncertainty and also allows the calibration of the driving climate data. The simulations show that in highland settings close to the threshold for sustained transmission, the uncertainty in climate is more important to address than the malaria model uncertainty. Applications of the coupled climate-malaria modelling system are briefly presented.

  16. Multi-Scale Fusion of Information for Uncertainty Quantification and Management in Large-Scale Simulations

    Science.gov (United States)

    2015-12-02

    of completely new nonlinear Malliavin calculus. This type of calculus is important for the analysis and simulation of stationary and/or "causal...been limited by the fact that it requires the solution of an optimization problem with noisy gradients. When using deterministic optimization schemes...under uncertainty. We tested new developments on nonlinear Malliavin calculus, combining reduced basis methods with ANOVA, model validation, on

  17. Modeling and simulating command and control for organizations under extreme situations

    CERN Document Server

    Moon, Il-Chul; Kim, Tag Gon

    2013-01-01

    Commanding and controlling organizations in extreme situations is a challenging task in military, intelligence, and disaster management. Such command and control must be quick, effective, and considerate when dealing with the changing, complex, and risky conditions of the situation. To enable optimal command and control under extremes, robust structures and efficient operations are required of organizations. This work discusses how to design and conduct virtual experiments on resilient organizational structures and operational practices using modeling and simulation. The work illustrates key a

  18. Investigation of hydrometeor classification uncertainties through the POLARRIS polarimetric radar simulator

    Science.gov (United States)

    Dolan, B.; Rutledge, S. A.; Barnum, J. I.; Matsui, T.; Tao, W. K.; Iguchi, T.

    2017-12-01

    POLarimetric Radar Retrieval and Instrument Simulator (POLARRIS) is a framework that has been developed to simulate radar observations from cloud resolving model (CRM) output and to subject model data and observations to the same retrievals, analysis, and visualization. This framework not only enables validation of bulk microphysical model-simulated properties, but also offers an opportunity to study the uncertainties associated with retrievals such as hydrometeor classification (HID). For the CSU HID, membership beta functions (MBFs) are built using a set of simulations with realistic microphysical assumptions about axis ratio, density, canting angles, and size distributions for each of ten hydrometeor species. These assumptions are tested using POLARRIS to understand their influence on the resulting simulated polarimetric data and the final HID classification. Several of these parameters (density, size distributions) are set by the model microphysics, and therefore the specific assumptions of axis ratio and canting angle are carefully studied. Through these sensitivity studies, we hope to be able to provide uncertainties in retrieved polarimetric variables and HID as applied to CRM output. HID retrievals assign a classification to each point by determining the highest score, thereby identifying the dominant hydrometeor type within a volume. However, in nature there is rarely just a single hydrometeor type at a particular point. Models allow for mixing ratios of different hydrometeors within a grid point. We use the mixing ratios from CRM output in concert with the HID scores and classifications to understand how the HID algorithm can provide information about mixtures within a volume, as well as to calculate a confidence in the classifications. We leverage the POLARRIS framework to additionally probe radar wavelength differences toward the possibility of a multi-wavelength HID which could utilize the strengths of different wavelengths to improve HID classifications. With

  19. Epistemic and aleatory uncertainties in integrated deterministic and probabilistic safety assessment: Tradeoff between accuracy and accident simulations

    International Nuclear Information System (INIS)

    Karanki, D.R.; Rahman, S.; Dang, V.N.; Zerkak, O.

    2017-01-01

    The coupling of plant simulation models and stochastic models representing failure events in Dynamic Event Trees (DET) is a framework used to model the dynamic interactions among physical processes, equipment failures, and operator responses. The integration of physical and stochastic models may additionally enhance the treatment of uncertainties. Probabilistic Safety Assessments as currently implemented propagate the (epistemic) uncertainties in failure probabilities, rates, and frequencies, while the uncertainties in the physical model (parameters) are not propagated. The coupling of deterministic (physical) and probabilistic models in integrated simulations such as DET allows both types of uncertainties to be considered. However, integrated accident simulations with epistemic uncertainties will challenge even today's high performance computing infrastructure, especially for simulations of inherently complex nuclear or chemical plants. Conversely, intentionally limiting computations for practical reasons would compromise the accuracy of results. This work investigates how to trade off accuracy and computations to quantify risk in light of both uncertainties and accident dynamics. A simple depleting tank problem that can be solved analytically is considered to examine the adequacy of a discrete DET approach. The results show that optimal allocation of computational resources between epistemic and aleatory calculations by means of convergence studies ensures accuracy within a limited budget. - Highlights: • Accident simulations considering uncertainties require intensive computations. • The tradeoff between accuracy and accident simulations is a challenge. • Optimal allocation between epistemic and aleatory computations achieves the tradeoff. • Online convergence gives an early indication of computational requirements. • Uncertainty propagation in DDET is examined on a tank problem solved analytically.
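
    The accuracy-versus-computations tradeoff can be illustrated with a nested Monte Carlo sketch on a depleting-tank toy problem; the physics, distributions, and budget splits below are illustrative assumptions and do not reproduce the authors' model.

        # Hedged sketch: nested epistemic/aleatory Monte Carlo on a toy tank.
        import numpy as np

        rng = np.random.default_rng(0)

        def p_failure(drain_rate, n_aleatory):
            """Aleatory loop: probability the tank (initial level 1.0) empties
            before a random mission time ~ Exp(1)."""
            t_empty = 1.0 / drain_rate                  # deterministic physics
            t_mission = rng.exponential(1.0, n_aleatory)
            return np.mean(t_mission > t_empty)

        def risk(n_epistemic, n_aleatory):
            """Epistemic loop: the drain rate is known only as U(0.5, 2.0)."""
            rates = rng.uniform(0.5, 2.0, n_epistemic)
            return np.mean([p_failure(r, n_aleatory) for r in rates])

        # Convergence study: spend a fixed budget of ~10^5 runs in different splits.
        for n_e, n_a in [(10, 10000), (100, 1000), (1000, 100), (10000, 10)]:
            print(f"epistemic {n_e:5d} x aleatory {n_a:5d}: risk = {risk(n_e, n_a):.4f}")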

  20. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and the coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
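
    As a hedged illustration of the correlation-based sensitivity analysis, the sketch below computes Pearson and Spearman coefficients between sampled inputs and a toy stand-in for the MDNBR; the response function, parameter values, and distributions are invented for illustration and are not the W-3 correlation or VERA-CS outputs.

        # Hedged sketch: correlation coefficients as sensitivity measures.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n = 500
        inlet_T = rng.normal(565.0, 2.0, n)     # coolant inlet temperature (K), illustrative
        inlet_flow = rng.normal(0.30, 0.01, n)  # flow rate (kg/s), illustrative
        power = rng.normal(1.00, 0.02, n)       # relative assembly power, illustrative

        # Toy linear response standing in for MDNBR (not the W-3 correlation)
        mdnbr = (2.1 - 0.015 * (inlet_T - 565.0) - 3.0 * (power - 1.0)
                 + 5.0 * (inlet_flow - 0.30) + rng.normal(0.0, 0.01, n))

        for name, x in [("inlet T", inlet_T), ("flow", inlet_flow), ("power", power)]:
            r_p, _ = stats.pearsonr(x, mdnbr)
            r_s, _ = stats.spearmanr(x, mdnbr)
            print(f"{name:8s} Pearson {r_p:+.2f}  Spearman {r_s:+.2f}")
        # Partial correlations can be obtained the same way after regressing out
        # the remaining inputs from both x and mdnbr.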

  1. Quantifying uncertainty in Transcranial Magnetic Stimulation - A high resolution simulation study in ICBM space.

    Science.gov (United States)

    Toschi, Nicola; Keck, Martin E; Welt, Tobias; Guerrisi, Maria

    2012-01-01

    Transcranial Magnetic Stimulation offers enormous potential for noninvasive brain stimulation. While it is known that brain tissue significantly "reshapes" induced field and charge distributions, most modeling investigations to date have focused on single-subject data with limited generality. Further, the effects of the significant uncertainties which exist in the simulation setup (i.e. brain conductivity distributions) and the stimulation setup (e.g. coil positioning and orientations) have not been quantified. In this study, we construct a high-resolution anisotropic head model in standard ICBM space, which can be used as a population-representative standard for bioelectromagnetic simulations. Further, we employ Monte-Carlo simulations in order to quantify how uncertainties in conductivity values propagate all the way to induced fields and currents, demonstrating significant, regionally dependent dispersions in values which are commonly assumed to be "ground truth". This framework can be leveraged in order to quantify the effect of any type of uncertainty in noninvasive brain stimulation and bears relevance in all applications of TMS, both investigative and therapeutic.

  2. Modeling extreme (Carrington-type) space weather events using three-dimensional MHD code simulations

    Science.gov (United States)

    Ngwira, C. M.; Pulkkinen, A. A.; Kuznetsova, M. M.; Glocer, A.

    2013-12-01

    There is growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure and systems. In the last two decades, significant progress has been made towards the modeling of space weather events. Three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition and have played a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for existing global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a ground footprint comparable to (or larger than) that of the Carrington superstorm. Results are presented for an initial simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated ground induced geoelectric field to such extreme driving conditions. We also discuss the results and what they might mean for the accuracy of the simulations. The model is further tested using input data for an observed space weather event to verify the MHD model's consistency and to draw guidance for future work. This extreme space weather MHD model is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in earth conductors such as power transmission grids.

  3. Numerical simulation of extremely chirped pulse formation with an optical fiber

    Energy Technology Data Exchange (ETDEWEB)

    Itoh, Tamitake; Nishimura, Akihiko; Tei, Kazuyoku; Matoba, Tohru; Takuma, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Yamashita, Mikio; Morita, Ryuji

    1998-03-01

    A nonlinear propagation code using a symmetric split-step Fourier method as its algorithm was improved to simulate the propagation behavior of extremely chirped pulses in a long fiber. The performance of pulse propagation in noble-gas-cored hollow fibers and in a pulse stretcher using nonlinear and normal silicate fibers has been simulated with the code. The calculation results in the case of the hollow fiber are consistent with the experimental results. We estimated that this pulse stretcher could deliver an extremely chirped pulse with a spectral width of 84.2 nm and a temporal duration of 1.5 ns. (author)
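
    For readers unfamiliar with the algorithm, the sketch below shows a symmetric split-step Fourier scheme for a nonlinear-Schrödinger-type propagation equation; the fiber parameters and pulse are illustrative assumptions and do not reproduce the stretcher described above.

        # Hedged sketch: symmetric split-step Fourier propagation of a pulse envelope.
        import numpy as np

        def ssfm(u, dt, dz, nsteps, beta2, gamma):
            """Propagate a complex envelope u(t) over nsteps of size dz."""
            w = 2.0 * np.pi * np.fft.fftfreq(u.size, d=dt)      # angular frequency grid
            half_linear = np.exp(0.5j * (beta2 / 2.0) * w**2 * dz)
            for _ in range(nsteps):
                u = np.fft.ifft(half_linear * np.fft.fft(u))    # half dispersion step
                u = u * np.exp(1j * gamma * np.abs(u)**2 * dz)  # full nonlinear step
                u = np.fft.ifft(half_linear * np.fft.fft(u))    # half dispersion step
            return u

        # 100 fs Gaussian pulse, hypothetical fiber parameters
        t = np.linspace(-2e-12, 2e-12, 4096)            # time grid (s)
        u0 = np.exp(-0.5 * (t / 100e-15)**2)            # normalized amplitude
        u = ssfm(u0, dt=t[1] - t[0], dz=1.0, nsteps=100, beta2=2e-26, gamma=1e-3)
        print("output peak amplitude:", np.abs(u).max())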

  4. A new simulation method for turbines in wake - Applied to extreme response during operation

    DEFF Research Database (Denmark)

    Thomsen, K.; Aagaard Madsen, H.

    2005-01-01

    The work focuses on the prediction of load response for wind turbines operating in wind farms using a newly developed aeroelastic simulation method. The traditionally used concept is to adjust the free flow turbulence intensity to account for increased loads in wind farms, a methodology that might......, the resulting extremes might be erroneous. For blade loads, the traditionally used simplified approach works better than for integrated rotor loads, where the instantaneous load gradient across the rotor disc causes the extreme loads. In the article the new wake simulation approach is illustrated......

  5. Effects of land cover change on temperature and rainfall extremes in multi-model ensemble simulations

    Directory of Open Access Journals (Sweden)

    A. J. Pitman

    2012-11-01

    Full Text Available The impact of historical land use induced land cover change (LULCC on regional-scale climate extremes is examined using four climate models within the Land Use and Climate, IDentification of robust impacts project. To assess those impacts, multiple indices based on daily maximum and minimum temperatures and daily precipitation were used. We contrast the impact of LULCC on extremes with the impact of an increase in atmospheric CO2 from 280 ppmv to 375 ppmv. In general, consistent changes in both high and low temperature extremes are similar to the simulated change in mean temperature caused by LULCC and are restricted to regions of intense modification. The impact of LULCC on both means and on most temperature extremes is statistically significant. While the magnitude of the LULCC-induced change in the extremes can be of similar magnitude to the response to the change in CO2, the impacts of LULCC are much more geographically isolated. For most models, the impacts of LULCC oppose the impact of the increase in CO2 except for one model where the CO2-caused changes in the extremes are amplified. While we find some evidence that individual models respond consistently to LULCC in the simulation of changes in rainfall and rainfall extremes, LULCC's role in affecting rainfall is much less clear and less commonly statistically significant, with the exception of a consistent impact over South East Asia. Since the simulated response of mean and extreme temperatures to LULCC is relatively large, we conclude that unless this forcing is included, we risk erroneous conclusions regarding the drivers of temperature changes over regions of intense LULCC.

  6. A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation

    Science.gov (United States)

    Byun, K.; Hamlet, A. F.

    2017-12-01

    There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (downscaled with the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional MC and the non-stationary MC approach. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of the infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis. The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
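
    A hedged sketch of the core idea follows: assume a year-dependent GEV, sample annual extremes for each year of a design lifespan, and compare lifespan maxima against a stationary baseline. The drift rate, GEV parameters, and units are invented for illustration; note that SciPy's genextreme uses c = -ξ relative to the usual GEV shape convention.

        # Hedged sketch: non-stationary lifespan maxima via Monte Carlo.
        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(2)
        lifespan, n_mc = 50, 10000                 # design lifespan (yr), MC realizations
        locs = 100.0 + 0.5 * np.arange(lifespan)   # drifting GEV location (mm/day), illustrative
        scale, c = 15.0, -0.1                      # GEV scale and SciPy shape, illustrative

        # Non-stationary: each year of the lifespan draws from that year's GEV
        annual_ns = np.array([genextreme.rvs(c, loc=m, scale=scale, size=n_mc,
                                             random_state=rng) for m in locs])
        # Stationary baseline: every year uses the first year's distribution
        annual_s = genextreme.rvs(c, loc=locs[0], scale=scale,
                                  size=(lifespan, n_mc), random_state=rng)

        print("99th pct of lifespan maxima, stationary    :",
              np.percentile(annual_s.max(axis=0), 99))
        print("99th pct of lifespan maxima, non-stationary:",
              np.percentile(annual_ns.max(axis=0), 99))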

  7. Assessment of groundwater level estimation uncertainty using sequential Gaussian simulation and Bayesian bootstrapping

    Science.gov (United States)

    Varouchakis, Emmanouil; Hristopulos, Dionissios

    2015-04-01

    Space-time geostatistical approaches can improve the reliability of dynamic groundwater level models in areas with limited spatial and temporal data. Space-time residual Kriging (STRK) is a reliable method for spatiotemporal interpolation that can incorporate auxiliary information. The method usually leads to an underestimation of the prediction uncertainty. The uncertainty of spatiotemporal models is usually estimated by determining the space-time Kriging variance or by means of cross validation analysis. For de-trended data, the former is not usually applied when complex spatiotemporal trend functions are assigned. A Bayesian approach based on the bootstrap idea and sequential Gaussian simulation are employed to determine the uncertainty of the spatiotemporal model (trend and covariance) parameters. These stochastic modelling approaches produce multiple realizations, rank the prediction results on the basis of specified criteria, and capture the range of the uncertainty. The correlation of the spatiotemporal residuals is modeled using a non-separable space-time variogram based on the Spartan covariance family (Hristopulos and Elogne 2007, Varouchakis and Hristopulos 2013). We apply these simulation methods to investigate the uncertainty of groundwater level variations. The available dataset consists of bi-annual (dry and wet hydrological period) groundwater level measurements at 15 monitoring locations for the time period 1981 to 2010. The space-time trend function is approximated using a physical law that governs the groundwater flow in the aquifer in the presence of pumping. The main objective of this research is to compare the performance of two simulation methods for prediction uncertainty estimation. In addition, we investigate the performance of the Spartan spatiotemporal covariance function for spatiotemporal geostatistical analysis. Hristopulos, D.T. and Elogne, S.N. 2007. Analytic properties and covariance functions for a new class of generalized Gibbs

  8. Regional Sea Level Scenarios for Coastal Risk Management: Managing the Uncertainty of Future Sea Level Change and Extreme Water Levels for Department of Defense Coastal Sites Worldwide

    Science.gov (United States)

    2016-04-01

    authors and do not necessarily reflect the view of the authors' Agencies. ...contingent probabilities, given their dependence on non-probabilistic emissions futures, have extended the ranges of...flood risk provides confidence in the associated projection as a true minimum value for risk management purposes. The contemporary rate observed by

  9. Value at risk (VaR) in uncertainty: Analysis with parametric method and Black & Scholes simulations

    Directory of Open Access Journals (Sweden)

    Humberto Banda Ortiz

    2014-07-01

    Full Text Available VaR is the most widely accepted risk measure worldwide and the leading reference in any risk management assessment. However, its methodology has important limitations which make it unreliable in contexts of crisis or high uncertainty. For this reason, the aim of this work is to test the accuracy of VaR when it is employed in contexts of volatility, for which we compare the VaR outcomes in scenarios of both stability and uncertainty, using the parametric method and a historical simulation based on data generated with the Black & Scholes model. VaR's main objective is the prediction of the highest expected loss for any given portfolio, but even though it is considered a useful tool for risk management under conditions of market stability, we found that it is substantially inaccurate in contexts of crisis or high uncertainty. In addition, we found that the Black & Scholes simulations lead to underestimates of the expected losses in comparison with the parametric method, and that those disparities increase substantially in times of crisis. In the first section of this work we present a brief context of risk management in finance. In Section II we present the existing literature on the VaR concept, its methods, and its applications. In Section III we describe the methodology and assumptions used in this work. Section IV presents the findings. Finally, in Section V we present our conclusions.
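
    The contrast between the two methods can be sketched in a few lines: a parametric (variance-covariance) VaR versus a VaR taken from Monte Carlo losses simulated under Black & Scholes (geometric Brownian motion) assumptions. The portfolio value, drift, volatility, and horizon below are illustrative assumptions, not the paper's data.

        # Hedged sketch: parametric VaR vs. Monte Carlo VaR under GBM dynamics.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        S0, mu, sigma, horizon = 1_000_000.0, 0.08, 0.35, 10 / 252  # high-volatility regime
        alpha = 0.99

        # Parametric (variance-covariance) VaR assuming normal returns
        var_parametric = S0 * (norm.ppf(alpha) * sigma * np.sqrt(horizon) - mu * horizon)

        # Monte Carlo VaR from simulated GBM terminal values
        z = rng.standard_normal(100_000)
        ST = S0 * np.exp((mu - 0.5 * sigma**2) * horizon + sigma * np.sqrt(horizon) * z)
        losses = S0 - ST
        var_mc = np.quantile(losses, alpha)

        print(f"parametric 99% VaR:      {var_parametric:,.0f}")
        print(f"GBM Monte Carlo 99% VaR: {var_mc:,.0f}")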

  10. Parameterization and Uncertainty Analysis of SWAT model in Hydrological Simulation of Chaohe River Basin

    Science.gov (United States)

    Jie, M.; Zhang, J.; Guo, B. B.

    2017-12-01

    As a typical distributed hydrological model, the SWAT model also presents challenges in calibrating parameters and analyzing their uncertainty. This paper chooses the Chaohe River Basin, China, as the study area. After the SWAT model is established and the DEM data of the Chaohe river basin are loaded, the watershed is automatically divided into several sub-basins. Land use, soil, and slope are analyzed on the basis of the sub-basins, the hydrological response units (HRUs) of the study area are calculated, and, after running the SWAT model, the runoff simulation values in the watershed are obtained. On this basis, weather data and known daily runoff from three hydrological stations, combined with the SWAT-CUP automatic program and the manual adjustment method, are used for multi-site calibration of the model parameters. Furthermore, the GLUE algorithm is used to analyze the parameter uncertainty of the SWAT model. Through the sensitivity analysis, calibration, and uncertainty study of SWAT, the results indicate that the parameterization of the hydrological characteristics of the Chaohe river is successful and feasible and can be used to simulate the Chaohe river basin.
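
    As a hedged sketch of the GLUE step, the code below samples a one-parameter toy rainfall-runoff model (a stand-in, not SWAT), keeps "behavioural" parameter sets above a Nash-Sutcliffe threshold, and reports predictive bounds; the data, model, and threshold are invented for illustration.

        # Hedged sketch: GLUE-style uncertainty analysis on a toy model.
        import numpy as np

        rng = np.random.default_rng(4)
        rain = rng.gamma(2.0, 5.0, 200)                  # synthetic daily rainfall
        q_obs = 0.6 * rain + rng.normal(0, 2.0, 200)     # synthetic observed runoff

        def model(k, rain):
            return k * rain                              # one-parameter toy model

        samples = rng.uniform(0.0, 1.0, 5000)            # prior for runoff coefficient k
        ns = np.array([1 - np.sum((model(k, rain) - q_obs)**2)
                       / np.sum((q_obs - q_obs.mean())**2) for k in samples])

        behavioural = samples[ns > 0.5]                  # Nash-Sutcliffe threshold
        sims = np.array([model(k, rain) for k in behavioural])
        lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)
        print(f"{behavioural.size} behavioural sets; "
              f"mean 95% band width = {np.mean(upper - lower):.2f}")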

  11. Pushing precipitation to the extremes in distributed experiments: Recommendations for simulating wet and dry years

    Science.gov (United States)

    Knapp, Alan K.; Avolio, Meghan L.; Beier, Claus; Carroll, Charles J.W.; Collins, Scott L.; Dukes, Jeffrey S.; Fraser, Lauchlan H.; Griffin-Nolan, Robert J.; Hoover, David L.; Jentsch, Anke; Loik, Michael E.; Phillips, Richard P.; Post, Alison K.; Sala, Osvaldo E.; Slette, Ingrid J.; Yahdjian, Laura; Smith, Melinda D.

    2017-01-01

    Intensification of the global hydrological cycle, ranging from larger individual precipitation events to more extreme multiyear droughts, has the potential to cause widespread alterations in ecosystem structure and function. With evidence that the incidence of extreme precipitation years (defined statistically from historical precipitation records) is increasing, there is a clear need to identify ecosystems that are most vulnerable to these changes and understand why some ecosystems are more sensitive to extremes than others. To date, opportunistic studies of naturally occurring extreme precipitation years, combined with results from a relatively small number of experiments, have provided limited mechanistic understanding of differences in ecosystem sensitivity, suggesting that new approaches are needed. Coordinated distributed experiments (CDEs) arrayed across multiple ecosystem types and focused on water can enhance our understanding of differential ecosystem sensitivity to precipitation extremes, but there are many design challenges to overcome (e.g., cost, comparability, standardization). Here, we evaluate contemporary experimental approaches for manipulating precipitation under field conditions to inform the design of ‘Drought-Net’, a relatively low-cost CDE that simulates extreme precipitation years. A common method for imposing both dry and wet years is to alter each ambient precipitation event. We endorse this approach for imposing extreme precipitation years because it simultaneously alters other precipitation characteristics (i.e., event size) consistent with natural precipitation patterns. However, we do not advocate applying identical treatment levels at all sites – a common approach to standardization in CDEs. This is because precipitation variability varies >fivefold globally resulting in a wide range of ecosystem-specific thresholds for defining extreme precipitation years. For CDEs focused on precipitation extremes, treatments should be based

  12. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    Science.gov (United States)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
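
    A minimal sketch of the Bayesian ingredient follows: Dirichlet posteriors over jump probabilities, built from synthetic transition counts, propagated to the expected residence time in the known states. The counts, prior weight, and two-state setup are illustrative and are not the estimators of the cited work.

        # Hedged sketch: posterior uncertainty on residence time from jump counts.
        import numpy as np

        rng = np.random.default_rng(5)
        # counts[i, j]: observed jumps from known state i; the last column
        # aggregates jumps into the "unknown" (unexplored) region.
        counts = np.array([[50, 10, 2],
                           [8, 60, 1]])
        alpha_prior = 0.1                                 # Dirichlet pseudo-counts

        res_times = []
        for _ in range(2000):
            # Sample each row of the jump matrix from its Dirichlet posterior
            P = np.vstack([rng.dirichlet(row + alpha_prior) for row in counts])
            Q = P[:, :2]                                  # jumps within known states
            # Expected residence time (in jumps) before escaping the known space
            t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
            res_times.append(t[0])

        print("residence time from state 0: mean %.1f, 95%% CI (%.1f, %.1f)" %
              (np.mean(res_times), *np.percentile(res_times, [2.5, 97.5])))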

  13. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-12-01

    Models that aim to optimize the design of supply chain networks have gained interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which shows closer-to-optimal results with much faster solving times than those obtained from a conventional simulation-based optimization model. The efficacy of this proposed hybrid approach is promising, and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.

  14. Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations

    International Nuclear Information System (INIS)

    Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.

    2017-01-01

    Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of the LFR. • For a 10% change in the core inlet flow, the thermal power response is 0.58%. • For a 2.5% change in the inlet lead temperature, the response in power is 1.87%. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations with sample sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variations. The Advanced Lead Fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters due to effects of the mass flow and temperature of the liquid lead. The ALFRED core mathematical model developed in this work is fully transient and takes into account the heat transfer in an annular fuel pellet design, the thermo-fluid dynamics in the core, and the neutronic processes, which are modeled with point kinetics including fuel temperature and expansion feedback effects. The sensitivity, evaluated in terms of the relative standard deviation (RSD), showed that for a 10% change in the core inlet flow the response in thermal power is 0.58%, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with mass flow rate as the predictor variable showed statistically valid cubic correlations for the neutron flux, and a statistically valid linear relationship between the neutron flux and the lead temperature. No statistically valid correlation was observed for the reactivity as a function of either the mass flow rate or the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.

  15. Numerical simulation of runoff from extreme rainfall events in a mountain water catchment

    Directory of Open Access Journals (Sweden)

    J. Burguete

    2002-01-01

    Full Text Available A numerical model for unsteady shallow water flow over initially dry areas is applied to a case study in a small drainage area of the Spanish Ebro River basin. Several flood mitigation measures (reforestation, construction of a small reservoir, and channelization) are simulated in the model in order to compare different extreme rainfall-runoff scenarios.

  16. Latin hypercube sampling and geostatistical modeling of spatial uncertainty in a spatially explicit forest landscape model simulation

    Science.gov (United States)

    Chonggang Xu; Hong S. He; Yuanman Hu; Yu Chang; Xiuzhen Li; Rencang Bu

    2005-01-01

    Geostatistical stochastic simulation is typically combined with the Monte Carlo method to quantify the uncertainty in spatial model simulations. However, due to the relatively long running time of spatially explicit forest models, a result of their complexity, it is often infeasible to generate hundreds or thousands of Monte Carlo simulations. Thus, it is of great...
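
    A hedged sketch of building a small Latin hypercube design with SciPy follows; the two parameters and their ranges are invented placeholders for uncertain inputs of a forest landscape model.

        # Hedged sketch: a Latin hypercube design reduces the number of model runs
        # needed relative to plain Monte Carlo sampling.
        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=6)
        unit = sampler.random(n=50)                      # 50 runs instead of thousands
        # Scale to hypothetical ranges: dispersal distance (km), mortality rate (1/yr)
        design = qmc.scale(unit, l_bounds=[1.0, 0.01], u_bounds=[20.0, 0.10])

        for run_id, (dispersal, mortality) in enumerate(design[:3]):
            print(f"run {run_id}: dispersal={dispersal:.2f} km, mortality={mortality:.3f}")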

  17. Simulation of corn yields and parameter uncertainty analysis in Hebei and Sichuan, China

    Science.gov (United States)

    Fu, A.; Xue, Y.; Hartman, M. D.; Chandran, A.; Qiu, B.; Liu, Y.

    2016-12-01

    Corn is one of the most important crops in China. Research on the impacts of climate change and human activities on corn yields is important for understanding and mitigating the negative effects of environmental factors on corn yields and for maintaining stable corn production. Using climatic data, including daily temperature, precipitation, and solar radiation from 1948 to 2010, soil properties, observed corn yields, and farmland management information, corn yields in Sichuan and Hebei Provinces of China over the past 63 years were simulated using the Daycent model, and the results were evaluated using root mean square error, bias, simulation efficiency, and standard deviation. The primary climatic factors influencing corn yields were examined, the uncertainties of the climatic factors were analyzed, and the uncertainties of human activity parameters were also studied by varying the fertilization levels and cultivation practices. The results showed that: (1) the Daycent model is capable of simulating corn yields in Sichuan and Hebei provinces of China, and observed and simulated corn yields show a similar increasing trend with time; (2) the minimum daily temperature is the primary factor influencing corn yields in Sichuan, while in Hebei Province, daily temperature, precipitation, and wind speed significantly affect corn yields; (3) when the global warming trend was removed from the original data, simulated corn yields were lower than before, decreasing by about 687 kg/hm2 from 1992 to 2010; when the fertilization level and cultivation practice settings in the Daycent schedule file were varied by 50% and 75%, respectively, the simulated corn yields increased by 1206 kg/hm2 with the enhancement of the fertilization level and by 776 kg/hm2 with the improvement of the cultivation practice. This study provides a scientific basis for selecting a suitable fertilization level and cultivation practice for corn fields in China.

  18. Uncertainty Propagation Analysis for the Monte Carlo Time-Dependent Simulations

    International Nuclear Information System (INIS)

    Shaukata, Nadeem; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of)]

    2015-01-01

    In this paper, a conventional method to control the neutron population for super-critical systems is implemented. Instead of considering the cycles, the simulation is divided into time intervals. At the end of each time interval, neutron population control is applied to the banked neutrons. Randomly selected neutrons are discarded until the size of the neutron population matches the initial neutron histories at the beginning of the time simulation. A time-dependent simulation mode has also been implemented in the development version of the SERPENT 2 Monte Carlo code. In this mode, a sequential population control mechanism has been proposed for modeling of prompt super-critical systems. A Monte Carlo method has been properly used in the TART code for dynamic criticality calculations. For super-critical systems, the neutron population is allowed to grow over a period of time. The neutron population is uniformly combed to return it to the population that existed at the beginning of the time boundary. In this study, a conventional time-dependent Monte Carlo (TDMC) algorithm is implemented. For super-critical systems, the neutron population grows exponentially in the estimation of the neutron density tally, and the number of neutrons being tracked can exceed the memory of the computer. In order to control this exponential growth at the end of each time boundary, a conventional time cut-off population control strategy is included in TDMC. A scale factor is introduced to tally the desired neutron density at the end of each time boundary. The main purpose of this paper is the quantification of uncertainty propagation in neutron densities at the end of each time boundary for super-critical systems. This uncertainty is caused by the uncertainty resulting from the introduction of the scale factor. The effectiveness of TDMC is examined for a one-group infinite homogeneous problem (the rod model) and a two-group infinite homogeneous problem. The desired neutron density is tallied by the introduction of

  20. Arctic daily temperature and precipitation extremes: Observed and simulated physical behavior

    Science.gov (United States)

    Glisan, Justin Michael

    Simulations using a six-member ensemble of Pan-Arctic WRF (PAW) were produced on two Arctic domains at 50-km resolution to analyze precipitation and temperature extremes for various periods. The first study used a domain developed for the Regional Arctic Climate Model (RACM). Initial simulations revealed deep atmospheric circulation biases over the northern Pacific Ocean, manifested in the pressure, geopotential height, and temperature fields. Possible remedies for these large biases, such as modifying the physical domain or using different initial/boundary conditions, were unsuccessful. Spectral (interior) nudging was introduced as a way of constraining the model to be more consistent with observed behavior. However, such control over numerical model behavior raises concerns about how much nudging may affect unforced variability and extremes. Strong nudging may reduce or filter out extreme events, since the nudging pushes the model toward a relatively smooth, large-scale state. The question then becomes: what is the minimum spectral nudging needed to correct biases while not limiting the simulation of extreme events? To determine this, we used varying degrees of spectral nudging, with WRF's standard nudging as a reference point, during January and July 2007. Results suggest a marked lack of sensitivity to the degree of nudging. Moreover, given that nudging is an artificial forcing applied in the model, an important outcome of this work is that the nudging strength apparently can be considerably smaller than WRF's standard strength and still produce reliable simulations. In the remaining studies, we used the same PAW setup to analyze daily precipitation extremes simulated over a 19-year period on the CORDEX Arctic domain for winter and summer. We defined these seasons as the three-month periods leading up to and including the climatological sea ice maximum and minimum, respectively. Analysis focused on four North American regions defined using

  1. An overview of uncertainty quantification techniques with application to oceanic and oil-spill simulations

    KAUST Repository

    Iskandarani, Mohamed; Wang, Shitao; Srinivasan, Ashwanth; Carlisle Thacker, W.; Winokur, Justin; Knio, Omar

    2016-01-01

    We give an overview of four different ensemble-based techniques for uncertainty quantification and illustrate their application in the context of oil plume simulations. These techniques share the common paradigm of constructing a model proxy that efficiently captures the functional dependence of the model output on uncertain model inputs. This proxy is then used to explore the space of uncertain inputs using a large number of samples, so that reliable estimates of the model's output statistics can be calculated. Three of these techniques use polynomial chaos (PC) expansions to construct the model proxy, but they differ in their approach to determining the expansions' coefficients; the fourth technique uses Gaussian Process Regression (GPR). An integral plume model for simulating the Deepwater Horizon oil-gas blowout provides examples for illustrating the different techniques. A Monte Carlo ensemble of 50,000 model simulations is used for gauging the performance of the different proxies. The examples illustrate how regression-based techniques can outperform projection-based techniques when the model output is noisy. They also demonstrate that robust uncertainty analysis can be performed at a fraction of the cost of the Monte Carlo calculation.
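
    As a hedged illustration of the fourth technique, the sketch below fits a Gaussian Process Regression surrogate to a few runs of a stand-in function and then mines the surrogate with many cheap samples; the stand-in model, kernel settings, and sample counts are assumptions, not the plume model or the paper's configuration.

        # Hedged sketch: a GPR proxy replaces expensive model runs for UQ.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_model(x):            # stand-in for an oil-plume simulation
            return np.sin(3 * x) + 0.5 * x

        rng = np.random.default_rng(7)
        x_train = rng.uniform(0, 2, 20).reshape(-1, 1)   # 20 "simulations"
        y_train = expensive_model(x_train).ravel()

        gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
        gpr.fit(x_train, y_train)

        # Mine the surrogate with many samples of the uncertain input
        x_mc = rng.uniform(0, 2, 50_000).reshape(-1, 1)
        y_mc = gpr.predict(x_mc)
        print("surrogate mean %.3f, std %.3f" % (y_mc.mean(), y_mc.std()))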

  3. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    Energy Technology Data Exchange (ETDEWEB)

    Wilke, Jeremiah J [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Kenny, Joseph P. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, designs can first iterate through a simulator. This is particularly useful when test beds cannot be used, e.g. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the structural simulation toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics, such as call graphs, to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.

  4. Efficient uncertainty quantification in fully-integrated surface and subsurface hydrologic simulations

    Science.gov (United States)

    Miller, K. L.; Berg, S. J.; Davison, J. H.; Sudicky, E. A.; Forsyth, P. A.

    2018-01-01

    Although high performance computers and advanced numerical methods have made the application of fully-integrated surface and subsurface flow and transport models such as HydroGeoSphere commonplace, run times for large complex basin models can still be on the order of days to weeks, thus limiting the usefulness of traditional workhorse algorithms for uncertainty quantification (UQ) such as Latin Hypercube simulation (LHS) or Monte Carlo simulation (MCS), which generally require thousands of simulations to achieve an acceptable level of accuracy. In this paper we investigate non-intrusive polynomial chaos for uncertainty quantification, which, in contrast to random sampling methods (e.g., LHS and MCS), represents a model response of interest as a weighted sum of polynomials over the random inputs. Once a chaos expansion has been constructed, approximating the mean, covariance, probability density function, cumulative distribution function, and other common statistics, as well as local and global sensitivity measures, is straightforward and computationally inexpensive, thus making PCE an attractive UQ method for hydrologic models with long run times. Our polynomial chaos implementation was validated through comparison with analytical solutions as well as solutions obtained via LHS for simple numerical problems. It was then used to quantify parametric uncertainty in a series of numerical problems with increasing complexity, including a two-dimensional fully-saturated, steady flow and transient transport problem with six uncertain parameters and one quantity of interest; a one-dimensional variably-saturated column test involving transient flow and transport, four uncertain parameters, and two quantities of interest at 101 spatial locations and five different times each (1010 total); and a three-dimensional fully-integrated surface and subsurface flow and transport problem for a small test catchment involving seven uncertain parameters and three quantities of interest at
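
    A minimal sketch of the non-intrusive idea follows: regress Legendre polynomials on model runs over a uniform input and read the mean and variance off the coefficients. The stand-in model, sample size, and degree are illustrative; for a U(-1, 1) input, orthogonality gives mean = c0 and variance = Σ_{k≥1} c_k²/(2k+1).

        # Hedged sketch: regression-based polynomial chaos over one uniform input.
        import numpy as np
        from numpy.polynomial import legendre

        def model(xi):                           # stand-in response, xi in [-1, 1]
            return np.exp(0.5 * xi) + 0.1 * xi**3

        rng = np.random.default_rng(8)
        xi = rng.uniform(-1, 1, 200)             # 200 "model runs"
        y = model(xi)

        degree = 6
        coeffs = legendre.legfit(xi, y, degree)  # PC coefficients by least squares

        # Statistics from the coefficients (Legendre orthogonality on U(-1, 1))
        k = np.arange(1, degree + 1)
        mean = coeffs[0]
        var = np.sum(coeffs[1:]**2 / (2 * k + 1))
        print(f"PCE mean {mean:.4f}, std {np.sqrt(var):.4f}")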

  5. Adaptation to hydrological extremes through insurance: a financial fund simulation model under changing scenarios

    Science.gov (United States)

    Guzman, Diego; Mohor, Guilherme; Câmara, Clarissa; Mendiondo, Eduardo

    2017-04-01

    Research from around the world relates global environmental change to increased vulnerability to extreme events, such as heavy or scarce precipitation (floods and droughts). Hydrological disasters have caused increasing losses in recent years. Thus, risk transfer mechanisms, such as insurance, are being implemented to mitigate impacts, finance the recovery of the affected population, and promote the reduction of hydrological risks. However, the main problems in implementing these strategies are: first, partial knowledge of natural and anthropogenic climate change in terms of intensity and frequency; second, efficient risk reduction policies require accurate risk assessment, with careful consideration of costs; third, the uncertainty associated with the numerical models and input data used. The objective of this document is to introduce and discuss the feasibility of applying Hydrological Risk Transfer Models (HRTMs) as a strategy of adaptation to global climate change. The article presents the development of a methodology for collective and multi-sectoral vulnerability management, facing long-term hydrological risk, based on an insurance fund simulator. The methodology estimates the optimized premium as a function of willingness to pay (WTP) and the potential direct loss derived from hydrological risk. The proposed methodology structures the watershed insurance scheme in three analysis modules. First, the hazard module characterizes the hydrologic threat from recorded input series or from series modelled under IPCC/RCM-generated scenarios. Second, the vulnerability module calculates the potential economic loss for each sector evaluated as a function of the return period "TR". Finally, the finance module determines the value of the optimal aggregate premium by evaluating equiprobable scenarios of water vulnerability, taking into account variables such as the maximum limit of coverage, deductible

  6. Uncertainty analysis of atmospheric deposition simulation of radiocesium and radioiodine from Fukushima Daiichi Nuclear Power Plant

    Science.gov (United States)

    Morino, Yu; Ohara, Toshimasa; Yumimoto, Keiya

    2014-05-01

    Chemical transport models (CTMs) played key roles in understanding the atmospheric behavior and deposition patterns of radioactive materials emitted from the Fukushima Daiichi nuclear power plant (FDNPP) after the nuclear accident that accompanied the great Tohoku earthquake and tsunami on 11 March 2011. In this study, we assessed the uncertainties of atmospheric simulations by comparing observed and simulated deposition of radiocesium (137Cs) and radioiodine (131I). Airborne monitoring survey data were used to assess the model performance for 137Cs deposition patterns. We found that a simulation using emissions estimated with a regional-scale (~500 km) CTM reproduced the observed 137Cs deposition pattern in eastern Japan better than simulations using emissions estimated with local-scale (~50 km) or global-scale CTMs. In addition, we estimated the emission amount of 137Cs from FDNPP by combining a CTM, an a priori source term, and observed deposition data. This is the first use of airborne survey data of 137Cs deposition (more than 16,000 data points) as observational constraints in inverse modeling. The model simulation driven by the a posteriori source term achieved better agreement with 137Cs deposition measured by aircraft survey and at in-situ stations over eastern Japan. The wet deposition module was also evaluated. A simulation using a process-based wet deposition module reproduced the observations well, whereas a simulation using scavenging coefficients showed large uncertainties associated with empirical parameters. The best-available simulation reproduced the observed 137Cs deposition rates in high-deposition areas (≥10 kBq m-2) within one order of magnitude. Recently, a 131I deposition map was released and helped to evaluate model performance for 131I deposition patterns. The observed 131I/137Cs deposition ratio is higher in areas southwest of FDNPP than northwest of FDNPP, and this behavior was roughly reproduced by a CTM if we assume that released 131I is more in gas phase

  7. A framework to quantify uncertainty in simulations of oil transport in the ocean

    KAUST Repository

    Gonçalves, Rafael C.; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Chassignet, Eric; Knio, Omar

    2016-03-02

    An uncertainty quantification framework is developed for the DeepC Oil Model based on a nonintrusive polynomial chaos method. This allows the model's output to be presented in a probabilistic framework so that the model's predictions reflect the uncertainty in the model's input data. The new capability is illustrated by simulating the far-field dispersal of oil in a Deepwater Horizon blowout scenario. The uncertain input consisted of ocean current and oil droplet size data and the main model output analyzed is the ensuing oil concentration in the Gulf of Mexico. A 1331 member ensemble was used to construct a surrogate for the model which was then mined for statistical information. The mean and standard deviations in the oil concentration were calculated for up to 30 days, and the total contribution of each input parameter to the model's uncertainty was quantified at different depths. Also, probability density functions of oil concentration were constructed by sampling the surrogate and used to elaborate probabilistic hazard maps of oil impact. The performance of the surrogate was constantly monitored in order to demarcate the space-time zones where its estimates are reliable. © 2016. American Geophysical Union.

  8. Analyzing climate change impacts on water resources under uncertainty using an integrated simulation-optimization approach

    Science.gov (United States)

    Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.

    2018-01-01

    An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. In the ISO, uncertainties presented as both interval numbers and probability distributions can be reflected. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of the meteorological projections disclose increasing trends in temperature (e.g., minimum and maximum values) and precipitation. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including targets and shortages); (ii) water shortage would be enhanced from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates are. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and for helping identify appropriate water resources management strategies hedging against drought.

  9. Application of adaptive hierarchical sparse grid collocation to the uncertainty quantification of nuclear reactor simulators

    Energy Technology Data Exchange (ETDEWEB)

    Yankov, A.; Downar, T. [University of Michigan, 2355 Bonisteel Blvd, Ann Arbor, MI 48109 (United States)

    2013-07-01

    Recent efforts in the application of uncertainty quantification to nuclear systems have utilized methods based on generalized perturbation theory and stochastic sampling. While these methods have proven to be effective, they both have major drawbacks that may impede further progress. A relatively new approach based on spectral elements for uncertainty quantification is applied in this paper to several problems in reactor simulation. Spectral methods based on collocation attempt to couple the approximation-free nature of stochastic sampling methods with the determinism of generalized perturbation theory. The specific spectral method used in this paper employs both the Smolyak algorithm and adaptivity, using Newton-Cotes collocation points along with linear hat basis functions. Using this approach, a surrogate model for the outputs of a computer code is constructed hierarchically by adaptively refining the collocation grid until the interpolant has converged to a user-defined threshold. The method inherently fits into the framework of parallel computing and allows for the extraction of meaningful statistics and data that are not within reach of stochastic sampling and generalized perturbation theory. This paper aims to demonstrate the advantages of spectral methods, especially when compared to current methods used in reactor physics for uncertainty quantification, and to illustrate their full potential. (authors)
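    A minimal 1-D sketch of the hierarchy-plus-surplus idea behind adaptive collocation, assuming equidistant dyadic points, linear hat basis functions, and a toy response standing in for a reactor-code output; the actual method is multi-dimensional and uses the Smolyak algorithm.

```python
import numpy as np

# Toy response: stands in for a reactor-code output as a function of one
# uncertain input mapped to [0, 1]. The real method is multi-dimensional
# (Smolyak); this 1-D version only shows the hierarchy-plus-surplus idea.
def f(x):
    return np.tanh(10 * (x - 0.6))

def hat(x, center, h):
    # linear hat basis of half-width h centered on an equidistant point
    return np.maximum(0.0, 1.0 - np.abs(x - center) / h)

interp = []          # list of (center, h, surplus) defining the interpolant
active = [(0.5, 1)]  # (center, level) candidates still to be evaluated
tol = 1e-2

while active:
    center, level = active.pop()
    h = 0.5 ** level
    # hierarchical surplus: true value minus current interpolant at the node
    current = sum(s * hat(center, c, hh) for c, hh, s in interp)
    surplus = f(center) - current
    interp.append((center, h, surplus))
    # adaptivity: refine a node only while its surplus exceeds the tolerance
    if abs(surplus) > tol and level < 10:
        for child in (center - h / 2, center + h / 2):
            active.append((child, level + 1))

x = np.linspace(0.0, 1.0, 101)
approx = sum(s * hat(x, c, hh) for c, hh, s in interp)
print("collocation points used:", len(interp))
print("max interpolation error:", float(np.abs(approx - f(x)).max()))
```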

  11. Bayesian integration of flux tower data into a process-based simulator for quantifying uncertainty in simulated output

    Science.gov (United States)

    Raj, Rahul; van der Tol, Christiaan; Hamm, Nicholas Alexander Samuel; Stein, Alfred

    2018-01-01

    Parameters of a process-based forest growth simulator are difficult or impossible to obtain from field observations. Reliable estimates can be obtained using calibration against observations of output and state variables. In this study, we present a Bayesian framework to calibrate the widely used process-based simulator Biome-BGC against estimates of gross primary production (GPP). We used GPP partitioned from flux tower measurements of net ecosystem exchange over a 55-year-old Douglas fir stand as an example. The uncertainties of both the Biome-BGC parameters and the simulated GPP values were estimated. The calibrated parameters leaf and fine root turnover (LFRT), ratio of fine root carbon to leaf carbon (FRC : LC), ratio of carbon to nitrogen in leaf (C : Nleaf), canopy water interception coefficient (Wint), fraction of leaf nitrogen in RuBisCO (FLNR), and effective soil rooting depth (SD) characterize the photosynthesis and the carbon and nitrogen allocation in the forest. The calibration reduced the root mean square error and enhanced the Nash-Sutcliffe efficiency between simulated and flux tower daily GPP compared to the uncalibrated Biome-BGC. Nevertheless, the seasonal cycle of flux tower GPP was not reproduced exactly, and some overestimation in spring and underestimation in summer remained after calibration. We hypothesized that the phenology exhibited a seasonal cycle that was not accurately reproduced by the simulator. We investigated this by calibrating the Biome-BGC to each month's flux tower GPP separately. As expected, the simulated GPP improved, but the calibrated parameter values suggested that the seasonal cycle of state variables in the simulator could be improved. It was concluded that the Bayesian framework for calibration can reveal features of the modelled physical processes and identify aspects of the process simulator that are too rigid.
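    A minimal sketch of the Bayesian calibration loop described above, using random-walk Metropolis-Hastings, a single hypothetical parameter, and a toy daily-GPP function standing in for Biome-BGC (the study calibrated six parameters against flux tower GPP).

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for Biome-BGC: daily GPP as a function of one parameter
# (an FLNR-like light-use scaling). The real study calibrated six parameters.
days = np.arange(365)
def simulate_gpp(flnr):
    return flnr * (6.0 + 4.0 * np.sin(2 * np.pi * (days - 80) / 365))

# Synthetic "flux tower" GPP with observation noise, true parameter 0.08.
obs = simulate_gpp(0.08) + rng.normal(0, 0.5, days.size)
sigma_obs = 0.5

def log_posterior(flnr):
    if not (0.01 <= flnr <= 0.2):            # uniform prior bounds
        return -np.inf
    resid = obs - simulate_gpp(flnr)
    return -0.5 * np.sum((resid / sigma_obs) ** 2)

# Random-walk Metropolis-Hastings.
chain = np.empty(5000)
theta, lp = 0.05, log_posterior(0.05)
for i in range(chain.size):
    prop = theta + rng.normal(0, 0.005)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept/reject step
        theta, lp = prop, lp_prop
    chain[i] = theta

posterior = chain[1000:]                      # discard burn-in
print(f"posterior mean = {posterior.mean():.4f}, "
      f"95% CI = {np.percentile(posterior, [2.5, 97.5])}")
```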

  13. Global characteristics of extreme winters from a multi-millennial simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, B.G. [CSIRO Marine and Atmospheric Research, PO Box 1, Aspendale (Australia)

    2011-10-15

    Output from a multi-millennial simulation with the CSIRO Mark 2 coupled global climatic model has been analysed to determine the principal characteristics of extreme winters over the globe for "present conditions". Thus, this study is not concerned with possible changes in winter conditions associated with anthropogenically induced climatic change. Defining an extreme winter as having a surface temperature anomaly below -2 standard deviations (sd) revealed a general occurrence rate over the globe of between 100 and 200 over a 6,000-year period of the simulation, with somewhat higher values over northwest North America. For temperature anomalies below -3 sd the corresponding occurrence rate drops to about 10. Spatial correlation studies revealed that extreme winters over regions in Europe, North America or Asia were very limited geographically, with time series of the surface temperature anomalies for these regions having mutual correlation coefficients of about 0.2. The temporal occurrence rates of winters (summers) having sd below -3 (above +3) were very asymmetric and sporadic, suggesting that such events arise from stochastic influences. Multi-year sequences of extreme winters were comparatively rare events. Detailed analysis revealed that the temporal and spatial evolution of the monthly surface temperature anomalies associated with an individual extreme winter were well replicated in the simulation, as were daily time series of such anomalies. Apart from an influence of the North Atlantic Oscillation on extreme winters in Europe, other prominent climatic oscillations were very poorly correlated with such winters. Rather modest winter temperature anomalies were found in the southern hemisphere. (orig.)

  14. Attribution of Extreme Rainfall Events in the South of France Using EURO-CORDEX Simulations

    Science.gov (United States)

    Luu, L. N.; Vautard, R.; Yiou, P.

    2017-12-01

    The Mediterranean region regularly undergoes episodes of intense precipitation in the fall season that exceed 300 mm a day. This study focuses on the role of climate change in the dynamics of the events that occur in the South of France. We used an ensemble of 10 EURO-CORDEX model simulations with two horizontal resolutions (EUR-11: 0.11° and EUR-44: 0.44°) for the attribution of extreme rainfall in the fall in the Cévennes mountain range (South of France). The biases of the simulations were corrected with a simple scaling adjustment and a quantile correction (CDFt). This produces five datasets, including EUR-44 and EUR-11 with and without scaling adjustment and CDFt-EUR-11, on which we test the impact of resolution and bias correction on the extremes. After pooling all models together, each dataset is fitted with a stationary Generalized Extreme Value distribution for several periods to estimate a climate change signal in the tail of the distribution of extreme rainfall in the Cévennes region. Those changes are then interpreted with a scaling model that links extreme rainfall with mean and maximum daily temperature. The results show that the higher-resolution simulations with bias adjustment indicate a robust increase in the intensity and likelihood of occurrence of autumn extreme rainfall in the area in the current climate compared with the historical climate. The exceedance probability of a 1-in-1000-year event in the historical climate may increase by a factor of 1.8 under the current climate, with a confidence interval of 0.4 to 5.3, according to the CDFt bias-adjusted EUR-11. The change in magnitude appears to follow the Clausius-Clapeyron relation, which indicates a 7% increase in rainfall per 1°C increase in temperature.
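    A minimal sketch of the statistical core: fit stationary GEV distributions to pooled annual maxima for two periods, then derive the probability ratio of a 1-in-1000-year event. All numbers below are synthetic, not the study's.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)

# Synthetic annual-maximum daily rainfall (mm) for two pooled ensembles;
# the study fitted stationary GEVs to bias-adjusted EURO-CORDEX output.
hist = genextreme.rvs(c=-0.1, loc=120, scale=40, size=500, random_state=rng)
curr = genextreme.rvs(c=-0.1, loc=130, scale=44, size=500, random_state=rng)

# Fit a stationary GEV to each period.
shape_h, loc_h, scale_h = genextreme.fit(hist)
shape_c, loc_c, scale_c = genextreme.fit(curr)

# 1-in-1000-year level in the historical climate...
x1000 = genextreme.isf(1e-3, shape_h, loc_h, scale_h)
# ...and its exceedance probability in the current climate.
p_curr = genextreme.sf(x1000, shape_c, loc_c, scale_c)

print(f"historical 1000-yr level: {x1000:.1f} mm/day")
print(f"probability ratio (risk ratio): {p_curr / 1e-3:.2f}")
```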

  15. The influence of wheelchair propulsion technique on upper extremity muscle demand: a simulation study.

    Science.gov (United States)

    Rankin, Jeffery W; Kwarciak, Andrew M; Richter, W Mark; Neptune, Richard R

    2012-11-01

    The majority of manual wheelchair users will experience upper extremity injuries or pain, in part due to the high force requirements, repetitive motion and extreme joint postures associated with wheelchair propulsion. Recent studies have identified cadence, contact angle and peak force as important factors for reducing upper extremity demand during propulsion. However, studies often make comparisons between populations (e.g., able-bodied vs. paraplegic) or do not investigate specific measures of upper extremity demand. The purpose of this study was to use a musculoskeletal model and forward dynamics simulations of wheelchair propulsion to investigate how altering cadence, peak force and contact angle influence individual muscle demand. Forward dynamics simulations of wheelchair propulsion were generated to emulate group-averaged experimental data during four conditions: 1) self-selected propulsion technique, and while 2) minimizing cadence, 3) maximizing contact angle, and 4) minimizing peak force using biofeedback. Simulations were used to determine individual muscle mechanical power and stress as measures of muscle demand. Minimizing peak force and cadence had the lowest muscle power requirements. However, minimizing peak force increased cadence and recovery power, while minimizing cadence increased average muscle stress. Maximizing contact angle increased muscle stress and had the highest muscle power requirements. Minimizing cadence appears to have the most potential for reducing muscle demand and fatigue, which could decrease upper extremity injuries and pain. However, altering any of these variables to extreme values appears to be less effective; instead small to moderate changes may better reduce overall muscle demand. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Future changes in Asian summer monsoon precipitation extremes as inferred from 20-km AGCM simulations

    Science.gov (United States)

    Lui, Yuk Sing; Tam, Chi-Yung; Lau, Ngar-Cheung

    2018-04-01

    This study examines the impacts of climate change on precipitation extremes in the Asian monsoon region during boreal summer, based on simulations from the 20-km Meteorological Research Institute atmospheric general circulation model. The model can capture the summertime monsoon rainfall, with characteristics similar to those from Tropical Rainfall Measuring Mission and Asian Precipitation-Highly-Resolved Observational Data Integration Towards Evaluation. By comparing the 2075-2099 with the present-day climate simulations, there is a robust increase of the mean rainfall in many locations due to a warmer climate. Over southeastern China, the Baiu rainband, Bay of Bengal and central India, extreme precipitation rates are also enhanced in the future, which can be inferred from increases of the 95th percentile of daily precipitation, the maximum accumulated precipitation in 5 consecutive days, the simple daily precipitation intensity index, and the scale parameter of the fitted gamma distribution. In these regions, with the exception of the Baiu rainband, most of these metrics give a fractional change of extreme rainfall per degree increase of the lower-tropospheric temperature of 5 to 8.5% K-1, roughly consistent with the Clausius-Clapeyron relation. However, over the Baiu area extreme precipitation change scales as 3.5% K-1 only. We have also stratified the rainfall data into those associated with tropical cyclones (TC) and those with other weather systems. The AGCM gives an increase of the accumulated TC rainfall over southeastern China, and a decrease in southern Japan in the future climate. The latter can be attributed to suppressed TC occurrence in southern Japan, whereas increased accumulated rainfall over southeastern China is due to more intense TC rain rate under global warming. Overall, non-TC weather systems are the main contributor to enhanced precipitation extremes in various locations. In the future, TC activities over southeastern China tend to further

  17. Stochastic simulation experiment to assess radar rainfall retrieval uncertainties associated with attenuation and its correction

    Directory of Open Access Journals (Sweden)

    R. Uijlenhoet

    2008-03-01

    As rainfall constitutes the main source of water for terrestrial hydrological processes, accurate and reliable measurement and prediction of its spatial and temporal distribution over a wide range of scales is an important goal for hydrology. We investigate the potential of ground-based weather radar to provide such measurements through a theoretical analysis of some of the associated observation uncertainties. A stochastic model of range profiles of raindrop size distributions is employed in a Monte Carlo simulation experiment to investigate the rainfall retrieval uncertainties associated with weather radars operating at X-, C-, and S-band. We focus in particular on the errors and uncertainties associated with rain-induced signal attenuation and its correction for incoherent, non-polarimetric, single-frequency, operational weather radars. The performance of two attenuation correction schemes, the (forward) Hitschfeld-Bordan algorithm and the (backward) Marzoug-Amayenc algorithm, is analyzed for both moderate (assuming a 50 km path length) and intense Mediterranean rainfall (for a 30 km path). A comparison shows that the backward correction algorithm is more stable and accurate than the forward algorithm (with a bias on the order of a few percent for the former, compared to tens of percent for the latter), provided reliable estimates of the total path-integrated attenuation are available. Moreover, the bias and root mean square error associated with each algorithm are quantified as a function of path-averaged rain rate and distance from the radar in order to provide a plausible order of magnitude for the uncertainty in radar-retrieved rain rates for hydrological applications.
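    To make the forward scheme concrete, here is a gate-by-gate sketch in the spirit of the Hitschfeld-Bordan correction, assuming an illustrative k-Z power law (the coefficients a and b are placeholders, not those of the study). With a perfectly known k-Z relation the correction is essentially exact; the instability quantified in the record arises when the coefficients or the calibration are off.

```python
import numpy as np

# Gate-by-gate forward correction in the spirit of Hitschfeld-Bordan:
# attenuation accumulated over the gates already passed is added back
# before the k-Z relation is applied at the current gate. The k-Z
# coefficients are illustrative X-band-like values, not the study's.
a, b = 3.0e-4, 0.78          # k = a * Zlin**b, with k in dB/km (assumed)
dr = 0.5                     # gate spacing, km
n_gates = 100
r = np.arange(n_gates) * dr
true_dbz = 30 + 10 * np.exp(-((r - 15.0) ** 2) / 50.0)   # synthetic rain cell

# Simulate what the radar measures: reflectivity lowered by the two-way
# attenuation of all rain between the radar and each gate.
meas_dbz = np.empty(n_gates)
pia = 0.0                    # path-integrated attenuation, dB
for i in range(n_gates):
    meas_dbz[i] = true_dbz[i] - pia
    k = a * (10.0 ** (true_dbz[i] / 10.0)) ** b
    pia += 2.0 * k * dr

# Forward correction: march outward, re-estimating k from the corrected Z.
corr_dbz = np.empty(n_gates)
pia_hat = 0.0
for i in range(n_gates):
    corr_dbz[i] = meas_dbz[i] + pia_hat
    k_hat = a * (10.0 ** (corr_dbz[i] / 10.0)) ** b
    pia_hat += 2.0 * k_hat * dr

print("total two-way PIA (dB):", round(pia, 2))
print("max residual error (dB):", float(np.abs(corr_dbz - true_dbz).max()))
```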

  18. Effects of Input Data Content on the Uncertainty of Simulating Water Resources

    Directory of Open Access Journals (Sweden)

    Carla Camargos

    2018-05-01

    The widely used, partly deterministic Soil and Water Assessment Tool (SWAT) requires a large amount of spatial input data, such as a digital elevation model (DEM), land use, and soil maps. Modelers make an effort to apply the most specific data possible for the study area to reflect the heterogeneous characteristics of landscapes. Regional data, especially with fine resolution, are often preferred. However, such data are not always available and can be computationally demanding. Despite being coarser, global data are usually free and available to the public. Previous studies revealed the importance of different input maps for single investigations. However, it remains unknown whether higher-resolution data can lead to reliable results. This study investigates how global and regional input datasets affect parameter uncertainty when estimating river discharges. We analyze eight different setups of the SWAT model for a catchment in Luxembourg, combining different land-use, elevation, and soil input data. The Metropolis-Hastings Markov Chain Monte Carlo (MCMC) algorithm is used to infer posterior model parameter uncertainty. We conclude that our higher-resolution DEM improves the general model performance in reproducing low flows by 10%. The less detailed soil map improved the fit of low flows by 25%. In addition, more detailed land-use maps reduce the bias of the model discharge simulations by 50%. Also, despite presenting similar parameter uncertainty (P-factor ranging from 0.34 to 0.41 and R-factor from 0.41 to 0.45) for all setups, the results show disparate parameter posterior distributions. This indicates that, when not all sources of uncertainty are assessed simultaneously, the fitted parameter values compensate for the unaccounted ones. We conclude that our results can give some guidance for future SWAT applications in the selection of the degree of detail for input data.
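    The P-factor and R-factor quoted above can be computed from an ensemble of behavioral simulations as sketched below, following the common SUFI-2-style definitions; the data and names here are synthetic stand-ins, not the study's.

```python
import numpy as np

rng = np.random.default_rng(4)

# Ensemble of simulated discharge series (e.g., from behavioural MCMC
# parameter samples) and one observed series; synthetic numbers here.
n_days, n_sims = 365, 200
obs = 5 + 2 * np.sin(np.linspace(0, 4 * np.pi, n_days)) + rng.normal(0, 0.3, n_days)
sims = obs + rng.normal(0, 0.8, (n_sims, n_days))   # stand-in ensemble

# 95PPU band: 2.5% and 97.5% quantiles of the simulated ensemble per day.
lo = np.percentile(sims, 2.5, axis=0)
hi = np.percentile(sims, 97.5, axis=0)

# P-factor: fraction of observations falling inside the 95PPU band.
p_factor = np.mean((obs >= lo) & (obs <= hi))

# R-factor: mean band width divided by the standard deviation of observations.
r_factor = np.mean(hi - lo) / obs.std()

print(f"P-factor = {p_factor:.2f}, R-factor = {r_factor:.2f}")
```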

  19. RELAP5 simulation of surge line break accident using combined and best estimate plus uncertainty approaches

    International Nuclear Information System (INIS)

    Kristof, Marian; Kliment, Tomas; Petruzzi, Alessandro; Lipka, Jozef

    2009-01-01

    Licensing calculations in a majority of countries worldwide still rely on the application of a combined approach: a best-estimate computer code without evaluation of the code model uncertainty, together with conservative assumptions on initial and boundary conditions, on the availability of systems and components, and additional conservative assumptions. However, the best estimate plus uncertainty (BEPU) approach, representing the state of the art in the area of safety analysis, has a clear potential to replace the currently used combined approach. There are several applications of the BEPU approach in the area of licensing calculations, but some questions remain under discussion, notably from the regulatory point of view. In order to find a proper solution to these questions and to support the BEPU approach in becoming a standard approach for licensing calculations, a broad comparison of both approaches for various transients is necessary. Results of one such comparison, on the example of the VVER-440/213 NPP pressurizer surge line break event, are described in this paper. A Kv-scaled simulation based on the PH4-SLB experiment from the PMK-2 integral test facility, applying its volume and power scaling factors, is performed for qualitative assessment of the RELAP5 computer code calculation using the VVER-440/213 plant model. Existing hardware differences are identified and explained. The CIAU method is adopted for performing the uncertainty evaluation. Results using the combined and BEPU approaches are in agreement with the experimental values from the PMK-2 facility. Only a minimal difference between the combined and BEPU approaches has been observed in the evaluation of the safety margins for the peak cladding temperature. Benefits of the CIAU uncertainty method are highlighted.

  20. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
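    A minimal sketch of the analysis-of-variance decomposition described above, splitting the total variance of a simulated error metric into forcing, parameter, and interaction contributions (synthetic values; with one replicate per cell the interaction term also absorbs the residual).

```python
import numpy as np

rng = np.random.default_rng(5)

# Streamflow error metric for every combination of rainfall ensemble member
# (forcing) and behavioural parameter set; synthetic values for illustration.
n_forcing, n_params = 20, 30
forcing_eff = rng.normal(0, 1.0, n_forcing)[:, None]
param_eff = rng.normal(0, 0.5, n_params)[None, :]
interaction = rng.normal(0, 0.2, (n_forcing, n_params))
y = 10 + forcing_eff + param_eff + interaction

# Two-way ANOVA sums of squares (one replicate per cell, so the
# interaction term absorbs the residual).
grand = y.mean()
ss_forcing = n_params * np.sum((y.mean(axis=1) - grand) ** 2)
ss_params = n_forcing * np.sum((y.mean(axis=0) - grand) ** 2)
ss_total = np.sum((y - grand) ** 2)
ss_inter = ss_total - ss_forcing - ss_params

for name, ss in [("forcing", ss_forcing), ("parameters", ss_params),
                 ("interaction", ss_inter)]:
    print(f"{name:12s}: {100 * ss / ss_total:5.1f}% of total variance")
```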

  1. muView: A Visual Analysis System for Exploring Uncertainty in Myocardial Ischemia Simulations

    KAUST Repository

    Rosen, Paul

    2016-05-23

    In this paper we describe the Myocardial Uncertainty Viewer (muView or μView) system for exploring data stemming from the simulation of cardiac ischemia. The simulation uses a collection of conductivity values to understand how ischemic regions affect the undamaged anisotropic heart tissue. The data resulting from the simulation is multi-valued and volumetric, and thus, for every data point, we have a collection of samples describing cardiac electrical properties. μView combines a suite of visual analysis methods to explore the area surrounding the ischemic zone and identify how perturbations of variables change the propagation of their effects. In addition to presenting a collection of visualization techniques, which individually highlight different aspects of the data, the coordinated view system forms a cohesive environment for exploring the simulations. We also discuss the findings of our study, which are helping to steer further development of the simulation and strengthen our collaboration with the biomedical engineers attempting to understand the phenomenon.

  3. Effects of Uncertainties in Electric Field Boundary Conditions for Ring Current Simulations

    Science.gov (United States)

    Chen, Margaret W.; O'Brien, T. Paul; Lemon, Colby L.; Guild, Timothy B.

    2018-01-01

    Physics-based simulation results can vary widely depending on the applied boundary conditions. As a first step toward assessing the effect of boundary conditions on ring current simulations, we analyze the uncertainty of the cross-polar cap potential (CPCP) electric field boundary conditions applied to the Rice Convection Model-Equilibrium (RCM-E). The empirical Weimer model of CPCP is chosen as the reference model and Defense Meteorological Satellite Program CPCP measurements as the reference data. Using temporal correlations from a statistical analysis of the "errors" between the reference model and data, we construct a Monte Carlo CPCP discrete time series model that can be generalized to other model boundary conditions. RCM-E simulations using electric field boundary conditions from the reference model and from 20 randomly generated Monte Carlo discrete time series of CPCP are performed for two large storms. During the 10 August 2000 storm main phase, the proton density at 10 RE at midnight was observed to be low, and the observed Dst index is bounded by the simulated Dst values. In contrast, the simulated Dst values during the recovery phases of the 10 August 2000 and 31 August 2005 storms tend to systematically underestimate the observed late Dst recovery. This suggests a need to improve the accuracy of particle loss calculations in the RCM-E model. Application of this technique can help modelers make efficient choices between investing more effort in improving the specification of boundary conditions and improving the descriptions of physical processes.
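    One plausible way to realize the Monte Carlo CPCP time-series model described above is an autoregressive error process whose correlation and variance are taken from the model-minus-data residual statistics. The sketch below assumes an AR(1) structure and illustrative numbers.

```python
import numpy as np

rng = np.random.default_rng(6)

# Reference CPCP time series (kV) from an empirical model; synthetic here.
n_hours = 240
t = np.arange(n_hours)
cpcp_ref = 60 + 40 * np.exp(-((t - 100) ** 2) / 800)   # idealized storm

# AR(1) error model whose lag-1 correlation and variance would come from
# a statistical analysis of model-minus-data residuals (values assumed).
phi, sigma = 0.9, 8.0
def monte_carlo_cpcp():
    err = np.empty(n_hours)
    err[0] = rng.normal(0, sigma)
    for i in range(1, n_hours):
        # innovations scaled so the marginal std stays equal to sigma
        err[i] = phi * err[i - 1] + rng.normal(0, sigma * np.sqrt(1 - phi**2))
    return np.clip(cpcp_ref + err, 0, None)   # CPCP cannot be negative

ensemble = np.array([monte_carlo_cpcp() for _ in range(20)])
print("ensemble spread at storm peak (kV):",
      ensemble[:, 100].min().round(1), "-", ensemble[:, 100].max().round(1))
```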

  4. Threat and defense as goal regulation: from implicit goal conflict to anxious uncertainty, reactive approach motivation, and ideological extremism.

    Science.gov (United States)

    Nash, Kyle; McGregor, Ian; Prentice, Mike

    2011-12-01

    Four studies investigated a goal regulation view of anxious uncertainty threat (Gray & McNaughton, 2000) and ideological defense. Participants (N = 444) were randomly assigned to have achievement or relationship goals implicitly primed. The implicit goal primes were followed by randomly assigned achievement or relationship threats that have reliably caused generalized, reactive approach motivation and ideological defense in past research. The threats caused anxious uncertainty (Study 1), reactive approach motivation (Studies 2 and 3), and reactive ideological conviction (Study 4) only when threat-relevant goals had first been primed, but not when threat-irrelevant goals had first been primed. Reactive ideological conviction (Study 4) was eliminated if participants were given an opportunity to attribute their anxiety to a mundane source. Results support a goal regulation view of anxious uncertainty, threat, and defense with potential for integrating theories of defensive compensation.

  5. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

    Distributed hydrological models are important tools in water management as they account for the spatial variability of hydrological data and are able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e., a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region of space containing the parameter sets that were considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.

  6. Uncertainty in parameterisation and model structure affect simulation results in coupled ecohydrological models

    Directory of Open Access Journals (Sweden)

    S. Arnold

    2009-10-01

    In this paper we develop and apply a conceptual ecohydrological model to investigate the effects of model structure and parameter uncertainty on the simulation of vegetation structure and hydrological dynamics. The model is applied to a typical water-limited riparian ecosystem along an ephemeral river: the middle section of the Kuiseb River in Namibia. We modelled this system by coupling an ecological model with a conceptual hydrological model. The hydrological model is storage based, with stochastic forcing from the flood. The ecosystem is modelled with a population model and represents three dominant riparian plant populations. To account for uncertainty about population dynamics, we applied three model versions with increasing complexity. Population parameters were found by Latin hypercube sampling of the parameter space, with the constraint that three species should coexist as observed. Two of the three models were able to reproduce the observed coexistence. However, the two models relied on different coexistence mechanisms and reacted differently to changes in the long-term memory of the flood forcing. The coexistence requirement strongly constrained the parameter space for both successful models. Only very few parameter sets (0.5% of 150,000 samples) allowed for coexistence in a representative number of repeated simulations (at least 10 out of 100), and the success of the coexistence mechanism was controlled by the combination of population parameters. The ensemble statistics of average values of hydrologic variables like transpiration and depth to groundwater were similar for both models, suggesting that they were mainly controlled by the applied hydrological model. The ensemble statistics of the fluctuations of depth to groundwater and transpiration, however, differed significantly, suggesting that they were controlled by the applied ecological model and coexistence mechanisms. Our study emphasizes that uncertainty about ecosystem

  7. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Agent-based simulation has become a prominent approach in the computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-systems problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using an agent-based simulation method. By employing a computational experimentation methodology, we model group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two constructed scenarios and a real case study. We show how various strategies could impact group emotion evolution in terms of complex emergence and cumulative emotional influence in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution problems in the extreme incident, emergency, and security study domains.
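    A minimal agent-based sketch of emotion contagion with a single intervention step, in the spirit of the study; the network, update rule, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Minimal agent-based sketch: agents on a random contact network update a
# scalar "negative emotion" by contagion from neighbours; an intervention
# (e.g., an authoritative information release) damps emotions at one step.
n_agents, n_steps, k_contacts = 200, 50, 5
emotion = rng.uniform(0, 0.3, n_agents)
emotion[rng.choice(n_agents, 10, replace=False)] = 0.9   # agitated seed group

contacts = np.array([rng.choice(n_agents, k_contacts, replace=False)
                     for _ in range(n_agents)])

susceptibility = 0.3
intervention_step, damping = 25, 0.4

mean_trace = []
for step in range(n_steps):
    neighbour_mean = emotion[contacts].mean(axis=1)
    # contagion pulls each agent toward the mean emotion of its contacts
    emotion += susceptibility * (neighbour_mean - emotion)
    if step == intervention_step:
        emotion *= (1 - damping)            # strategy intervention
    mean_trace.append(emotion.mean())

print(f"mean emotion before intervention: {mean_trace[intervention_step - 1]:.3f}")
print(f"mean emotion at end:              {mean_trace[-1]:.3f}")
```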

  8. Projections of Future Precipitation Extremes Over Europe: A Multimodel Assessment of Climate Simulations

    Science.gov (United States)

    Rajczak, Jan; Schär, Christoph

    2017-10-01

    Projections of precipitation and its extremes over the European continent are analyzed in an extensive multimodel ensemble of 12 and 50 km resolution EURO-CORDEX Regional Climate Models (RCMs) forced by the RCP2.6, RCP4.5, and RCP8.5 (Representative Concentration Pathway) aerosol and greenhouse gas emission scenarios. A systematic intercomparison with ENSEMBLES RCMs is carried out, such that information is provided for an unprecedentedly large data set of 100 RCM simulations in total. An evaluation finds very reasonable skill for the EURO-CORDEX models in simulating temporal and geographical variations of (mean and heavy) precipitation at both horizontal resolutions. Heavy and extreme precipitation events are projected to intensify across most of Europe throughout the whole year. All considered models agree on a distinct intensification of extremes, often by more than +20%, in winter and fall and over central and northern Europe. A reduction of rainy days and mean precipitation in summer is simulated by a large majority of models in the Mediterranean area, but the intermodel spread between the simulations is large. In central Europe and France during summer, models project decreases in precipitation but more intense heavy and extreme rainfalls. Comparison to previous RCM projections from ENSEMBLES reveals consistency but slight differences in summer, where reductions in southern European precipitation are not as pronounced as previously projected. The projected changes of the European hydrological cycle may have substantial impacts on environmental and anthropogenic systems. In particular, the simulations indicate a rising probability of summertime drought in southern Europe and more frequent and intense heavy rainfall across all of Europe.

  9. Coastal aquifer management under parameter uncertainty: Ensemble surrogate modeling based simulation-optimization

    Science.gov (United States)

    Janardhanan, S.; Datta, B.

    2011-12-01

    Surrogate models are widely used to develop computationally efficient simulation-optimization models to solve complex groundwater management problems. Artificial intelligence based models are most often used for this purpose, trained using predictor-predictand data obtained from a numerical simulation model. Most often this is implemented with the assumption that the parameters and boundary conditions used in the numerical simulation model are perfectly known. However, in most practical situations these values are uncertain, and under such circumstances the applicability of such approximation surrogates becomes limited. In our study we develop a surrogate model based coupled simulation-optimization methodology for determining optimal pumping strategies for coastal aquifers considering parameter uncertainty. An ensemble surrogate modeling approach is used along with multiple-realization optimization. The methodology is used to solve a multi-objective coastal aquifer management problem considering two conflicting objectives. Hydraulic conductivity and aquifer recharge are considered as uncertain values. The three-dimensional coupled flow and transport simulation model FEMWATER is used to simulate the aquifer responses for a number of scenarios corresponding to Latin hypercube samples of pumping and uncertain parameters, to generate input-output patterns for training the surrogate models. Non-parametric bootstrap sampling of this original data set is used to generate multiple data sets which belong to different regions in the multi-dimensional decision and parameter space. These data sets are used to train and test multiple surrogate models based on genetic programming. The ensemble of surrogate models is then linked to a multi-objective genetic algorithm to solve the pumping optimization problem. Two conflicting objectives, viz., maximizing total pumping from beneficial wells and minimizing the total pumping from barrier wells for hydraulic control of
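    The bootstrap-ensemble surrogate idea can be sketched compactly: fit one surrogate per bootstrap resample of the simulator's input-output data, then use the ensemble spread as a measure of prediction uncertainty. The snippet below uses a simple polynomial regression as a stand-in for the study's genetic-programming surrogates and a toy function in place of FEMWATER.

```python
import numpy as np

rng = np.random.default_rng(8)

# Stand-in "numerical simulation model": salinity response as a function of
# pumping rate and an uncertain hydraulic conductivity (toy function).
def simulator(pumping, conductivity):
    return 0.5 * pumping**2 / conductivity + 0.1 * pumping

# Training set over the combined decision + uncertain-parameter space.
n_train = 150
X = np.column_stack([rng.uniform(1, 10, n_train),      # pumping
                     rng.uniform(5, 20, n_train)])     # conductivity
y = simulator(X[:, 0], X[:, 1]) + rng.normal(0, 0.05, n_train)

def features(X):
    p, c = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(p), p, c, p * c, p**2, c**2])

# Ensemble of surrogates, each fit to a bootstrap resample of the data.
n_models = 30
coefs = []
for _ in range(n_models):
    idx = rng.integers(0, n_train, n_train)            # bootstrap sample
    beta, *_ = np.linalg.lstsq(features(X[idx]), y[idx], rcond=None)
    coefs.append(beta)

# Ensemble prediction with spread, e.g. for one candidate pumping strategy.
X_new = np.array([[7.0, 12.0]])
preds = np.array([features(X_new) @ b for b in coefs]).ravel()
print(f"ensemble mean = {preds.mean():.3f}, spread (std) = {preds.std():.3f}")
```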

  10. Modeling extreme "Carrington-type" space weather events using three-dimensional global MHD simulations

    Science.gov (United States)

    Ngwira, Chigomezyo M.; Pulkkinen, Antti; Kuznetsova, Maria M.; Glocer, Alex

    2014-06-01

    There is a growing concern over possible severe societal consequences related to adverse space weather impacts on man-made technological infrastructure. In the last two decades, significant progress has been made toward the first-principles modeling of space weather events, and three-dimensional (3-D) global magnetohydrodynamics (MHD) models have been at the forefront of this transition, thereby playing a critical role in advancing our understanding of space weather. However, the modeling of extreme space weather events is still a major challenge even for modern global MHD models. In this study, we introduce a specially adapted University of Michigan 3-D global MHD model for simulating extreme space weather events with a Dst footprint comparable to the Carrington superstorm of September 1859, based on the estimate by Tsurutani et al. (2003). Results are presented for a simulation run with "very extreme" constructed/idealized solar wind boundary conditions driving the magnetosphere. In particular, we describe the reaction of the magnetosphere-ionosphere system and the associated induced geoelectric field on the ground to such extreme driving conditions. The model setup is further tested using input data for an observed space weather event, the Halloween storm of October 2003, to verify the MHD model consistency and to draw additional guidance for future work. This extreme space weather MHD model setup is designed specifically for practical application to the modeling of extreme geomagnetically induced electric fields, which can drive large currents in ground-based conductor systems such as power transmission grids. Therefore, our ultimate goal is to explore the level of geoelectric fields that can be induced from an assumed storm of the reported magnitude, i.e., Dst ≈ -1600 nT.

  11. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    Science.gov (United States)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decisions in the face of uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method, which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
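    A minimal Monte Carlo lifecycle sketch in the spirit of the record, with aleatory uncertainty as random launch/component failures and epistemic uncertainty as an unknown price drift sampled from an interval; all costs, probabilities, and the servicing decision rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

# Monte Carlo lifecycle sketch for an on-orbit-servicing decision. Aleatory
# uncertainty: random failures. Epistemic uncertainty: unknown market-price
# drift, sampled here from an interval. All numbers are illustrative.
n_runs, years = 10_000, 15
launch_success_p, annual_fail_p = 0.95, 0.03
service_cost, revenue0 = 150.0, 40.0        # M$ per service / per year

npv = np.empty(n_runs)
for i in range(n_runs):
    price_trend = rng.uniform(-0.02, 0.04)  # epistemic: unknown market drift
    cash = -300.0                            # initial build + launch cost
    if rng.uniform() > launch_success_p:     # aleatory: launch failure
        npv[i] = cash
        continue
    for year in range(years):
        if rng.uniform() < annual_fail_p:    # aleatory: component failure
            # decision rule: service (extend life) only while much value remains
            if year < years - 5:
                cash -= service_cost / (1.06 ** year)
            else:
                break                        # retire instead of servicing
        cash += revenue0 * (1 + price_trend) ** year / (1.06 ** year)
    npv[i] = cash

print(f"mean NPV = {npv.mean():.1f} M$, P(loss) = {(npv < 0).mean():.3f}")
```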

  12. Managing Model Data Introduced Uncertainties in Simulator Predictions for Generation IV Systems via Optimum Experimental Design

    Energy Technology Data Exchange (ETDEWEB)

    Turinsky, Paul J [North Carolina State Univ., Raleigh, NC (United States); Abdel-Khalik, Hany S [North Carolina State Univ., Raleigh, NC (United States); Stover, Tracy E [North Carolina State Univ., Raleigh, NC (United States)

    2011-03-01

    to the design concept is quantitatively determined. A technique is then established to assimilate this data and produce posteriori uncertainties on key attributes and responses of the design concept. Several experiment perturbations based on engineering judgment are used to demonstrate these methods and also serve as an initial generation of the optimization problem. Finally, an optimization technique is developed which will simultaneously arrive at an optimized experiment to produce an optimized reactor design. Solution of this problem is made possible by the use of the simulated annealing algorithm for solution of optimization problems. The optimization examined in this work is based on maximizing the reactor cost savings associated with the modified design made possible by using the design margin gained through reduced basic nuclear data uncertainties. Cost values for experiment design specifications and reactor design specifications are established and used to compute a total savings by comparing the posteriori reactor cost to the a priori cost plus the cost of the experiment. The optimized solution arrives at a maximized cost savings.

  13. Using statistical model to simulate the impact of climate change on maize yield with climate and crop uncertainties

    Science.gov (United States)

    Zhang, Yi; Zhao, Yanxia; Wang, Chunyi; Chen, Sining

    2017-11-01

    Assessing the impact of climate change on crop production while considering uncertainties is essential for properly identifying sustainable agricultural practices and making decisions about them. In this study, we employed 24 climate projections consisting of the combinations of eight GCMs and three emission scenarios, representing the climate projection uncertainty, and two crop statistical models with 100 sets of parameters in each model, representing parameter uncertainty within the crop models. The goal of this study was to evaluate the impact of climate change on maize (Zea mays L.) yield at three locations (Benxi, Changling, and Hailun) across Northeast China (NEC) in the periods 2010-2039 and 2040-2069, taking 1976-2005 as the baseline period. The multi-model ensemble method is an effective way to deal with the uncertainties. The results of the ensemble simulations showed that maize yield reductions were less than 5% in both future periods relative to the baseline. To further understand the contributions of individual sources of uncertainty, such as climate projections and crop model parameters, to the ensemble yield simulations, variance decomposition was performed. The results indicated that the uncertainty from climate projections was much larger than that contributed by crop model parameters. Increased ensemble yield variance revealed the increasing uncertainty in the yield simulation in the future periods.

  14. A Study on Data Base for the Pyroprocessing Material Flow and MUF Uncertainty Simulation

    International Nuclear Information System (INIS)

    Sitompul, Yos Panagaman; Shin, Heesung; Han, Boyoung; Kim, Hodong

    2011-01-01

    The database for the pyroprocessing material flow and MUF uncertainty simulation has been implemented well. There is no error in the database processing, and it is relatively fast thanks to the use of OLEDB and MySQL. The important issue is the database size. In OLEDB the database size is limited to 2 GB. To reduce the database size, we give users an option to filter the input nuclides based on their masses and activities. A simulation program called PYMUS has been developed to study the pyroprocessing material flow and MUF. In the program, there is a database system that controls the data processing in the simulation. The database system consists of an input database, data processing, and an output database. The database system has been designed to be efficient; one example is the use of OLEDB and MySQL. The database system is explained in detail in this paper. The result shows that the database system works well in the simulation.

  15. Modeling Requirements for Simulating the Effects of Extreme Acts of Terrorism: A White Paper

    Energy Technology Data Exchange (ETDEWEB)

    Allen, M.; Hiebert-Dodd, K.; Marozas, D.; Paananen, O.; Pryor, R.J.; Reinert, R.K.

    1998-10-01

    This white paper presents the initial requirements for developing a new computer model for simulating the effects of extreme acts of terrorism in the United States. General characteristics of the model are proposed, and the level of effort to prepare a complete written description of the model, prior to coding, is detailed. The model would simulate the decision processes and interactions of complex U.S. systems engaged in responding to and recovering from four types of terrorist incidents. The incident scenarios span the space of extreme acts of terrorism that have the potential to affect not only the impacted area but also the entire nation. The model would be useful to decision-makers in assessing and analyzing the vulnerability of the nation's complex infrastructures, in prioritizing resources to reduce risk, and in planning strategies for immediate response and for subsequent recovery from terrorist incidents.

  16. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction by CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, and the ‘Uncertainty and Sensitivity’ platform URANIE was used. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by making use of a sensitivity analysis.

  17. Sensitivity analysis and uncertainties simulation of the migration of radionuclide in the system of geological disposal-CRP-GEORC model

    International Nuclear Information System (INIS)

    Su Rui; Wang Ju; Chen Weiming; Zong Zihua; Zhao Honggang

    2008-01-01

    The CRP-GEORC concept model is an artificial geological disposal system for high-level radioactive waste. Sensitivity analysis and uncertainty simulation of the migration of the radionuclides Se-79 and I-129 in the far field of this system have been conducted using the GoldSim code. It can be seen from the simulation results that the variables used to describe the geological features and the characteristics of groundwater flow are the sensitive variables of the whole geological disposal system. The uncertainties of these parameters have a remarkable influence on the simulation results. (authors)

  18. Simulation of the catastrophic floods caused by extreme rainfall events - Uh River basin case study

    OpenAIRE

    Pekárová, Pavla; Halmová, Dana; Mitková, Veronika

    2005-01-01

    The extreme rainfall events in Central and East Europe in August 2002 raise the question of how other basins would respond to such rainfall situations. Such an exercise helps us to arrange in advance the necessary activities in the basin to reduce the consequences of the assumed disaster. The aim of the study is to examine the reaction of the Uh River basin (Slovakia, Ukraine) to the simulated catastrophic rainfall events of August 2002. Two precipitation scenarios, sc1 and sc2, were created. Th...

  19. How model and input uncertainty impact maize yield simulations in West Africa

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter; Wang, Enli

    2015-02-01

    Crop models are common tools for simulating crop yields and crop production in studies on food security and global change. Various uncertainties, however, exist not only in the model design and model parameters, but also, and maybe even more importantly, in the soil, climate and management input data. We analyze the performance of the point-scale crop model APSIM and the global-scale crop model LPJmL with different climate and soil conditions under different agricultural management in the low-input maize-growing areas of Burkina Faso, West Africa. We test the models' response to different levels of input information, from little to detailed information on soil, climate (1961-2000) and agricultural management, and compare the models' ability to represent the observed spatial (between locations) and temporal (between years) variability in crop yields. We found that the resolution of the soil, climate and management information influences the simulated crop yields in both models. However, the difference between the models is larger than that between the input data sets, and larger between simulations with different climate and management information than between simulations with different soil information. The observed spatial variability can be represented well by both models, even with little information on soils and management, but APSIM simulates a higher variation between single locations than LPJmL. The agreement of simulated and observed temporal variability is lower due to non-climatic factors, e.g., investment in agricultural research and development between 1987 and 1991 in Burkina Faso, which resulted in a doubling of maize yields. The findings of our study highlight the importance of scale and model choice and show that the most detailed input data do not necessarily improve model performance.

  20. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2 O emissions.

    Science.gov (United States)

    Ehrhardt, Fiona; Soussana, Jean-François; Bellocchi, Gianni; Grace, Peter; McAuliffe, Russel; Recous, Sylvie; Sándor, Renáta; Smith, Pete; Snow, Val; de Antoni Migliorati, Massimiliano; Basso, Bruno; Bhatia, Arti; Brilli, Lorenzo; Doltra, Jordi; Dorich, Christopher D; Doro, Luca; Fitton, Nuala; Giacomini, Sandro J; Grant, Brian; Harrison, Matthew T; Jones, Stephanie K; Kirschbaum, Miko U F; Klumpp, Katja; Laville, Patricia; Léonard, Joël; Liebig, Mark; Lieffering, Mark; Martin, Raphaël; Massad, Raia S; Meier, Elizabeth; Merbold, Lutz; Moore, Andrew D; Myrgiotis, Vasileios; Newton, Paul; Pattey, Elizabeth; Rolinski, Susanne; Sharp, Joanna; Smith, Ward N; Wu, Lianhai; Zhang, Qing

    2018-02-01

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multi-species agricultural contexts. We report an international model comparison and benchmarking exercise, showing the potential of multi-model ensembles to predict productivity and nitrous oxide (N2O) emissions for wheat, maize, rice and temperate grasslands. Using a multi-stage modelling protocol, from blind simulations (stage 1) to partial (stages 2-4) and full calibration (stage 5), 24 process-based biogeochemical models were assessed individually or as an ensemble against long-term experimental data from four temperate grassland and five arable crop rotation sites spanning four continents. Comparisons were performed by reference to the experimental uncertainties of observed yields and N2O emissions. Results showed that across sites and crop/grassland types, 23%-40% of the uncalibrated individual models were within two standard deviations (SD) of observed yields, while 42% (rice) to 96% (grasslands) of the models were within 1 SD of observed N2O emissions. At stage 1, ensembles formed by the three lowest prediction model errors predicted both yields and N2O emissions within experimental uncertainties for 44% and 33% of the crop and grassland growth cycles, respectively. Partial model calibration (stages 2-4) markedly reduced prediction errors of the full model ensemble E-median for crop grain yields (from 36% at stage 1 down to 4% on average) and grassland productivity (from 44% to 27%) and to a lesser and more variable extent for N2O emissions. Yield-scaled N2O emissions (N2O emissions divided by crop yields) were ranked accurately by three-model ensembles across crop species and field sites. The potential of using process-based model ensembles to predict jointly

  1. Uncertainty quantification tools for multiphase gas-solid flow simulations using MFIX

    Energy Technology Data Exchange (ETDEWEB)

    Fox, Rodney O. [Iowa State Univ., Ames, IA (United States); Passalacqua, Alberto [Iowa State Univ., Ames, IA (United States)

    2016-02-01

    Computational fluid dynamics (CFD) has been widely studied and used in the scientific community and in industry. Various models have been proposed to solve problems in different areas. However, all models deviate from reality to some degree. The uncertainty quantification (UQ) process evaluates the overall uncertainties associated with the prediction of quantities of interest. In particular, it studies the propagation of input uncertainties to the outputs of the models, so that confidence intervals can be provided for the simulation results. In the present work, a non-intrusive quadrature-based uncertainty quantification (QBUQ) approach is proposed. The probability distribution function (PDF) of the system response can then be reconstructed using the extended quadrature method of moments (EQMOM) and the extended conditional quadrature method of moments (ECQMOM). The report first explains the theory of the QBUQ approach, including methods to generate samples for problems with single or multiple uncertain input parameters, low-order statistics, and the required number of samples. Methods for univariate PDF reconstruction (EQMOM) and multivariate PDF reconstruction (ECQMOM) are then explained. The implementation of the QBUQ approach in the open-source CFD code MFIX is discussed next. Finally, the QBUQ approach is demonstrated in several applications. The method is first applied to two examples: a developing flow in a channel with uncertain viscosity, and an oblique shock problem with uncertain upstream Mach number. The error in the prediction of the moment response is studied as a function of the number of samples, and the accuracy of the moments required to reconstruct the PDF of the system response is discussed. The QBUQ approach is then demonstrated by considering a bubbling fluidized bed as an example application. The mean particle size is assumed to be the uncertain input parameter. The system is simulated with a standard two-fluid model with kinetic theory closures for the particulate phase implemented into
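
    The core of any non-intrusive quadrature-based approach is that the model is run only at carefully chosen input values, and the response moments are recovered as weighted sums. A minimal sketch, assuming a Gaussian uncertain viscosity and a toy black-box response (not the MFIX QBUQ implementation):

```python
import numpy as np

def model(viscosity):
    # Hypothetical quantity of interest that depends nonlinearly on viscosity.
    return 1.0 / np.sqrt(0.1 + viscosity)

mu, sigma = 1.0e-3, 2.0e-4    # assumed mean and SD of the uncertain input

# Gauss-Hermite (probabilists') nodes and weights for a standard normal germ.
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to the N(0,1) density

samples = model(mu + sigma * nodes)        # one deterministic run per node
mean = np.sum(weights * samples)
var = np.sum(weights * samples ** 2) - mean ** 2
print(f"response mean = {mean:.4f}, SD = {np.sqrt(var):.6f}")
```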

  2. Uncertainty estimation and ensemble forecast with a chemistry-transport model - Application to air-quality modeling and simulation

    International Nuclear Information System (INIS)

    Mallet, Vivien

    2005-01-01

    The thesis deals with the evaluation of a chemistry-transport model, not primarily through classical comparisons to observations, but through the estimation of its a priori uncertainties due to input data, model formulation and numerical approximations. These three uncertainty sources are studied respectively on the basis of Monte Carlo simulations, multi-model simulations and inter-comparisons of numerical schemes. A high uncertainty is found in output ozone concentrations. In order to overcome the limitations due to this uncertainty, one solution is ensemble forecasting. Through combinations of several models (up to forty-eight models) on the basis of past observations, the forecast can be significantly improved. This work has also led to the development of the innovative modelling system Polyphemus. (author)
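
    A common way to combine an ensemble "on the basis of past observations" is to fit model weights by least squares over a training window and apply them to the forecast window. A minimal sketch with synthetic data (the actual combination machinery in Polyphemus is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed data: several biased, noisy model forecasts of a common truth.
n_models, n_past, n_future = 5, 200, 50
truth = np.sin(np.linspace(0, 12, n_past + n_future))
forecasts = truth + rng.normal(0, 0.3, (n_models, n_past + n_future)) \
            + rng.normal(0, 0.2, (n_models, 1))   # per-model bias

# Least-squares weights minimizing past forecast error.
A = forecasts[:, :n_past].T
w, *_ = np.linalg.lstsq(A, truth[:n_past], rcond=None)

combined = w @ forecasts[:, n_past:]
rmse_single = np.sqrt(((forecasts[:, n_past:] - truth[n_past:]) ** 2).mean(axis=1))
rmse_combined = np.sqrt(((combined - truth[n_past:]) ** 2).mean())
print("best single-model RMSE:", rmse_single.min())
print("weighted-ensemble RMSE:", rmse_combined)
```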

  3. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    Science.gov (United States)

    Wells, J. R.; Kim, J. B.

    2011-12-01

    Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from published literature, and often a parameter value is estimated from a single published value. Further, the parameters are "tuned" using somewhat arbitrary, "trial-and-error" methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from the published literature and, where those were not available, using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
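
    For readers unfamiliar with the search method, a bare-bones simulated annealing loop over a bounded parameter space looks like the sketch below; the cost surface is a toy stand-in for scoring a simulated vegetation map against an observed one.

```python
import numpy as np

rng = np.random.default_rng(1)

def cost(p):
    # Toy "model error" surface with local minima (assumed, for illustration).
    x, y = p
    return (x - 0.3) ** 2 + (y + 0.5) ** 2 + 0.3 * np.sin(8 * x) ** 2

bounds = np.array([[-1.0, 1.0], [-1.0, 1.0]])   # literature min/max per parameter
p = rng.uniform(bounds[:, 0], bounds[:, 1])      # random start in the space
c = cost(p)
T = 1.0
for step in range(5000):
    q = np.clip(p + rng.normal(0, 0.1, 2), bounds[:, 0], bounds[:, 1])
    cq = cost(q)
    # Accept downhill moves always; uphill moves with Boltzmann probability.
    if cq < c or rng.random() < np.exp((c - cq) / T):
        p, c = q, cq
    T *= 0.999    # geometric cooling schedule
print("best parameters:", p, "cost:", c)
```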

  4. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Full Text Available Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE) method was used. Simulation work on water and heat processes in frozen soil in northern China during the 2012/2013 winter was conducted. Ensemble simulations generated through Monte Carlo sampling were used for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance criteria with respect to simulated soil water and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm). Posterior distributions for parameters related to soil hydraulics, radiation processes, and heat transport indicated that uncertainties in both the input and the model structure could influence model performance in modeling water and heat processes in seasonally frozen soils. Seasonal courses in water and energy partitioning were obvious during the winter. Within the daily cycle, soil evaporation/condensation and energy distributions were well captured and identified as an important phenomenon in the dynamics of the energy balance system. The combination of CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing–thawing.
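
    The GLUE workflow itself is compact: sample parameters by Monte Carlo, score each run against observations with a likelihood measure, and keep the "behavioral" runs above a threshold. A toy sketch with a synthetic cooling model standing in for CoupModel (all values assumed):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(k, t):
    # Toy stand-in for the process model: exponential cooling of soil temperature.
    return 10.0 * np.exp(-k * t)

t = np.linspace(0, 10, 50)
obs = simulate(0.35, t) + rng.normal(0, 0.4, t.size)   # synthetic "observations"

k_samples = rng.uniform(0.0, 1.0, 5000)                # Monte Carlo sampling
nse = np.array([1 - np.sum((simulate(k, t) - obs) ** 2)
                  / np.sum((obs - obs.mean()) ** 2) for k in k_samples])

behavioral = k_samples[nse > 0.5]   # behavioral threshold (assumed)
print(f"{behavioral.size} behavioral sets;"
      f" k in [{behavioral.min():.3f}, {behavioral.max():.3f}]")
```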

  5. Phenomena-based Uncertainty Quantification in Predictive Coupled- Physics Reactor Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Marvin [Texas A & M Univ., College Station, TX (United States)

    2017-06-12

    This project has sought to develop methodologies, tailored to phenomena that govern nuclear reactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.

  6. Phenomena-based Uncertainty Quantification in Predictive Coupled- Physics Reactor Simulations

    International Nuclear Information System (INIS)

    Adams, Marvin

    2017-01-01

    This project has sought to develop methodologies, tailored to phenomena that govern nuclear reactor behavior, to produce predictions (including uncertainties) for quantities of interest (QOIs) in the simulation of steady-state and transient reactor behavior. Examples of such predictions include, for each QOI, an expected value as well as a distribution around this value and an assessment of how much of the distribution stems from each major source of uncertainty. The project has sought to test its methodologies by comparing against measured experimental outcomes. The main experimental platform has been a 1-MW TRIGA reactor. This is a flexible platform for a wide range of experiments, including steady state with and without temperature feedback, slow transients with and without feedback, and rapid transients with strong feedback. The original plan was for the primary experimental data to come from in-core neutron detectors. We made considerable progress toward this goal but did not get as far along as we had planned. We have designed, developed, installed, and tested vertical guide tubes, each able to accept a detector or stack of detectors that can be moved axially inside the tube, and we have tested several new detector designs. One of these shows considerable promise.

  7. Non-intrusive uncertainty quantification of computational fluid dynamics simulations: notes on the accuracy and efficiency

    Science.gov (United States)

    Zimoń, Małgorzata; Sawko, Robert; Emerson, David; Thompson, Christopher

    2017-11-01

    Uncertainty quantification (UQ) is increasingly becoming an indispensable tool for assessing the reliability of computational modelling. Efficient handling of stochastic inputs, such as boundary conditions, physical properties or geometry, increases the utility of model results significantly. We discuss the application of non-intrusive generalised polynomial chaos techniques in the context of fluid engineering simulations. Deterministic and Monte Carlo integration rules are applied to a set of problems, including ordinary differential equations and the computation of aerodynamic parameters subject to random perturbations. In particular, we analyse acoustic wave propagation in a heterogeneous medium to study the effects of mesh resolution, transients, number and variability of stochastic inputs. We consider variants of multi-level Monte Carlo and perform a novel comparison of the methods with respect to numerical and parametric errors, as well as computational cost. The results provide a comprehensive view of the necessary steps in UQ analysis and demonstrate some key features of stochastic fluid flow systems.
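
    As a flavor of the non-intrusive generalised polynomial chaos approach discussed above, the sketch below fits a Hermite expansion of a toy response by regression on random samples, then reads the mean and variance directly off the coefficients. This is illustrative only, under an assumed black-box model; the paper's test cases are far richer.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    return np.exp(0.3 * x) + 0.1 * x ** 2   # assumed black-box response

order, n_samples = 4, 200
xi = rng.standard_normal(n_samples)                    # standard normal germ
Psi = np.polynomial.hermite_e.hermevander(xi, order)   # He_0 .. He_order
coef, *_ = np.linalg.lstsq(Psi, model(xi), rcond=None)

# Orthogonality of probabilists' Hermite polynomials: E[He_j He_k] = k! if
# j == k, else 0, so the mean is the 0th coefficient and the variance sums
# the squared higher-order coefficients times k!.
mean = coef[0]
var = sum(coef[k] ** 2 * math.factorial(k) for k in range(1, order + 1))
print(f"PCE mean = {mean:.4f}, SD = {var ** 0.5:.4f}")
```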

  8. Comparison of the uncertainties calculated for the results of radiochemical determinations using the law of propagation of uncertainty and a Monte Carlo simulation

    International Nuclear Information System (INIS)

    Berne, A.

    2001-01-01

    Quantitative determinations of many radioactive analytes in environmental samples are based on a process in which several independent measurements of different properties are taken. The final results calculated from these data have to be evaluated for accuracy and precision. The estimate of the standard deviation, s, also called the combined standard uncertainty (CSU), associated with the result of such a combined measurement can be used to evaluate the precision of the result. The CSU can be calculated by applying the law of propagation of uncertainty, which is based on the Taylor series expansion of the equation used to calculate the analytical result. The estimate of s can also be obtained from a Monte Carlo simulation. The data used in this simulation include the values resulting from the individual measurements, the estimate of the variance of each value, including the type of distribution, and the equation used to calculate the analytical result. A comparison is made between these two methods of estimating the uncertainty of the calculated result. (author)
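
    The two approaches being compared are easy to line up side by side. A minimal sketch for a typical radiochemical form, activity = counts / (efficiency × mass), with assumed illustrative values:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed inputs: net counts, detection efficiency, sample mass, each with SD.
C, sC = 1200.0, 40.0
e, se = 0.32, 0.02
m, sm = 0.050, 0.001

A = C / (e * m)
# First-order (Taylor series) propagation for a pure product/quotient:
rel2 = (sC / C) ** 2 + (se / e) ** 2 + (sm / m) ** 2
s_taylor = A * np.sqrt(rel2)

# Monte Carlo: sample the inputs, recompute the result each time.
n = 100_000
Amc = rng.normal(C, sC, n) / (rng.normal(e, se, n) * rng.normal(m, sm, n))
print(f"Taylor CSU: {s_taylor:.1f}   Monte Carlo SD: {Amc.std(ddof=1):.1f}")
```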

  9. Effects of input data information content on the uncertainty of simulating water resources

    Science.gov (United States)

    Camargos, Carla; Julich, Stefan; Bach, Martin; Breuer, Lutz

    2017-04-01

    Hydrological models like the Soil and Water Assessment Tool (SWAT) demand a large variety of spatial input data. These are commonly available in different resolutions and result from different preprocessing methodologies. Effort is made to apply the most specific data possible for the study area, which features heterogeneous landscape elements. Most often, modelers prefer regional data, especially at fine resolution, which are not always available. Instead, more general global datasets are considered. This study investigates how the use of global and regional input datasets may affect the simulation performance and uncertainty of the model. We analyzed eight different setups for the SWAT model, combining two each of Digital Elevation Models (DEMs), soil maps and land use maps of diverse spatial resolution and information content. The models were calibrated to discharge at two stations across the mesoscale Haute-Sûre catchment, which is located partly in the north of Luxembourg and partly in the southeast of Belgium. The region is a rural area of about 743 km2, mainly covered by forests, complex agricultural systems and arable land. As part of the catchment, the Upper-Sûre Lake is an important source of drinking water for the Luxembourgish population, satisfying 30% of the country's demand. The Metropolis Markov Chain Monte Carlo algorithm implemented in the SPOTPY Python package was used to infer posterior parameter distributions and assess parameter uncertainty. We optimized the mean of the Nash-Sutcliffe Efficiency (NSE) and the logarithm of the NSE. We focused on soil physical, groundwater, main channel, land cover management and basin physical process parameters. Preliminary results indicate that the model performs best when using the regional DEM and land use map and the global soil map, indicating that SWAT cannot necessarily make use of additional soil information if it is not substantially affecting soil hydrological fluxes

  10. On the predictivity of pore-scale simulations: estimating uncertainties with multilevel Monte Carlo

    KAUST Repository

    Icardi, Matteo

    2016-02-08

    A fast method with tunable accuracy is proposed to estimate errors and uncertainties in pore-scale and Digital Rock Physics (DRP) problems. The overall predictivity of these studies can, in fact, be hindered by many factors, including sample heterogeneity, computational and imaging limitations, model inadequacy and imperfectly known physical parameters. The typical objective of pore-scale studies is the estimation of macroscopic effective parameters such as permeability, effective diffusivity and hydrodynamic dispersion. However, these are often non-deterministic quantities (i.e., results obtained for a specific pore-scale sample and setup are not totally reproducible by another "equivalent" sample and setup). This stochastic nature can arise from multi-scale heterogeneity, the computational and experimental limitations in considering large samples, and the complexity of the physical models. These approximations, in fact, introduce an error that, being dependent on a large number of complex factors, can be modeled as random. We propose a general simulation tool, based on multilevel Monte Carlo, that can drastically reduce the computational cost needed for computing accurate statistics of effective parameters and other quantities of interest under any of these random errors. This is, to our knowledge, the first attempt to include Uncertainty Quantification (UQ) in pore-scale physics and simulation. The method can also provide estimates of the discretization error, and it is tested on three-dimensional transport problems in heterogeneous materials, where the sampling procedure is done by generation algorithms able to reproduce realistic consolidated and unconsolidated random sphere and ellipsoid packings and arrangements. A totally automatic workflow is developed in an open-source code [2015. https://bitbucket.org/micardi/porescalemc.], that includes rigid body physics and random packing algorithms, unstructured mesh discretization, finite volume solvers
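
    The idea behind multilevel Monte Carlo is the telescoping identity E[Q_L] = E[Q_0] + sum over l of E[Q_l - Q_(l-1)]: level differences have small variance, so most samples go to the cheap coarse level. A toy sketch under an assumed model (not the pore-scale solver of the abstract):

```python
import numpy as np

rng = np.random.default_rng(5)

def Q(level, u):
    # Level-l approximation of the quantity of interest on grid size h = 2^-l;
    # the h * cos(...) term mimics a discretization error that vanishes as h -> 0.
    h = 2.0 ** -level
    return np.sin(u) + h * np.cos(3 * u)

levels = 4
n_samples = [4000, 1000, 250, 60]   # fewer samples on the expensive fine levels
est = 0.0
for l in range(levels):
    u = rng.uniform(0, np.pi, n_samples[l])   # same draw feeds both levels
    diff = Q(l, u) - (Q(l - 1, u) if l > 0 else 0.0)
    est += diff.mean()                         # telescoping sum of corrections
print("MLMC estimate of E[Q]:", est)
```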

  11. Weather model performance on extreme rainfall events simulation's over Western Iberian Peninsula

    Science.gov (United States)

    Pereira, S. C.; Carvalho, A. C.; Ferreira, J.; Nunes, J. P.; Kaiser, J. J.; Rocha, A.

    2012-08-01

    This study evaluates the performance of the WRF-ARW numerical weather model in simulating the spatial and temporal patterns of an extreme rainfall period over a complex orographic region in north-central Portugal. The analysis was performed for December 2009, during the mainland Portugal rainy season. The periods of heavy to extremely heavy rainfall were due to several low surface-pressure systems associated with frontal surfaces. The total amount of precipitation for December exceeded the 1971-2000 climatological mean by 89 mm on average, varying from 190 mm (south of the country) to 1175 mm (north of the country). Three model runs were conducted to assess possible improvements in model performance: (1) the WRF-ARW is forced with the initial fields from a global domain model (RunRef); (2) data assimilation for a specific location is included (RunObsN); (3) nudging is used to adjust the analysis field (RunGridN). Model performance was evaluated against an observed hourly precipitation dataset of 15 rainfall stations using several statistical parameters. The WRF-ARW model reproduced the temporal rainfall patterns well but tended to overestimate precipitation amounts. The RunGridN simulation provided the best results, but model performance of the other two runs was good too, so that the selected extreme rainfall episode was successfully reproduced.

  12. Characteristics of sub-daily precipitation extremes in observed data and regional climate model simulations

    Science.gov (United States)

    Beranová, Romana; Kyselý, Jan; Hanel, Martin

    2018-04-01

    The study compares characteristics of observed sub-daily precipitation extremes in the Czech Republic with those simulated by the Hadley Centre Regional Model version 3 (HadRM3) and the Rossby Centre Regional Atmospheric Model version 4 (RCA4) regional climate models (RCMs) driven by reanalyses, and examines diurnal cycles of hourly precipitation and their dependence on intensity and surface temperature. The observed warm-season (May-September) maxima of short-duration (1, 2 and 3 h) amounts show one diurnal peak in the afternoon, which is simulated reasonably well by RCA4, although the peak occurs too early in the model. HadRM3 provides an unrealistic diurnal cycle with a nighttime peak and an afternoon minimum coinciding with the observed maximum for all three ensemble members, which suggests that convection is not captured realistically. Distorted relationships of the diurnal cycles of hourly precipitation to daily maximum temperature in HadRM3 further evidence that the underlying physical mechanisms are misrepresented in this RCM. Goodness-of-fit tests indicate that the generalised extreme value distribution is an applicable model for both observed and RCM-simulated precipitation maxima. However, the RCMs are not able to realistically capture the range of the shape parameter estimates of distributions of short-duration precipitation maxima, leading to either too many (nearly all for HadRM3) or too few (RCA4) grid boxes in which the shape parameter corresponds to a heavy tail. This means that the distributions of maxima of sub-daily amounts are distorted in the RCM-simulated data and do not match reality well. Therefore, projected changes of sub-daily precipitation extremes in climate change scenarios based on RCMs that do not resolve convection need to be interpreted with caution.
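
    Fitting a generalised extreme value (GEV) distribution to a series of seasonal maxima and inspecting the shape parameter can be done in a few lines. A sketch with synthetic maxima; note that scipy parameterizes the shape as c = -xi, so c < 0 corresponds to a heavy (Frechet-type) tail:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Synthetic warm-season hourly precipitation maxima (mm/h) for 40 seasons,
# drawn from a heavy-tailed GEV (xi = 0.15).
maxima = stats.genextreme.rvs(c=-0.15, loc=12.0, scale=4.0,
                              size=40, random_state=rng)

c, loc, scale = stats.genextreme.fit(maxima)
print(f"shape xi = {-c:.3f}  (xi > 0 means heavy tail)")
print(f"location = {loc:.2f} mm/h, scale = {scale:.2f} mm/h")

# 20-season return level from the fitted distribution.
print("20-season return level:",
      stats.genextreme.ppf(1 - 1 / 20, c, loc, scale))
```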

  13. The use of Monte-Carlo simulation and order statistics for uncertainty analysis of a LBLOCA transient (LOFT-L2-5)

    International Nuclear Information System (INIS)

    Chojnacki, E.; Benoit, J.P.

    2007-01-01

    Best-estimate computer codes are increasingly used in the nuclear industry for accident management procedures and are planned to be used for licensing procedures. Contrary to conservative codes, which are supposed to give penalizing results, best-estimate codes attempt to calculate accidental transients in a realistic way. It therefore becomes of prime importance, in particular for technical organizations such as IRSN, in charge of safety assessment, to know the uncertainty on the results of such codes. Thus, CSNI sponsored a few years ago (published in 1998) the Uncertainty Methods Study (UMS) program on uncertainty methodologies used for a SBLOCA transient (LSTF-CL-18) and is now supporting the BEMUSE program for a LBLOCA transient (LOFT-L2-5). The large majority of BEMUSE participants (9 out of 10) use uncertainty methodologies based on probabilistic modelling, and all of them use Monte Carlo simulations to propagate the uncertainties through their computer codes. Also, all of the 'probabilistic participants' intend to use order statistics to determine the sampling size of the Monte Carlo simulation and to derive the uncertainty ranges associated with their computer calculations. The first aim of this paper is to recall the advantages and also the assumptions of probabilistic modelling and more specifically of order statistics (such as Wilks' formula) in uncertainty methodologies. Indeed, Monte Carlo methods provide flexible and extremely powerful techniques for solving many of the uncertainty propagation problems encountered in nuclear safety analysis. However, it is important to keep in mind that probabilistic methods are data intensive. That means probabilistic methods cannot produce robust results unless a considerable body of information has been collected. A main advantage of order statistics is that they allow an unlimited number of uncertain parameters to be taken into account and, from a restricted number of code calculations, to provide statistical
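
    Wilks' formula is short enough to compute directly. For a one-sided bound, the smallest N with 1 - p^N >= beta gives the required number of code runs; the sketch below reproduces the classical 59-run result for 95%/95% coverage (standard textbook form, shown here for illustration):

```python
# One-sided, first order: the largest of N runs bounds the p-quantile with
# confidence beta once 1 - p**N >= beta.
def wilks_one_sided(p=0.95, beta=0.95):
    N = 1
    while 1 - p ** N < beta:
        N += 1
    return N

# Second order (use the second-largest value as the bound):
# 1 - p**N - N * (1 - p) * p**(N - 1) >= beta.
def wilks_second_order(p=0.95, beta=0.95):
    N = 2
    while 1 - p ** N - N * (1 - p) * p ** (N - 1) < beta:
        N += 1
    return N

print(wilks_one_sided())      # 59 runs for the classical 95%/95% statement
print(wilks_second_order())   # 93 runs when the second-largest value is used
```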

  14. Assessing Uncertainty in Deep Learning Techniques that Identify Atmospheric Rivers in Climate Simulations

    Science.gov (United States)

    Mahesh, A.; Mudigonda, M.; Kim, S. K.; Kashinath, K.; Kahou, S.; Michalski, V.; Williams, D. N.; Liu, Y.; Prabhat, M.; Loring, B.; O'Brien, T. A.; Collins, W. D.

    2017-12-01

    Atmospheric rivers (ARs) can be the difference between California facing drought or hurricane-level storms. ARs are a form of extreme weather defined as long, narrow columns of moisture which transport water vapor outside the tropics. When they make landfall, they release the vapor as rain or snow. Convolutional neural networks (CNNs), a machine learning technique that uses filters to recognize features, are the leading computer vision mechanism for classifying multichannel images. CNNs have been proven effective in identifying extreme weather events in climate simulation output (Liu et al. 2016, ABDA'16, http://bit.ly/2hlrFNV). Here, we compare three different CNN architectures, tuned with different hyperparameters and training schemes. We compare two-layer, three-layer, four-layer, and sixteen-layer CNNs' ability to recognize ARs in Community Atmospheric Model version 5 output, and we explore the ability of data augmentation and pre-trained models to increase the accuracy of the classifier. Because pre-training the model with regular images (i.e. benches, stoves, and dogs) yielded the highest accuracy rate, this strategy, also known as transfer learning, may be vital in future scientific CNNs, which likely will not have access to a large labelled training dataset. By choosing the most effective CNN architecture, climate scientists can build an accurate historical database of ARs, which can be used to develop a predictive understanding of these phenomena.

  15. Uncertainty analysis of a coupled ecosystem response model simulating greenhouse gas fluxes from a temperate grassland

    Science.gov (United States)

    Liebermann, Ralf; Kraft, Philipp; Houska, Tobias; Breuer, Lutz; Müller, Christoph; Kraus, David; Haas, Edwin; Klatt, Steffen

    2015-04-01

    Among anthropogenic greenhouse gas emissions, CO2 is the dominant driver of global climate change. Next to its direct impact on the radiation budget, it also affects the climate system by triggering feedback mechanisms in terrestrial ecosystems. Such mechanisms - like stimulated photosynthesis, increased root exudation and reduced stomatal transpiration - influence both the input and the turnover of carbon and nitrogen compounds in the soil. The stabilization and decomposition of these compounds determine how increasing CO2 concentrations change terrestrial trace gas emissions, especially CO2, N2O and CH4. To assess the potential reaction of terrestrial greenhouse gas emissions to rising tropospheric CO2 concentrations, we make use of a comprehensive ecosystem model integrating known processes and fluxes of the carbon-nitrogen cycle in soil, vegetation and water. We apply a state-of-the-art ecosystem model to measurements from a long-term field experiment of CO2 enrichment. The model - a grassland realization of LandscapeDNDC - simulates soil chemistry coupled with plant physiology, microclimate and hydrology. The data - comprising biomass, greenhouse gas emissions, management practices and soil properties - have been obtained from a FACE (Free Air Carbon dioxide Enrichment) experiment running since 1997 on a temperate grassland in Giessen, Germany. Management and soil data, together with weather records, are used to drive the model, while cut biomass as well as CO2 and N2O emissions are used for calibration and validation. Starting with control data from installations without CO2 enhancement, we begin with a GLUE (Generalized Likelihood Uncertainty Estimation) assessment using Latin hypercube sampling to reduce the range of the model parameters. This is followed by a detailed sensitivity analysis, the application of DREAM-ZS for model calibration, and an estimation of the effect of input uncertainty on the simulation results. Since first results indicate problems with

  16. Rainy Day: A Remote Sensing-Driven Extreme Rainfall Simulation Approach for Hazard Assessment

    Science.gov (United States)

    Wright, Daniel; Yatheendradas, Soni; Peters-Lidard, Christa; Kirschbaum, Dalia; Ayalew, Tibebu; Mantilla, Ricardo; Krajewski, Witold

    2015-04-01

    Progress on the assessment of rainfall-driven hazards such as floods and landslides has been hampered by the challenge of characterizing the frequency, intensity, and structure of extreme rainfall at the watershed or hillslope scale. Conventional approaches rely on simplifying assumptions and are strongly dependent on the location, the availability of long-term rain gage measurements, and the subjectivity of the analyst. Regional and global-scale rainfall remote sensing products provide an alternative, but are limited by relatively short (~15-year) observational records. To overcome this, we have coupled these remote sensing products with a space-time resampling framework known as stochastic storm transposition (SST). SST "lengthens" the rainfall record by resampling from a catalog of observed storms from a user-defined region, effectively recreating the regional extreme rainfall hydroclimate. This coupling has been codified in Rainy Day, a Python-based platform for quickly generating large numbers of probabilistic extreme rainfall "scenarios" at any point on the globe. Rainy Day is readily compatible with any gridded rainfall dataset. The user can optionally incorporate regional rain gage or weather radar measurements for bias correction using the Precipitation Uncertainties for Satellite Hydrology (PUSH) framework. Results from Rainy Day using the CMORPH satellite precipitation product are compared with local observations in two examples. The first example is peak discharge estimation in a medium-sized (~4000 square km) watershed in the central United States performed using CUENCAS, a parsimonious physically-based distributed hydrologic model. The second example is rainfall frequency analysis for Saint Lucia, a small volcanic island in the eastern Caribbean that is prone to landslides and flash floods. The distinct rainfall hydroclimates of the two example sites illustrate the flexibility of the approach and its usefulness for hazard analysis in data-poor regions.

  17. Dynamic Simulation, Sensitivity and Uncertainty Analysis of a Demonstration Scale Lignocellulosic Enzymatic Hydrolysis Process

    DEFF Research Database (Denmark)

    Prunescu, Remus Mihail; Sin, Gürkan

    2014-01-01

    This study presents the uncertainty and sensitivity analysis of a lignocellulosic enzymatic hydrolysis model considering both model and feed parameters as sources of uncertainty. The dynamic model is parametrized for accommodating various types of biomass, and different enzymatic complexes...

  18. Uncertainty assessment of PM2.5 contamination mapping using spatiotemporal sequential indicator simulations and multi-temporal monitoring data

    Science.gov (United States)

    Yang, Yong; Christakos, George; Huang, Wei; Lin, Chengda; Fu, Peihong; Mei, Yang

    2016-04-01

    Because of the rapid economic growth in China, many regions are subject to severe particulate matter pollution. Thus, improving the methods for determining the spatiotemporal distribution and uncertainty of air pollution can provide considerable benefits when developing risk assessments and environmental policies. The uncertainty assessment methods currently in use include sequential indicator simulation (SIS) and indicator kriging techniques. However, these methods cannot be employed to assess multi-temporal data. In this work, a spatiotemporal sequential indicator simulation (STSIS) based on a non-separable spatiotemporal semivariogram model was used to assimilate multi-temporal data in the mapping and uncertainty assessment of PM2.5 distributions in a contaminated atmosphere. PM2.5 concentrations recorded throughout 2014 in Shandong Province, China, were used as the experimental dataset. Based on the STSIS procedure, we assessed various types of mapping uncertainties, including single-location uncertainties over one day and multiple days, and multi-location uncertainties over one day and multiple days. A comparison of the STSIS technique with the SIS technique indicates that better performance was obtained with the STSIS method.

  19. Modeling Nonlinear Site Response Uncertainty in Broadband Ground Motion Simulations for the Los Angeles Basin

    Science.gov (United States)

    Assimaki, D.; Li, W.; Steidl, J. M.; Schmedes, J.

    2007-12-01

    The assessment of strong-motion site response is of great significance, both for mitigating seismic hazard and for performing detailed analyses of earthquake source characteristics. There currently exists, however, a large degree of uncertainty concerning the mathematical model to be employed for the computationally efficient evaluation of local site effects, and the site investigation program necessary to evaluate the nonlinear input model parameters and ensure cost-effective predictions; and while site response observations may provide critical constraints on interpretation methods, the lack of a statistically significant number of in-situ strong-motion records prohibits statistical analyses from being conducted and uncertainties from being quantified based entirely on field data. In this paper, we combine downhole observations and broadband ground motion synthetics for characteristic site conditions in the Los Angeles Basin, and investigate the variability in ground motion estimation introduced by the site response assessment methodology. In particular, site-specific regional velocity and attenuation structures are initially compiled using near-surface geotechnical data collected at downhole geotechnical arrays, low-strain velocity and attenuation profiles at these sites obtained by inversion of weak-motion records, and the crustal velocity structure at the corresponding locations obtained from the Southern California Earthquake Center Community Velocity Model. Subsequently, broadband ground motions are simulated by means of a hybrid low/high-frequency finite source model with correlated random parameters for rupture scenarios of weak, medium and large magnitude events (M = 3.5-7.5). Observed estimates of site response at the stations of interest are first compared to the ensemble of approximate and incremental nonlinear site response models. Parametric studies are next conducted for each fixed magnitude (fault geometry) scenario by varying the source-to-site distance and

  20. A Study on the uncertainty and sensitivity in numerical simulation of parametric roll

    DEFF Research Database (Denmark)

    Choi, Ju-hyuck; Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to numerical modelling of parametric roll have been investigated by using a 6-DOF model with nonlinear damping and roll restoring forces. First, the uncertainty in the damping coefficients and its effect on the roll response is evaluated. Secondly, the uncertainty due to the “effect...

  1. Estimation of the uncertainty of a climate model using an ensemble simulation

    Science.gov (United States)

    Barth, A.; Mathiot, P.; Goosse, H.

    2012-04-01

    The atmospheric forcings play an important role in the study of the ocean and sea-ice dynamics of the Southern Ocean. Errors in the atmospheric forcings will inevitably result in uncertain model results. The sensitivity of the model results to errors in the atmospheric forcings is studied with ensemble simulations using multivariate perturbations of the atmospheric forcing fields. The numerical ocean model used is NEMO-LIM in a global configuration with a horizontal resolution of 2°. NCEP reanalyses are used to provide air temperature and wind data to force the ocean model over the last 50 years. A climatological mean is used to prescribe relative humidity, cloud cover and precipitation. In a first step, the model results are compared with OSTIA SST and OSI SAF sea-ice concentration of the southern hemisphere. The seasonal behavior of the RMS difference and bias in SST and ice concentration is highlighted, as well as the regions with relatively high RMS errors and biases, such as the Antarctic Circumpolar Current and near the ice edge. Ensemble simulations are performed to statistically characterize the model error due to uncertainties in the atmospheric forcings. Such information is a crucial element for future data assimilation experiments. Ensemble simulations are performed with perturbed air temperature and wind forcings. A Fourier decomposition of the NCEP wind vectors and air temperature for 2007 is used to generate ensemble perturbations. The perturbations are scaled such that the resulting ensemble spread approximately matches the RMS differences between the satellite SST and sea-ice concentration and the model. The ensemble spread and covariance are analyzed for the minimum and maximum sea-ice extent. It is shown that the effect of errors in the atmospheric forcings can extend to several hundred meters in depth near the Antarctic Circumpolar Current.
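
    The Fourier-based perturbation strategy can be sketched compactly: decompose a forcing time series, randomize the phases of its Fourier modes, and rescale so the ensemble spread matches a target RMS error. The sketch below uses a synthetic daily forcing and an assumed target spread, not the NCEP fields:

```python
import numpy as np

rng = np.random.default_rng(7)

t = np.arange(365)
forcing = 10 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)

target_rms = 0.8      # e.g. the RMS model-satellite SST difference (assumed)
n_members = 20
spectrum = np.fft.rfft(forcing - forcing.mean())

members = []
for _ in range(n_members):
    # Keep the amplitude spectrum, randomize the phases of all modes.
    phases = np.exp(2j * np.pi * rng.random(spectrum.size))
    pert = np.fft.irfft(spectrum * phases, n=t.size)
    pert *= target_rms / pert.std()    # scale to the target spread
    members.append(forcing + pert)

ens = np.array(members)
print("ensemble spread (time-mean SD):", ens.std(axis=0).mean())
```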

  2. Cosmic rays Monte Carlo simulations for the Extreme Energy Events Project

    CERN Document Server

    Abbrescia, M; Aiola, S; Antolini, R; Avanzini, C; Baldini Ferroli, R; Bencivenni, G; Bossini, E; Bressan, E; Chiavassa, A; Cicalò, C; Cifarelli, L; Coccia, E; De Gruttola, D; De Pasquale, S; Di Giovanni, A; D'Incecco, M; Dreucci, M; Fabbri, F L; Frolov, V; Garbini, M; Gemme, G; Gnesi, I; Gustavino, C; Hatzifotiadou, D; La Rocca, P; Li, S; Librizzi, F; Maggiora, A; Massai, M; Miozzi, S; Panareo, M; Paoletti, R; Perasso, L; Pilo, F; Piragino, G; Regano, A; Riggi, F; Righini, G C; Sartorelli, G; Scapparone, E; Scribano, A; Selvi, M; Serci, S; Siddi, E; Spandre, G; Squarcia, S; Taiuti, M; Tosello, F; Votano, L; Williams, M C S; Yánez, G; Zichichi, A; Zuyeuski, R

    2014-01-01

    The Extreme Energy Events Project (EEE Project) is an innovative experiment to study very high energy cosmic rays by means of the detection of the associated air shower muon component. It consists of a network of tracking detectors installed inside Italian High Schools. Each tracking detector, called EEE telescope, is composed of three Multigap Resistive Plate Chambers (MRPCs). At present, 43 telescopes are installed and taking data, opening the way for the detection of far away coincidences over a total area of about 3 × 10⁵ km². In this paper we present the Monte Carlo simulations that have been performed to predict the expected coincidence rate between distant EEE telescopes.

  3. Improving Chemical EOR Simulations and Reducing the Subsurface Uncertainty Using Downscaling Conditioned to Tracer Data

    KAUST Repository

    Torrealba, Victor A.

    2017-10-02

    Recovery mechanisms are more likely to be influenced by grid-block size and reservoir heterogeneity in Chemical EOR (CEOR) than in conventional Water Flood (WF) simulations. Grid upscaling based on single-phase flow is a common practice in WF simulation models, where simulation grids are coarsened to perform history matching and sensitivity analyses within affordable computational times. This coarse grid resolution (typically about 100 ft) may be sufficient in WF; however, it usually fails to capture key physical mechanisms in CEOR. In addition to increased numerical dispersion in coarse models, these models tend to artificially increase the level of mixing between the fluids and may not have enough resolution to capture the different length scales of geological features to which EOR processes can be highly sensitive. As a result, coarse models usually overestimate the sweep efficiency and underestimate the displacement efficiency. Grid refinement (simple downscaling) can resolve artificial mixing, but appropriately re-creating the fine-scale heterogeneity, without degrading the history match conducted on the coarse scale, remains a challenge. Because of the difference in recovery mechanisms involved in CEOR, such as miscibility and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process is necessary for CEOR simulations when the original (fine) earth model is not available or when major disconnects occur between the original earth model and the history-matched coarse WF model. The proposed downscaling method is a process of refining the coarse grid and populating the relevant properties in the newly created finer grid cells. The method considers the values of rock properties in the coarse grid as hard data, and the corresponding variograms and property

  4. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Directory of Open Access Journals (Sweden)

    Jakob Jordan

    2018-02-01

    Full Text Available State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  5. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  6. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    Science.gov (United States)

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  7. Uncertainty and sensitivity analysis in the neutronic parameters generation for BWR and PWR coupled thermal-hydraulic–neutronic simulations

    International Nuclear Information System (INIS)

    Ánchel, F.; Barrachina, T.; Miró, R.; Verdú, G.; Juanas, J.; Macián-Juan, R.

    2012-01-01

    Highlights: ► Best-estimate codes are affected by uncertainty in their methods and models. ► Influence of the uncertainty in the macroscopic cross sections on BWR and PWR RIA accident analyses. ► The fast diffusion coefficient, the scattering cross section and both fission cross sections are the most influential factors. ► The absorption cross sections have very little influence. ► Using a normal pdf yields more "conservative" power peaks than quantifying the uncertainty with a uniform pdf. - Abstract: Best-estimate analysis consists of a coupled thermal-hydraulic and neutronic description of the nuclear system's behavior; uncertainties from both aspects should be included and jointly propagated. This paper presents a study of the influence of the uncertainty in the macroscopic neutronic information that describes a three-dimensional core model on the most relevant results of the simulation of a Reactivity Induced Accident (RIA). Analyses of a BWR RIA and a PWR RIA have been carried out with three-dimensional thermal-hydraulic and neutronic models for the coupled systems TRACE-PARCS and RELAP-PARCS. The cross-section information has been generated by the SIMTAB methodology, based on the joint use of CASMO-SIMULATE. The statistically based methodology performs Monte Carlo-type sampling of the uncertainty in the macroscopic cross sections. The size of the sample is determined by the characteristics of the tolerance intervals, by applying the Noether–Wilks formulas. A number of simulations equal to the sample size have been carried out, in which the cross sections used by PARCS are directly modified with uncertainty, and non-parametric statistical methods are applied to the resulting sample of output variable values to determine their tolerance intervals.

  8. Multi-catchment rainfall-runoff simulation for extreme flood estimation

    Science.gov (United States)

    Paquet, Emmanuel

    2017-04-01

    The SCHADEX method (Paquet et al., 2013) is a reference method in France for the estimation of extreme floods for dam design. The method is based on a semi-continuous rainfall-runoff simulation process: hundreds of different rainy events, randomly drawn up to extreme values, are simulated independently in the hydrological conditions of each day on which a rainy event has actually been observed. This generates an exhaustive set of crossings between precipitation and soil saturation hazards, and builds a complete distribution of flood discharges up to extreme quantiles. The hydrological model used within SCHADEX, the MORDOR model (Garçon, 1996), is a lumped model, which implies that hydrological processes, e.g. rainfall and soil saturation, are assumed to be homogeneous throughout the catchment. Snow processes are nevertheless represented in relation to altitude. This hypothesis of homogeneity is questionable, especially as the size of the catchment increases, or in areas of highly contrasted climatology (like mountainous areas). Conversely, modeling the catchment with a fully distributed approach would cause different problems, in particular distributing the rainfall-runoff model parameters through space and, within the SCHADEX stochastic framework, generating extreme rain fields with credible spatio-temporal features. An intermediate solution is presented here. It provides a better representation of the hydro-climatic diversity of the studied catchment (especially regarding flood processes) while keeping the SCHADEX simulation framework. It consists of dividing the catchment into several, more homogeneous sub-catchments. Rainfall-runoff models are parameterized individually for each of them, using local discharge data if available. A first SCHADEX simulation is done at the global scale, which allows a probability to be assigned to each simulated event, based mainly on the global areal rainfall drawn for the event (see Paquet et al., 2013 for details). Then the

  9. Global sensitivity and uncertainty analysis of the nitrate leaching and crop yield simulation under different water and nitrogen management practices

    Science.gov (United States)

    Agricultural system models have become important tools in studying water and nitrogen (N) dynamics, as well as crop growth, under different management practices. Complexity in input parameters often leads to significant uncertainty when simulating dynamic processes such as nitrate leaching or crop y...

  10. An exploration of the option space in student design projects for uncertainty and sensitivity analysis with performance simulation

    NARCIS (Netherlands)

    Struck, C.; Wilde, de P.J.C.J.; Hopfe, C.J.; Hensen, J.L.M.

    2008-01-01

    This paper describes research conducted to gather empirical evidence on the extent, character and content of the option space in building design projects, from the perspective of a climate engineer using building performance simulation for concept evaluation. The goal is to support uncertainty analysis

  11. Durango: Scalable Synthetic Workload Generation for Extreme-Scale Application Performance Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Carothers, Christopher D. [Rensselaer Polytechnic Institute (RPI); Meredith, Jeremy S. [ORNL; Blanco, Marc [Rensselaer Polytechnic Institute (RPI); Vetter, Jeffrey S. [ORNL; Mubarak, Misbah [Argonne National Laboratory; LaPre, Justin [Rensselaer Polytechnic Institute (RPI); Moore, Shirley V. [ORNL

    2017-05-01

    Performance modeling of extreme-scale applications on accurate representations of potential architectures is critical for designing next-generation supercomputing systems, because it is impractical to construct prototype systems at scale with new network hardware in order to explore designs and policies. However, these simulations often rely on static application traces that can be difficult to work with because of their size and lack of flexibility to extend or scale up without rerunning the original application. To address this problem, we have created a new technique for generating scalable, flexible workloads from real applications, and we have implemented a prototype, called Durango, that combines a proven analytical performance modeling language, Aspen, with the massively parallel HPC network modeling capabilities of the CODES framework. Our models are compact, parameterized and representative of real applications with computation events. They are not resource-intensive to create and are portable across simulator environments. We demonstrate the utility of Durango by simulating the LULESH application in the CODES simulation environment on several topologies and show that Durango is practical to use for simulation without loss of fidelity, as quantified by simulation metrics. During our validation of Durango's generated communication model of LULESH, we found that the original LULESH miniapp code had a latent bug where the MPI_Waitall operation was used incorrectly. This finding underscores the potential need for a tool such as Durango, beyond its benefits for flexible workload generation and modeling. Additionally, we demonstrate the efficacy of Durango's direct integration approach, which links Aspen into CODES as part of the running network simulation model. Here, Aspen generates the application-level computation timing events, which in turn drive the start of a network communication phase. Results show that Durango's performance scales well when

  12. Kalman filter application to mitigate the errors in the trajectory simulations due to the lunar gravitational model uncertainty

    International Nuclear Information System (INIS)

    Gonçalves, L D; Rocco, E M; De Moraes, R V; Kuga, H K

    2015-01-01

    This paper aims to simulate part of the orbital trajectory of the Lunar Prospector mission in order to analyze the relevance of using a Kalman filter to estimate the trajectory. This study considers the disturbance due to the lunar gravitational potential, using one of the most recent models, the LP100K model, which is based on spherical harmonics and considers degree and order up to the value 100. In order to simplify the expression of the gravitational potential and, consequently, to reduce the computational effort required in the simulation, lower values for degree and order are used in some cases. To this end, an analysis is made of the error introduced into the simulations when such values of degree and order are used to propagate the spacecraft trajectory and control. This analysis was done using the standard deviation that characterizes the uncertainty for each of the values of degree and order used in the LP100K model for the satellite orbit. With knowledge of the uncertainty of the adopted gravity model, lunar orbital trajectory simulations may be accomplished considering these uncertainty values. Furthermore, a Kalman filter was used, which takes into account the sensor uncertainty that defines the satellite position at each step of the simulation, as well as the model uncertainty, by means of the characteristic variance of the truncated gravity model. Thus, this procedure represents an effort to approximate the results obtained using lower values for the degree and order of the spherical harmonics to the results that would be attained if the maximum accuracy of the LP100K model were adopted. A comparison is also made between the error in the satellite position when the Kalman filter is used and when it is not. The data for the comparison were obtained from the standard deviation in the velocity increment of the space vehicle. (paper)
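
    The role the filter plays here, fusing an imperfect (truncated) dynamics model with noisy position measurements, can be shown with a one-dimensional sketch. All numbers are assumed for illustration; the mission setup is far more complex:

```python
import numpy as np

rng = np.random.default_rng(8)

# Estimate a position whose dynamics model is slightly biased (standing in
# for a truncated gravity model), fusing noisy position measurements.
n, dt = 200, 1.0
true_v, model_v = 1.00, 0.98     # truncated model underestimates the velocity
q, r = 0.05 ** 2, 0.5 ** 2       # model-error and sensor variances (assumed)

truth = true_v * dt * np.arange(n)
meas = truth + rng.normal(0, np.sqrt(r), n)

x, P = 0.0, 1.0                  # state estimate and its variance
err = []
for k in range(n):
    # Predict with the imperfect dynamics; inflate P by the model error q.
    x, P = x + model_v * dt, P + q
    # Update with the measurement.
    K = P / (P + r)
    x, P = x + K * (meas[k] - x), (1 - K) * P
    err.append(abs(x - truth[k]))
print("mean position error, filtered :", np.mean(err))
print("mean position error, raw meas.:", np.mean(np.abs(meas - truth)))
```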

  13. NIST ThermoData Engine: Extension to Solvent Design and Propagation of Uncertainties for Process Simulation

    DEFF Research Database (Denmark)

    Diky, Vladimir; Chirico, Robert D.; Muzny, Chris

    ThermoData Engine (TDE, NIST Standard Reference Databases 103a and 103b) is the first product that implements the concept of Dynamic Data Evaluation in the fields of thermophysics and thermochemistry, which includes maintaining the comprehensive and up-to-date database of experimentally measured … uncertainties, curve deviations, and inadequacies of the models. Uncertainty analysis shows relative contributions to the total uncertainty from each component and pair of components.

  14. Treatment simulation approaches for the estimation of the distributions of treatment quality parameters generated by geometrical uncertainties

    International Nuclear Information System (INIS)

    Baum, C; Alber, M; Birkner, M; Nuesslin, F

    2004-01-01

    Geometric uncertainties arise during treatment planning and treatment, and mean that dose-dependent parameters such as EUD are random variables with a patient-specific probability distribution. Treatment planning with highly conformal treatment techniques such as intensity modulated radiation therapy requires new evaluation tools which allow us to estimate the influence of geometrical uncertainties on the probable treatment dose for a planned dose distribution. Monte Carlo simulations of treatment courses, with recalculation of the dose according to the daily geometric errors, are a gold standard for such an evaluation. Distribution histograms, which show the relative frequency of a treatment quality parameter in the treatment simulations, can be used to evaluate the potential risks and chances of a planned dose distribution. As treatment simulations with dose recalculation are very time-consuming for sufficient statistical accuracy, it is proposed to perform treatment simulations in the dose parameter space, where the result is mainly determined by the systematic and random components of the geometrical uncertainties. Comparison of the parameter-space simulation method with the gold standard for prostate cases and a head-and-neck case shows good agreement, as long as the number of fractions is high enough and the influence of tissue inhomogeneities and surface curvature on the dose is small.

  15. Uncertainty and Sensitivity of Neutron Kinetic Parameters in the Dynamic Response of a PWR Rod Ejection Accident Coupled Simulation

    Directory of Open Access Journals (Sweden)

    C. Mesado

    2012-01-01

    Full Text Available In nuclear safety analysis, it is very important to be able to simulate the different transients that can occur in a nuclear power plant with very high accuracy. Although best-estimate codes can simulate the transients and provide realistic system responses, the use of non-exact models, together with assumptions and estimations, is a source of uncertainties which must be properly evaluated. This paper describes a Rod Ejection Accident (REA) simulated using the coupled code RELAP5/PARCSv2.7 with a perturbation on the cross-section sets in order to determine the uncertainties in the macroscopic neutronic information. The procedure used to perform the uncertainty and sensitivity (U&S) analysis is a sampling-based method which is easy to implement and allows different procedures for the sensitivity analyses, despite its high computational time. DAKOTA-Jaguar is the software toolkit selected for the U&S analysis presented in this paper. The sample size is determined by applying Wilks’ formula for double tolerance limits with 95% uncertainty and 95% statistical confidence for the output variables. Each sample has a corresponding set of perturbations that modify the cross-section sets used by PARCS. Finally, the tolerance intervals of the output variables are obtained by nonparametric statistical methods.
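
    The sample-size rule cited above can be reproduced in a few lines; this is the standard first-order Wilks formula for two-sided (double) tolerance limits, not code from the paper.

        def wilks_two_sided(coverage=0.95, confidence=0.95):
            """Smallest N such that the (min, max) of N runs bounds `coverage`
            of the output distribution with probability `confidence`."""
            n = 2
            while 1 - n * coverage**(n - 1) + (n - 1) * coverage**n < confidence:
                n += 1
            return n

        print(wilks_two_sided())  # -> 93 code runs for the 95%/95% double limit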

  16. Make or buy decision considering uncertainty based on fuzzy logic using simulation and multiple criteria decision making

    Directory of Open Access Journals (Sweden)

    Ali Mohtashami

    2013-01-01

    Full Text Available The make-or-buy decision has always been a challenge for decision makers. In this paper, a methodology is proposed to address this challenge. The methodology is capable of evaluating make-or-buy decisions under uncertainty, using fuzzy logic and simulation approaches to represent that uncertainty. It can be applied to parts with multi-stage manufacturing processes and different suppliers, and therefore provides a scale for decision making that ranges from full outsourcing to full in-house manufacturing, together with selection of the appropriate supplier.

  17. Evaluating Uncertainty of Runoff Simulation using SWAT model of the Feilaixia Watershed in China Based on the GLUE Method

    Science.gov (United States)

    Chen, X.; Huang, G.

    2017-12-01

    In recent years, distributed hydrological models have been widely used in storm water management, water resources protection, and related fields. How to evaluate model uncertainty reasonably and efficiently has therefore become a topic of current interest. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range of the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results show that, for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is smaller than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In calibration and validation, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. It can therefore be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is an effective way to capture and evaluate simulation uncertainty.
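
    A minimal sketch of the GLUE procedure described above, under simplifying assumptions: a toy linear-reservoir model stands in for SWAT, and plain unweighted percentile bounds replace the likelihood-weighted ones.

        import numpy as np

        rng = np.random.default_rng(1)

        def nash_sutcliffe(sim, obs):
            return 1 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

        def glue(run_model, obs, ranges, n_samples=2_000, ns_threshold=0.5):
            """Sample parameters uniformly from `ranges`, keep 'behavioral' sets
            with NS above the threshold, return count and 95% bounds."""
            behavioral = []
            for _ in range(n_samples):
                theta = rng.uniform(ranges[:, 0], ranges[:, 1])
                sim = run_model(theta)
                if nash_sutcliffe(sim, obs) > ns_threshold:
                    behavioral.append(sim)
            sims = np.stack(behavioral)
            return len(behavioral), np.percentile(sims, 2.5, 0), np.percentile(sims, 97.5, 0)

        rain = rng.gamma(2.0, 2.0, 200)          # fixed synthetic rain series
        def run_model(theta):                    # toy stand-in for a SWAT run
            k, frac = theta                      # recession constant, runoff fraction
            q = np.zeros_like(rain)
            for t in range(1, len(rain)):
                q[t] = k * q[t - 1] + frac * rain[t]
            return q

        obs = run_model(np.array([0.6, 0.3])) + rng.normal(0, 0.2, 200)
        n_beh, lower, upper = glue(run_model, obs, np.array([[0.3, 0.9], [0.1, 0.5]]))
        print(n_beh, "behavioral sets; P-factor:",
              round(float(np.mean((obs >= lower) & (obs <= upper))), 2))

    The P-factor printed here is the fraction of observations falling inside the uncertainty bounds, the same quantity the abstract reports against the R-factor.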

  18. The Snow Darkening Effect and the Simulation of Extremes over Eurasia

    Science.gov (United States)

    Yasunari, T. J.; Lau, W. K. M.; Kim, K. M.; Koster, R. D.

    2014-12-01

    We have recently completed an updated ensemble of NASA GEOS-5 simulations with a snow-darkening module (now officially named the GOddard SnoW Impurity Module, or GOSWIM, summarized in Yasunari et al., SOLA, 2014; see https://www.jstage.jst.go.jp/article/sola/10/0/10_2014-011/_article). This ensemble ("snow-darkening case", SDC), consisting of ten parallel simulations (differing only in their initial conditions) spanning 2002-2011, is compared here to a corresponding ensemble with all snow-darkening effects disabled ("non-SDC"). We focus particularly on the production of extremes associated with snow darkening. To identify regions of interest over Eurasia, we first rank the 100 separate spring (MAM) or summer (JJA) values of a given quantity in each combined 100-year dataset (i.e., 10 years x 10 ensemble members), and then compute the differences of the 90th percentile values between SDC and non-SDC. For spring, large differences are seen in a specific area of Europe and Central Asia (ECA), and for summer, in an area of the Russian Arctic (RA). The next step in our analysis addresses the month-by-month variation of the percentile differences within these identified regions: for each month, and for a given meteorological or hydrological variable, we determine the SDC percentile that corresponds to the 90th percentile value found in the non-SDC ensemble. For example, in the RA domain, the surface air temperature corresponding to the 90th percentile in the non-SDC ensemble has a consistently lower percentile in the SDC data - not only during spring and summer, through the increased absorption of radiation by snow polluted with dust, black carbon, and organic carbon, but also in the post-snow season, through some form of memory in the system. The temperature extremes in the SDC ensemble thus exceed those of the non-SDC ensemble throughout the year. This analysis supports the importance of considering the snow-darkening effect in global models.
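
    A minimal sketch of the percentile-mapping step described above, with synthetic arrays standing in for the pooled 100-value (10-year x 10-member) GEOS-5 samples:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        non_sdc = rng.normal(0.0, 1.0, 100)     # e.g., 100 pooled JJA values, non-SDC
        sdc = rng.normal(0.5, 1.0, 100)         # same variable with snow darkening on

        threshold = np.percentile(non_sdc, 90)  # the non-SDC 90th-percentile value
        pct_in_sdc = stats.percentileofscore(sdc, threshold)
        # A result below 90 means the non-SDC extreme is exceeded more often in
        # the snow-darkening ensemble:
        print(f"non-SDC P90 value sits at the {pct_in_sdc:.0f}th percentile of SDC")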

  19. Informal uncertainty analysis (GLUE) of continuous flow simulation in a hybrid sewer system with infiltration inflow - Consistency of containment ratios in calibration and validation?

    DEFF Research Database (Denmark)

    Breinholt, Anders; Grum, Morten; Madsen, Henrik

    2013-01-01

    to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction...... rain inputs and more accurate flow observations to reduce parameter and model simulation uncertainty. © Author(s) 2013.

  20. Regional air-sea coupled model simulation for two types of extreme heat in North China

    Science.gov (United States)

    Li, Donghuan; Zou, Liwei; Zhou, Tianjun

    2018-03-01

    Extreme heat (EH) over North China (NC) is affected by both large-scale circulations and local topography, and can be categorized into foehn-favorable and no-foehn types. In this study, the performance of a regional coupled model in simulating EH over NC was examined. The effects of regional air-sea coupling were also investigated by comparing the results with the corresponding atmosphere-only regional model. On foehn-favorable (no-foehn) EH days, a barotropic cyclonic (anticyclonic) anomaly is located to the northeast (northwest) of NC, while anomalous northwesterlies (southeasterlies) prevail over NC in the lower troposphere. In the uncoupled simulation, a barotropic anticyclonic bias occurs over China on both foehn-favorable and no-foehn EH days, and the lower-tropospheric northwesterlies on foehn-favorable EH days are not evident. These biases are significantly reduced in the regional coupled simulation, especially on foehn-favorable EH days, with wind-anomaly skill scores improving from 0.38 to 0.47, 0.47 to 0.61, and 0.38 to 0.56 for horizontal winds at 250, 500, and 850 hPa, respectively. Compared with the uncoupled simulation, the coupled simulation better reproduces the longitudinal position of the Northwest Pacific subtropical high (NPSH) and the spatial pattern of the low-level monsoon flow over East Asia. The anticyclonic bias over China is therefore clearly reduced, and the proportion of EH days characterized by an anticyclonic anomaly is more realistic. The improvements in the regional coupled model indicate that it is a promising choice for future projections of EH over NC.

  1. Kinetic turbulence simulations at extreme scale on leadership-class systems

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Bei [Princeton Univ., Princeton, NJ (United States); Ethier, Stephane [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Tang, William [Princeton Univ., Princeton, NJ (United States); Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Williams, Timothy [Argonne National Lab. (ANL), Argonne, IL (United States); Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Madduri, Kamesh [The Pennsylvania State Univ., University Park, PA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Oliker, Leonid [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2013-01-01

    Reliable predictive simulation capability addressing confinement properties in magnetically confined fusion plasmas is critically important for ITER, a 20 billion dollar international burning plasma device under construction in France. The complex study of kinetic turbulence, which can severely limit the energy confinement and impact the economic viability of fusion systems, requires simulations at extreme scale for such an unprecedented device size. Our newly optimized, global, ab initio particle-in-cell code, which solves the nonlinear equations underlying gyrokinetic theory, achieves excellent performance with respect to "time to solution" at the full capacity of the IBM Blue Gene/Q, on the 786,432 cores of Mira at ALCF and recently on the 1,572,864 cores of Sequoia at LLNL. Recent multithreading and domain decomposition optimizations in the new GTC-P code represent critically important software advances for modern, low-memory-per-core systems, enabling routine simulations at unprecedented size (130 million grid points at ITER scale) and resolution (65 billion particles).

  2. When 1+1 can be >2: Uncertainties compound when simulating climate, fisheries and marine ecosystems

    Science.gov (United States)

    Evans, Karen; Brown, Jaclyn N.; Sen Gupta, Alex; Nicol, Simon J.; Hoyle, Simon; Matear, Richard; Arrizabalaga, Haritz

    2015-03-01

    Multi-disciplinary approaches that combine oceanographic, biogeochemical, ecosystem, fisheries population and socio-economic models are vital tools for modelling whole ecosystems. Interpreting the outputs from such complex models requires an appreciation of the many different types of modelling frameworks being used and their associated limitations and uncertainties. Both users and developers of particular model components will often have little involvement or understanding of other components within such modelling frameworks. Failure to recognise limitations and uncertainties associated with components, and how these uncertainties might propagate throughout modelling frameworks, can potentially result in poor advice for resource management. Unfortunately, many of the current integrative frameworks do not propagate the uncertainties of their constituent parts. In this review, we outline the major components of a generic whole-of-ecosystem modelling framework incorporating the external pressures of climate and fishing. We discuss the limitations and uncertainties associated with each component of such a modelling system, along with key research gaps. Major uncertainties in modelling frameworks are broadly categorised into those associated with (i) deficient knowledge of the interactions of climate and ocean dynamics with marine organisms and ecosystems; (ii) lack of observations to assess and advance modelling efforts; and (iii) an inability to predict with confidence natural ecosystem variability and longer-term changes resulting from external drivers (e.g., greenhouse gases, fishing effort) and the consequences for marine ecosystems. As a result of these uncertainties and intrinsic differences in the structure and parameterisation of models, users are faced with considerable challenges in making appropriate choices about which models to use. We suggest research directions required to address these uncertainties, and caution against overconfident predictions.

  3. Propagation of void fraction uncertainty measures in the RETRAN-3D simulation of the Peach Bottom turbine trip

    International Nuclear Information System (INIS)

    Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh

    2011-01-01

    The paper describes the propagation of void fraction uncertainty, as quantified by employing a novel methodology developed at the Paul Scherrer Institut, in the RETRAN-3D simulation of the Peach Bottom turbine trip test. Since the transient considered is characterized by a strong coupling between thermal-hydraulics and neutronics, the accuracy of the void fraction model has a very important influence on the prediction of the power history and, in particular, of the maximum power reached. It has been shown that the objective measures used for the void fraction uncertainty, based on the direct comparison between experimental and predicted values extracted from a database of appropriate separate-effect tests, provide power uncertainty bands that are narrower and more realistic than those based, for example, on expert opinion. The applicability of such an approach to best-estimate nuclear power plant transient analysis has thus been demonstrated.

  4. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Optimization and simulation are popular operations research and systems analysis tools for energy policy modeling. This dissertation addresses three important questions concerning the use of these tools for energy market (and electricity market) modeling and planning under uncertainty. (1) What is the value of information and cost of disregarding different sources of uncertainty for the U.S. energy economy? (2) Could model-based calculations of the performance (social welfare) of competitive and oligopolistic market equilibria be optimistically biased due to uncertainties in objective function coefficients? (3) How do alternative sloped demand curves perform in the PJM capacity market under economic and weather uncertainty, and how do curve adjustment and cost dynamics affect the capacity market outcomes? To address the first question, two-stage stochastic optimization is utilized in the U.S. national MARKAL energy model; then the value of information and cost of ignoring uncertainty are estimated for three uncertainties: carbon cap policy, load growth, and natural gas prices. When an uncertainty is important, explicitly considering those risks when making investments will result in better performance in expectation (a positive expected cost of ignoring uncertainty). Furthermore, eliminating the uncertainty would improve strategies even further, meaning that improved forecasts of future conditions are valuable (i.e., a positive expected value of information). Also, the value of policy coordination shows the difference between a strategy developed under the incorrect assumption of no carbon cap and a strategy correctly anticipating imposition of such a cap. For the second question, game theory models are formulated and the existence of optimistic (positive) biases in market equilibria (both competitive and oligopoly markets) is proved, in that calculated social welfare and producer profits will, in expectation, exceed the values that will actually be received

  5. Wearing a Wetsuit Alters Upper Extremity Motion during Simulated Surfboard Paddling.

    Directory of Open Access Journals (Sweden)

    J A Nessler

    Full Text Available Surfers often wear wetsuits while paddling in the ocean. This neoprene covering may be beneficial to upper extremity movement by helping to improve proprioceptive acuity, or it may be detrimental by providing increased resistance. The purpose of this study was to evaluate the effects of wearing a wetsuit on muscle activation, upper extremity motion, heart rate, and oxygen consumption during simulated surfboard paddling in the laboratory. Twelve male recreational surfers performed two paddling trials at a constant workload on a swim bench ergometer, both with and without a wetsuit. Kinematic data and EMG were acquired from the right arm via motion capture, and oxygen consumption and heart rate were recorded with a metabolic cart and heart rate monitor. Wearing a wetsuit had no significant effect on oxygen consumption or heart rate. A significant increase in EMG activation was observed for the middle deltoid but not for any of the other shoulder muscles evaluated. Finally, approximate entropy and estimates of the maximum Lyapunov exponent increased significantly for the vertical trajectory of the right wrist (i.e., stroke height) when a wetsuit was worn. These results suggest that a 2 mm wetsuit has little effect on the energy cost of paddling at lower workloads but does affect arm motion. These changes may be the result of enhanced proprioceptive acuity due to mechanical compression from the wetsuit.
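
    For readers unfamiliar with the regularity statistic used above, a standard approximate-entropy (ApEn) implementation in the Pincus formulation follows; this is a sketch, not the study's analysis code, and the wrist-height series is synthetic.

        import numpy as np

        def apen(u, m=2, r_factor=0.2):
            """Approximate entropy of series u with embedding m and tolerance
            r = r_factor * std(u); higher ApEn means a less regular signal."""
            u = np.asarray(u, dtype=float)
            r = r_factor * u.std()

            def phi(m):
                n = len(u) - m + 1
                templates = np.array([u[i:i + m] for i in range(n)])
                # Chebyshev distance between all template pairs:
                dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
                return np.log((dist <= r).mean(axis=1)).mean()

            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(3)
        wrist_height = np.sin(np.linspace(0, 40, 400)) + 0.1 * rng.normal(size=400)
        print(f"ApEn = {apen(wrist_height):.3f}")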

  6. Development and simulation of a passive upper extremity orthosis for amyoplasia

    DEFF Research Database (Denmark)

    Jensen, Erik Føge; Raunsbæk, Joakim; Lund, Jan Nørgaard

    2018-01-01

    Introduction People who are born with arthrogryposis multiplex congenita are typically not able to perform activities of daily living (ADL) due to decreased muscle mass, joint contractures and unnatural upper extremity positioning. They are, therefore, potential users of an assistive device capable...... Results For a given configuration using a mono- and a bi-articular spring, the simulations showed that spring stiffnesses of 400 N/m and 1029 N/m, respectively, were able to lower the maximal muscle activity estimated by the musculoskeletal model to a level at which the 10 postures can be realized...... Conclusion By augmenting residual muscle strength with a partially gravity-balanced passive orthosis, ADLs may be achievable for people with arthrogryposis multiplex congenita....

  7. Inverse uncertainty quantification of reactor simulations under the Bayesian framework using surrogate models constructed by polynomial chaos expansion

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Xu, E-mail: xuwu2@illinois.edu; Kozlowski, Tomasz

    2017-03-15

    Modeling and simulation are naturally augmented by extensive Uncertainty Quantification (UQ) and sensitivity analysis requirements in nuclear reactor system design, in which uncertainties must be quantified in order to prove that the investigated design stays within acceptance criteria. Historically, expert judgment has been used to specify the nominal values, probability density functions, and upper and lower bounds of the random input parameters of the simulation code for the forward UQ process. The purpose of this paper is to replace such ad hoc expert judgment of the statistical properties of input model parameters with an inverse UQ process. Inverse UQ seeks statistical descriptions of the model random input parameters that are consistent with the experimental data. Bayesian analysis is used to establish the inverse UQ problem based on experimental data, with systematically and rigorously derived surrogate models based on Polynomial Chaos Expansion (PCE). The methods developed here are demonstrated with the Point Reactor Kinetics Equation (PRKE) coupled with a lumped-parameter thermal-hydraulics feedback model. Three input parameters (external reactivity, Doppler reactivity coefficient, and coolant temperature coefficient) are modeled as uncertain. Their uncertainties are inversely quantified based on synthetic experimental data. Compared with direct numerical simulation, the PCE surrogate model shows high efficiency and accuracy. In addition, inverse UQ with Bayesian analysis can calibrate the random input parameters such that the simulation results are in better agreement with the experimental data.
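
    A minimal sketch of the workflow described above, under strong simplifications: a toy two-output forward model, a plain least-squares polynomial fit standing in for the PCE surrogate, and a random-walk Metropolis sampler inferring one uncertain input from synthetic data.

        import numpy as np

        rng = np.random.default_rng(4)
        forward = lambda t: np.array([np.exp(0.3 * t), t**2 + t])   # toy forward model

        # Surrogate: polynomial least-squares fit to a handful of training runs.
        train_t = np.linspace(-2, 2, 20)
        train_y = np.stack([forward(t) for t in train_t])
        coeffs = [np.polyfit(train_t, train_y[:, k], 4) for k in range(2)]
        surrogate = lambda t: np.array([np.polyval(c, t) for c in coeffs])

        data = forward(0.7) + rng.normal(0, 0.05, 2)    # synthetic "experiment"
        log_post = lambda t: (-0.5 * np.sum((surrogate(t) - data)**2) / 0.05**2
                              - 0.5 * t**2)             # Gaussian likelihood + prior

        theta, chain = 0.0, []
        for _ in range(20_000):                         # random-walk Metropolis
            prop = theta + rng.normal(0, 0.2)
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
            chain.append(theta)
        post = np.array(chain[5_000:])
        print(f"posterior: {post.mean():.2f} +/- {post.std():.2f}")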

  8. Performance and Uncertainty Evaluation of Snow Models on Snowmelt Flow Simulations over a Nordic Catchment (Mistassibi, Canada

    Directory of Open Access Journals (Sweden)

    Magali Troin

    2015-11-01

    Full Text Available An analysis of hydrological response to a multi-model approach based on an ensemble of seven snow models (SM; degree-day and mixed degree-day/energy-balance models) coupled with three hydrological models (HM) is presented for a snowmelt-dominated basin in Canada. The present study compares the performance and reliability of the different SM-HM combinations at simulating snowmelt flows over the 1961–2000 historical period. The multi-model approach also allows evaluation of the uncertainties associated with the structure of the SM-HM ensemble, to better predict river flows in Nordic environments. The 20-year calibration shows a satisfactory performance of the ensemble of 21 SM-HM combinations at simulating daily discharges and snow water equivalents (SWEs), with low streamflow volume biases. The validation of the ensemble of 21 SM-HM combinations is conducted over a separate 20-year period. Performance is similar to the calibration in simulating daily discharges and SWEs, again with low model biases for streamflow. The spring-snowmelt-generated peak flow is captured only in timing by the ensemble of 21 SM-HM combinations. The results for specific hydrologic indicators show that the uncertainty related to the choice of HM in the SM-HM combinations cannot be neglected when simulating snowmelt flows. The selection of the SM plays a larger role than the choice of the SM approach (degree-day versus mixed degree-day/energy balance) in simulating spring flows. Overall, the snow models contribute a low share of the total uncertainty in hydrological modeling for snow hydrology studies.

  9. Evaluation of uncertainties in mean and extreme precipitation under climate change for northwestern Mediterranean watersheds from high-resolution Med and Euro-CORDEX ensembles

    Science.gov (United States)

    Colmet-Daage, Antoine; Sanchez-Gomez, Emilia; Ricci, Sophie; Llovel, Cécile; Borrell Estupina, Valérie; Quintana-Seguí, Pere; Llasat, Maria Carmen; Servat, Eric

    2018-01-01

    The climate change impact on mean and extreme precipitation events in the northern Mediterranean region is assessed using high-resolution EuroCORDEX and MedCORDEX simulations. The focus is on three catchments, Lez and Aude in France and Muga in northeastern Spain, and eight pairs of global and regional climate models are analyzed against the SAFRAN product. First, model skill is evaluated in terms of bias in the annual precipitation cycle over the historical period. Then, future changes in extreme precipitation under two emission scenarios are estimated by computing past/future change coefficients of quantile-ranked model precipitation outputs. Over the 1981-2010 period, cumulative precipitation is overestimated by most models over the mountainous regions and underestimated over the coastal regions, in autumn and at higher-order quantiles. The ensemble mean and spread for the future period remain unchanged under the RCP4.5 scenario and decrease under the RCP8.5 scenario. Extreme precipitation events are intensified over the three catchments, with a smaller ensemble spread under RCP8.5 revealing more robust changes, especially in the later part of the 21st century.

  10. Impact of Optimized Land Surface Parameters on the Land-Atmosphere Coupling in WRF Simulations of Dry and Wet Extremes

    Science.gov (United States)

    Kumar, S.; Santanello, J. A.; Peters-Lidard, C. D.; Harrison, K.

    2011-12-01

    Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface temperature and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry (2006) and wet (2007) conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through the use of a new optimization and uncertainty module in NASA's Land Information System (LIS-OPT), whereby parameter sets are calibrated in the Noah land surface model and classified according to the land cover and soil type mapping of the observations and the full domain. The impact of the calibrated parameters on a) the spinup of land surface states used as initial conditions, and b) the heat and moisture fluxes of the coupled (LIS-WRF) simulations is then assessed in terms of ambient weather, PBL budgets, and precipitation, along with L-A coupling diagnostics. In addition, the sensitivity of this approach to the period of calibration (dry, wet, normal) is investigated. Finally, tradeoffs between computational tractability and scientific validity (e.g., relating to the representation of the spatial dependence of parameters) and the feasibility of calibrating to multiple observational datasets are also discussed.

  11. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    Science.gov (United States)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach to optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response-surface-based methods for characterization of mixture-dependent variables. This methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations, and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within the design space. Illustrative case scenarios were developed and assessed using this analytic approach, including fully and partially constrained operational condition sets over all of the design mixture space. In addition, optimization sets were performed across an operationally representative region of operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in
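
    A minimal sketch of the propagation step described above: a quadratic response surface with a two-factor interaction stands in for the regression-rate model, and Monte Carlo sampling of the uncertain inputs yields a dispersed regression rate. All coefficients and uncertainty levels are illustrative placeholders.

        import numpy as np

        rng = np.random.default_rng(5)

        def regression_rate(x1, x2, eps):
            # r = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 + eps
            return (1.2 + 0.30 * x1 + 0.15 * x2
                    - 0.05 * x1**2 - 0.02 * x2**2 + 0.08 * x1 * x2 + eps)

        n = 100_000
        x1 = rng.normal(1.0, 0.10, n)    # e.g., scaled oxidizer mass fraction
        x2 = rng.normal(0.5, 0.05, n)    # e.g., scaled chamber pressure
        eps = rng.normal(0.0, 0.02, n)   # response-surface lack-of-fit uncertainty
        r = regression_rate(x1, x2, eps)
        print(f"mean rate {r.mean():.3f}; 95% interval "
              f"[{np.percentile(r, 2.5):.3f}, {np.percentile(r, 97.5):.3f}]")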

  12. Validation and uncertainty quantification of Fuego simulations of calorimeter heating in a wind-driven hydrocarbon pool fire.

    Energy Technology Data Exchange (ETDEWEB)

    Domino, Stefan Paul; Figueroa, Victor G.; Romero, Vicente Jose; Glaze, David Jason; Sherman, Martin P.; Luketa-Hanlin, Anay Josephine

    2009-12-01

    The objective of this work is to perform an uncertainty quantification (UQ) and model validation analysis of simulations of tests in the cross-wind test facility (XTF) at Sandia National Laboratories. In these tests, a calorimeter was subjected to a fire and the thermal response was measured via thermocouples. The UQ and validation analysis pertains to the experimental and predicted thermal response of the calorimeter. The calculations were performed using Sierra/Fuego/Syrinx/Calore, an Advanced Simulation and Computing (ASC) code capable of predicting object thermal response to a fire environment. Based on the validation results at eight diversely representative thermocouple locations on the calorimeter, the predicted calorimeter temperatures effectively bound the experimental temperatures. This post-validates Sandia's first integrated use of fire modeling with thermal response modeling and associated uncertainty estimates in an abnormal-thermal QMU analysis.

  13. Analysis of Uncertainties in Protection Heater Delay Time Measurements and Simulations in Nb$_{3}$Sn High-Field Accelerator Magnets

    CERN Document Server

    Salmi, Tiina; Marchevsky, Maxim; Bajas, Hugo; Felice, Helene; Stenvall, Antti

    2015-01-01

    The quench protection of superconducting high-field accelerator magnets is presently based on protection heaters, which are activated upon quench detection to accelerate the quench propagation within the winding. Estimations of the heater delay to initiate a normal zone in the coil are essential for the protection design. During the development of Nb3Sn magnets for the LHC luminosity upgrade, protection heater delays have been measured in several experiments, and a new computational tool CoHDA (Code for Heater Delay Analysis) has been developed for heater design. Several computational quench analyses suggest that the efficiency of the present heater technology is on the borderline of protecting the magnets. Quantifying the inevitable uncertainties related to the measured and simulated delays is therefore of pivotal importance. In this paper, we analyze the uncertainties in the heater delay measurements and simulations using data from five impregnated high-field Nb3Sn magnets with different heater geometries. ...

  14. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amounts of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described and methods...... into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical......

  16. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.; Schulz, Karl W.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently

  17. Tactical Decision Making under Categorical Uncertainty with Applications to Modeling and Simulation

    National Research Council Canada - National Science Library

    Kemmerer, Kacey E

    2008-01-01

    ...) and individual differences affect response time in decision-making tasks. The researchers elicited real-world tactical scenarios from veterans of Operation Enduring Freedom and Operation Iraqi Freedom in which uncertainty was present...

  18. Observed and simulated hydrologic response for a first-order catchment during extreme rainfall 3 years after wildfire disturbance

    Science.gov (United States)

    Ebel, Brian A.; Rengers, Francis K.; Tucker, Gregory E.

    2016-01-01

    Hydrologic response to extreme rainfall in disturbed landscapes is poorly understood because of the paucity of measurements. A unique opportunity presented itself when extreme rainfall in September 2013 fell on a headwater catchment that had burned in a 2010 wildfire. We combined field observations (i.e., soil-hydraulic properties, soil saturation from subsurface sensors, and estimated peak runoff during the extreme rainfall) with numerical simulations of runoff generation and subsurface hydrologic response during this event. The simulations were used to explore differences in runoff generation between the wildfire-affected headwater catchment and a simulated unburned case, and between uniform and spatially variable parameterizations of the soil-hydraulic properties that affect infiltration and runoff generation in burned landscapes. Despite 3 years of elapsed time since the 2010 wildfire, observations and simulations pointed to substantial surface runoff generation in the wildfire-affected headwater catchment by the infiltration-excess mechanism, while no surface runoff was generated in the unburned case. The surface runoff generation was the result of incomplete recovery of soil-hydraulic properties in the burned area, suggesting recovery takes longer than 3 years. Moreover, spatially variable soil-hydraulic property parameterizations produced longer-duration but lower-peak infiltration-excess runoff, compared to uniform parameterization, which may have important implications for hillslope sediment export and geomorphology during long-duration, extreme rainfall. The majority of the simulated surface runoff in the spatially variable cases came from connected near-channel contributing areas, a substantially smaller contributing area than in the uniform simulations.

  19. Simulating fuel behavior under transient conditions using FRAPTRAN and uncertainty analysis using Dakota

    International Nuclear Information System (INIS)

    Gomes, Daniel S.; Teixeira, Antonio S.

    2017-01-01

    Although regulatory agencies have shown special interest in incorporating best-estimate approaches in the fuel licensing process, fuel codes are currently licensed based only on deterministic limits, such as those in 10CFR50, and may therefore yield unrealistic safety margins. The concept of uncertainty analysis is employed to manage this risk more realistically. In this study, uncertainties were classified into two categories: probabilistic and epistemic (owing to a lack of pre-existing knowledge in this area). Fuel rods have three sources of uncertainty: manufacturing tolerances, boundary conditions, and physical models. The first step in analyzing the uncertainties is a statistical analysis of the input parameters used throughout the fuel code. The response obtained from this analysis must show proportional index correlations because the uncertainties are globally propagated. The Dakota toolkit was used to analyze the FRAPTRAN transient fuel code. The subsequent sensitivity analyses helped identify the key parameters with the highest correlation indices, including the peak cladding temperature and the time to cladding failure. The uncertainty analysis was performed for an IFA-650-5 fuel rod, in line with the tests performed in the Halden Project in Norway. The main objectives of the Halden project included studying the ballooning and rupture processes. The results of this experiment demonstrate the accuracy and applicability of the physical models in evaluating the thermal conductivity, the mechanical model, and the fuel swelling formulations. (author)
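
    A minimal sketch of the sampling-based sensitivity step described above: Latin hypercube samples of three illustrative fuel-rod inputs are ranked against the output by Spearman correlation, analogous to the correlation indices Dakota reports. The input names, ranges, and the toy response standing in for a FRAPTRAN run are all hypothetical.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)

        def lhs(n, dims):
            """Basic Latin hypercube sample on [0, 1)^dims."""
            return (np.argsort(rng.random((n, dims)), axis=0)
                    + rng.random((n, dims))) / n

        n = 200
        u = lhs(n, 3)
        gap = 50 + 100 * u[:, 0]        # pellet-cladding gap, um (hypothetical range)
        power = 30 + 20 * u[:, 1]       # peak linear power, kW/m (hypothetical)
        k_scale = 0.9 + 0.2 * u[:, 2]   # fuel thermal-conductivity multiplier

        # Toy response standing in for peak cladding temperature (K):
        pct = 600 + 3.0 * power + 0.5 * gap - 150 * (k_scale - 1) + rng.normal(0, 5, n)

        for name, x in [("gap", gap), ("power", power), ("k_scale", k_scale)]:
            rho, _ = stats.spearmanr(x, pct)
            print(f"{name:8s} rank correlation with PCT: {rho:+.2f}")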

  1. Improving simulated long-term responses of vegetation to temperature and precipitation extremes using the ACME land model

    Science.gov (United States)

    Ricciuto, D. M.; Warren, J.; Guha, A.

    2017-12-01

    While carbon and energy fluxes in current Earth system models generally have reasonable instantaneous responses to extreme temperature and precipitation events, they often do not adequately represent the long-term impacts of these events. For example, simulated net primary productivity (NPP) may decrease during an extreme heat wave or drought, but may recover rapidly to pre-event levels following the conclusion of the extreme event. However, field measurements indicate that long-lasting damage to leaves and other plant components often occur, potentially affecting the carbon and energy balance for months after the extreme event. The duration and frequency of such extreme conditions is likely to shift in the future, and therefore it is critical for Earth system models to better represent these processes for more accurate predictions of future vegetation productivity and land-atmosphere feedbacks. Here we modify the structure of the Accelerated Climate Model for Energy (ACME) land surface model to represent long-term impacts and test the improved model against observations from experiments that applied extreme conditions in growth chambers. Additionally, we test the model against eddy covariance measurements that followed extreme conditions at selected locations in North America, and against satellite-measured vegetation indices following regional extreme events.

  2. Rainfall and runoff Intensity-Duration-Frequency Curves for Washington State considering the change and uncertainty of observed and anticipated extreme rainfall and snow events

    Science.gov (United States)

    Demissie, Y. K.; Mortuza, M. R.; Li, H. Y.

    2015-12-01

    The observed and anticipated increasing trends in extreme storm magnitude and frequency, as well as the associated flooding risk in the Pacific Northwest, have highlighted the need to revise and update the local intensity-duration-frequency (IDF) curves commonly used for designing critical water infrastructure. In Washington State, much of the drainage system installed in the last several decades uses IDF curves that are outdated by as much as half a century, making the system inadequate and vulnerable to flooding, as seen more frequently in recent years. In this study, we have developed new, forward-looking rainfall and runoff IDF curves for each county in Washington State using recently observed and projected precipitation data. Regional frequency analysis, coupled with Bayesian uncertainty quantification and model averaging methods, was used to develop and update the rainfall IDF curves, which were then used in watershed and snow models to develop runoff IDF curves that explicitly account for the effects of snow and drainage characteristics. The resulting rainfall and runoff IDF curves provide more reliable, forward-looking, and spatially resolved characteristics of storm events that can assist local decision makers and engineers in reviewing and/or updating current design standards for urban and rural storm water management infrastructure, in order to reduce the potential ramifications of increasingly severe storms and resulting floods on existing and planned storm drainage and flood management systems in the state.
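
    A minimal sketch of the frequency-analysis core behind a rainfall IDF curve (the authors' regional Bayesian procedure is far richer; the annual-maximum series here are synthetic): fit a GEV distribution per duration and read off intensities at chosen return periods.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        durations_hr = [1, 6, 24]
        # Toy 40-year annual-maximum depth records (mm) for each duration:
        annual_max = {d: rng.gumbel(20 * d**0.4, 5.0, size=40) for d in durations_hr}

        return_periods = np.array([2, 10, 25, 50, 100])
        for d in durations_hr:
            c, loc, scale = stats.genextreme.fit(annual_max[d])
            depth = stats.genextreme.ppf(1 - 1.0 / return_periods, c, loc, scale)
            print(f"{d:3d} h:", np.round(depth / d, 1), "mm/hr at T =", return_periods)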

  3. Simulating the Impacts of Climate Extremes Across Sectors: The Case of the 2003 European Heat Wave

    Science.gov (United States)

    Schewe, J.; Zhao, F.; Reyer, C.; Breuer, L.; Coll, M.; Deryng, D.; Eddy, T.; Elliott, J. W.; Francois, L. M.; Friend, A. D.; Gerten, D.; Gosling, S.; Gudmundsson, L.; Huber, V.; Kim, H.; Lotze, H. K.; Orth, R.; Seneviratne, S. I.; Tittensor, D.; Vautard, R.; van Vliet, M. T. H.; Wada, Y.

    2017-12-01

    Increased occurrence of extreme climate or weather events is one of the most damaging consequences of global climate change today and in the future. Estimating the impacts of such extreme events across different human and natural systems is crucial for quantifying overall risks from climate change. Are current models fit for this task? Here we use the 2003 European heat wave and drought (EHW) as a historical analogue for comparable events in the future, and evaluate how accurately its impacts are reproduced by a multi-sectoral "super-ensemble" of state-of-the-art impacts models. Our study combines, for the first time, impacts on agriculture, freshwater resources, terrestrial and marine ecosystems, energy, and human health in a consistent multi-model framework. We identify key impacts of the 2003 EHW reported in the literature and/or recorded in publicly available databases, and examine how closely the models reproduce those impacts, applying the same measure of impact magnitude across different sectors. Preliminary results are mixed: While the EHW's impacts on water resources (streamflow) are reproduced well by most global hydrological models, not all crop and natural vegetation models reproduce the magnitude of impacts on agriculture and ecosystem productivity, respectively, and their performance varies by country or region. A hydropower capacity model matches reported hydropower generation anomalies only in some countries, and estimates of heat-related excess mortality from a set of statistical models are consistent with literature reports only for some of the cities investigated. We present a synthesis of simulated and observed impacts across sectors, and reflect on potential improvements in modeling and analyzing cross-sectoral impacts.

  4. Uncertainty analysis in Titan ionospheric simulated ion mass spectra: unveiling a set of issues for models accuracy improvement

    Science.gov (United States)

    Hébrard, Eric; Carrasco, Nathalie; Dobrijevic, Michel; Pernot, Pascal

    Ion Neutral Mass Spectrometer (INMS) aboard Cassini revealed a rich coupled ion-neutral chemistry in the ionosphere, producing heavy hydrocarbon and nitrile ions. The modeling of such a complex environment is challenging, as it requires a detailed and accurate description of the relevant processes, such as photodissociation cross sections and neutral-neutral reaction rates on one hand, and ionisation cross sections, ion-molecule and recombination reaction rates on the other. Underpinning model calculations, each of these processes is parameterized by kinetic constants which, when known, have been studied experimentally and/or theoretically over ranges of temperature and pressure that are most often not representative of Titan's atmosphere. The sizeable experimental and theoretical uncertainties reported in the literature therefore merge with the uncertainties resulting from the unavoidable estimations or extrapolations to Titan's atmospheric conditions. Such large overall uncertainties have to be accounted for in all resulting inferences, above all to evaluate the quality of the model definition. We have undertaken a systematic study of the uncertainty sources in the simulation of ion mass spectra as recorded by Cassini/INMS in Titan's ionosphere during the T5 flyby at 1200 km. Our simulated spectra seem much less affected by the uncertainties on ion-molecule reactions than by those on neutral-neutral reactions. Photochemical models of Titan's atmosphere are indeed poorly predictive at high altitudes, in the sense that their computed predictions display such large uncertainties that we found them to give rise to bimodal and hypersensitive abundance distributions for some major compounds like acetylene (C2H2) and ethylene (C2H4). We will show to what extent global uncertainty and sensitivity analysis enabled us to identify the causes of this bimodality and to pinpoint the key processes that mostly contribute to limit the accuracy of the

  5. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    Science.gov (United States)

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhances their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate for the first time the necessary joint simulation of prevalence values across the very large prediction spaces needed for global scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
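
    A minimal sketch of why the joint simulation discussed above matters for aggregated measures: draws from a spatially correlated field give the distribution of a regional mean, whereas treating pixels independently understates its spread. The covariance model and values are illustrative only, not the paper's approximating algorithm.

        import numpy as np

        rng = np.random.default_rng(8)
        coords = np.array([(i, j) for i in range(10) for j in range(10)])  # 100 pixels
        d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
        cov = 0.02 * np.exp(-d / 3.0)            # exponential spatial covariance

        mean_prev = np.full(100, 0.2)            # per-pixel posterior mean prevalence
        joint = rng.multivariate_normal(mean_prev, cov, size=5000).clip(0, 1)
        indep = rng.normal(mean_prev, np.sqrt(np.diag(cov)), (5000, 100)).clip(0, 1)

        for label, draws in [("joint", joint), ("independent", indep)]:
            regional = draws.mean(axis=1)        # aggregate over the whole region
            lo, hi = np.percentile(regional, [2.5, 97.5])
            print(f"{label:12s} 95% interval for regional mean: {lo:.3f}-{hi:.3f}")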

  6. Simulation and uncertainties of the heat transfer from a heat-generating DEBRIS bed in the lower plenum

    International Nuclear Information System (INIS)

    Schaaf, K.; Trambauer, K.

    1999-01-01

    The findings of the TMI-2 post-accident analyses indicated that internal cooling mechanisms may have a considerable potential to sustain the vessel integrity after a relocation of core material to the lower plenum, provided that water is continuously available in the RPV. Numerous analytical and experimental research activities are currently underway in this respect. This paper illustrates some major findings of the experimental work on internal cooling mechanisms and describes the limitations and the uncertainties in the simulation of the heat transfer processes. Reference is made especially to the joint German DEBRIS/RPV research program, which encompasses the experimental investigation of the thermal-hydraulics in gaps, of the heat transfer within a particulate debris bed, and of the high temperature performance of vessel steel, as well as the development of simulation models for the heat transfer in the lower head and the structural response of the RPV. In particular, the results of uncertainty and sensitivity analyses are presented, which have been carried out at GRS using an integral model that describes the major phenomena governing the long-term integrity of the reactor vessel. The investigation of a large-scale relocation indicated that the verification of a gap cooling mechanism as an inherent mechanism is questionable in terms of a stringent probabilistic uncertainty criterion, as long as the formation of a large molten pool cannot be excluded. (author)

  7. Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake

    Science.gov (United States)

    Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.

    2017-12-01

    SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g., high-resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved mega-thrust and a system of splay faults, as well as the seismic wave field and seafloor displacement, with frequency content up to 2.2 Hz. We validate the scenario against geodetic, seismological, and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.

  8. Validation/Uncertainty Quantification for Large Eddy Simulations of the heat flux in the Tangentially Fired Oxy-Coal Alstom Boiler Simulation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.J.; Eddings, E.G.; Ring, T.; Thornock, J.; Draper, T.; Isaac, B.; Rezeai, D.; Toth, P.; Wu, Y.; Kelly, K.

    2014-08-01

    The objective of this task is to produce predictive capability with quantified uncertainty bounds for the heat flux in commercial-scale, tangentially fired, oxy-coal boilers. Validation data came from the Alstom Boiler Simulation Facility (BSF) for tangentially fired, oxy-coal operation. This task brings together experimental data collected under Alstom’s DOE project for measuring oxy-firing performance parameters in the BSF with this University of Utah project for large eddy simulation (LES) and validation/uncertainty quantification (V/UQ). The Utah work includes V/UQ with measurements in the single-burner facility where advanced strategies for O2 injection can be more easily controlled and data more easily obtained. Highlights of the work include: • Simulations of Alstom’s 15 megawatt (MW) BSF, exploring the uncertainty in thermal boundary conditions. A V/UQ analysis showed consistency between experimental results and simulation results, identifying uncertainty bounds on the quantities of interest for this system (Subtask 9.1) • A simulation study of the University of Utah’s oxy-fuel combustor (OFC) focused on heat flux (Subtask 9.2). A V/UQ analysis was used to show consistency between experimental and simulation results. • Measurement of heat flux and temperature with new optical diagnostic techniques and comparison with conventional measurements (Subtask 9.3). Various optical diagnostics systems were created to provide experimental data to the simulation team. The final configuration utilized a mid-wave infrared (MWIR) camera to measure heat flux and temperature, which was synchronized with a high-speed, visible camera to utilize two-color pyrometry to measure temperature and soot concentration. • Collection of heat flux and temperature measurements in the University of Utah’s OFC for use is subtasks 9.2 and 9.3 (Subtask 9.4). Several replicates were carried to better assess the experimental error. Experiments were specifically designed for the

  9. Uncertainty in return period analysis of combined sewer overflow effects using embedded Monte Carlo simulations

    NARCIS (Netherlands)

    Grum, M.; Aalderink, R.H.

    1999-01-01

    The return periods of detrimental effects are often used as design criteria in urban storm water management. Considerable uncertainty is associated with the models used; this is either ignored or pooled with the inherent event-to-event variation, such as rainfall depth. It is here argued that

  10. Predicting the solubility of gases in Nitrile Butadiene Rubber in extreme conditions using molecular simulation

    Science.gov (United States)

    Khawaja, Musab; Molinari, Nicola; Sutton, Adrian; Mostofi, Arash

    In the oil and gas industry, elastomer seals play an important role in protecting sensitive monitoring equipment from contamination by gases - a problem that is exacerbated by the high pressures and temperatures found down-hole. The ability to predict and prevent such permeative failure has proved elusive to date. Nitrile butadiene rubber (NBR) is a common choice of elastomer for seals due to its resistance to heat and fuels. In the conditions found in the well it readily absorbs small molecular-weight gases. How this behaviour changes quantitatively for different gases as a function of temperature and pressure is not well understood. In this work a series of fully atomistic simulations is performed to understand the effect of extreme conditions on gas solubility in NBR. Widom particle insertion is used to compute solubilities. The importance of sampling and of allowing structural relaxation upon compression is highlighted, and qualitatively reasonable trends are reproduced. Finally, while at STP it has previously been shown that the solubility of CO2 in NBR is higher than that of He, we observe that under the right circumstances it is possible to reverse this trend.
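
    The Widom estimator itself is compact: the excess chemical potential is mu_ex = -kT ln <exp(-dU/kT)>, averaged over random trial insertions into stored configurations, and solubility scales as exp(-mu_ex/kT). The sketch below uses toy insertion energies as a stand-in for the real gas-polymer potential; only the estimator is faithful to the method.

```python
import numpy as np

rng = np.random.default_rng(0)
kT = 1.0  # reduced units

def insertion_energies(n_trials, crowding):
    # Toy stand-in for evaluating the gas-polymer potential at random trial
    # positions; higher "crowding" mimics compression (more overlaps, higher dU).
    return rng.exponential(scale=crowding, size=n_trials)

for label, crowding in [("low pressure", 0.5), ("high pressure", 2.0)]:
    dU = insertion_energies(100_000, crowding)
    boltzmann = np.exp(-dU / kT)
    mu_excess = -kT * np.log(boltzmann.mean())      # Widom estimator
    print(f"{label}: mu_excess = {mu_excess:.3f} kT, "
          f"solubility ~ exp(-mu_excess/kT) = {np.exp(-mu_excess / kT):.3f}")
```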

  11. 360°-View of Quantum Theory and Ab Initio Simulation at Extreme Conditions: 2014 Sanibel Symposium

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Hai-Ping [Univ. of Florida, Gainesville, FL (United States)]

    2016-09-02

    The Sanibel Symposium 2014 was held February 16-21, 2014, at the King and Prince, St. Simons Island, GA. It was successful in bringing condensed-matter physicists and quantum chemists together productively to drive the emergence of those specialties. The Symposium had a significant role in preparing a whole generation of quantum theorists. The 54th Sanibel meeting looked to the future in two ways. We had 360°-View sessions to honor the exceptional contributions of Rodney Bartlett (70), Bill Butler (70), Yngve Öhrn (80), Fritz Schaefer (70), and Malcolm Stocks (70). The work of these five has greatly impacted several generations of quantum chemists and condensed matter physicists. The “360°” is the sum of their ages. More significantly, it symbolizes a panoramic view of critical developments and accomplishments in theoretical and computational chemistry and physics oriented toward the future. Thus, two of the eight 360°-View sessions focused specifically on younger scientists. The 360°-View program was the major component of the 2014 Sanibel meeting. Another four sessions included a sub-symposium on ab initio Simulations at Extreme Conditions, with focus on getting past the barriers of present-day Born-Oppenheimer molecular dynamics by advances in finite-temperature density functional theory, orbital-free DFT, and new all-numerical approaches.

  12. Future changes in extreme precipitation in the Rhine basin based on global and regional climate model simulations

    NARCIS (Netherlands)

    Pelt, van S.C.; Beersma, J.J.; Buishand, T.A.; Hurk, van den B.J.J.M.; Kabat, P.

    2012-01-01

    Probability estimates of the future change of extreme precipitation events are usually based on a limited number of available global climate model (GCM) or regional climate model (RCM) simulations. Since floods are related to heavy precipitation events, this restricts the assessment of flood risks.

  13. Flood Simulations and Uncertainty Analysis for the Pearl River Basin Using the Coupled Land Surface and Hydrological Model System

    Directory of Open Access Journals (Sweden)

    Yongnan Zhu

    2017-06-01

    Full Text Available The performance of hydrological simulations for the Pearl River Basin in China was analysed using the Coupled Land Surface and Hydrological Model System (CLHMS). Three datasets, including East Asia (EA), the high-resolution gauge satellite-merged China Merged Precipitation Analysis (CMPA-Daily), and the Asian Precipitation Highly-Resolved Observational Data Integration Towards Evaluation (APHRODITE) daily precipitation, were used to drive the CLHMS model to simulate daily hydrological processes from 1998 to 2006. The results indicate that the precipitation data was the most important source of uncertainty in the hydrological simulation. The simulated streamflow driven by the CMPA-Daily agreed well with observations, with a Pearson correlation coefficient (PMC) greater than 0.70 and an index of agreement (IOA) similarity coefficient greater than 0.82 at Liuzhou, Shijiao, and Wuzhou Stations. Comparison of the Nash-Sutcliffe efficiency coefficient (NSE) shows that the peak flow simulation ability of CLHMS driven with the CMPA-Daily rainfall is relatively superior to that with the EA and APHRODITE datasets. The simulation results for the high-flow periods in 1998 and 2005 indicate that the CLHMS is promising for future application in flood simulation and prediction.
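
    For reference, the three skill scores quoted above are straightforward to compute. The sketch below uses made-up daily flows; nse and ioa implement the standard Nash-Sutcliffe efficiency and Willmott index-of-agreement formulas, and PMC is the ordinary Pearson correlation.

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is perfect, 0 means no better than the mean.
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def ioa(obs, sim):
    # Willmott index of agreement, bounded in [0, 1].
    denom = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - np.sum((obs - sim) ** 2) / denom

obs = np.array([1200., 1850., 2400., 3100., 2600., 1700.])   # made-up flows
sim = np.array([1100., 1900., 2250., 3300., 2500., 1650.])
print(f"PMC = {np.corrcoef(obs, sim)[0, 1]:.3f}, "
      f"IOA = {ioa(obs, sim):.3f}, NSE = {nse(obs, sim):.3f}")
```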

  14. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    CERN Document Server

    Genser, Krzysztof; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H

    2016-01-01

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
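
    The core workflow the toolkit automates can be sketched generically (this is not the toolkit's API, and toy_model below is a hypothetical stand-in for a Geant4 physics model): vary one model parameter over a range, rerun the simulation, and record the shift in the physics observable of interest.

```python
import numpy as np

rng = np.random.default_rng(2)

def toy_model(scale):
    # Hypothetical stand-in for a parameter-dependent physics model: returns
    # the mean "shower depth" observable over 10,000 pseudo-events, so the
    # Monte Carlo noise is small compared to the parameter-induced shifts.
    return rng.gamma(shape=3.0, scale=scale, size=10_000).mean()

nominal = 1.0
variants = [0.9, 0.95, 1.0, 1.05, 1.1]   # +/-10% parameter variations
results = {v: toy_model(v) for v in variants}
base = results[nominal]
for v, obs in results.items():
    print(f"parameter x{v:4.2f}: observable = {obs:6.3f} "
          f"(shift {100 * (obs - base) / base:+5.1f}%)")
```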

  15. Uncertainty Assessments of 2D and Axisymmetric Hypersonic Shock Wave - Turbulent Boundary Layer Interaction Simulations at Compression Corners

    Science.gov (United States)

    Gnoffo, Peter A.; Berry, Scott A.; VanNorman, John W.

    2011-01-01

    This paper is one of a series of five papers in a special session organized by the NASA Fundamental Aeronautics Program that addresses uncertainty assessments for CFD simulations in hypersonic flow. Simulations of a shock emanating from a compression corner and interacting with a fully developed turbulent boundary layer are evaluated herein. Mission-relevant conditions at Mach 7 and Mach 14 are defined for a pre-compression ramp of a scramjet-powered vehicle. Three compression angles are defined, the smallest to avoid separation losses and the largest to force a separated flow engaging more complicated flow physics. The Baldwin-Lomax and Cebeci-Smith algebraic models, the one-equation Spalart-Allmaras model with the Catris-Aupoix compressibility modification, and two-equation models including the Menter SST, Wilcox k-omega 98, and Wilcox k-omega 06 turbulence models are evaluated. Each model is fully defined herein to preclude any ambiguity regarding model implementation. Comparisons are made to existing experimental data and Van Driest theory to provide a preliminary assessment of model form uncertainty. A set of coarse-grained uncertainty metrics is defined to capture essential differences among turbulence models. Except for the inability of the algebraic models to converge for some separated flows, there is no clearly superior model as judged by these metrics. A preliminary metric for the numerical component of uncertainty in shock-turbulent-boundary-layer interactions at compression corners sufficiently steep to cause separation is defined as 55%. This value is a median of differences with experimental data, averaged for peak pressure and heating and for extent of separation, captured in the new, grid-converged solutions presented here. This value is consistent with existing results in a literature review of hypersonic shock-turbulent-boundary-layer interactions by Roy and Blottner and with more recent computations of MacLean.

  16. Nanomaterials under extreme environments: A study of structural and dynamic properties using reactive molecular dynamics simulations

    Science.gov (United States)

    Shekhar, Adarsh

    Nanotechnology is becoming increasingly important with the continuing advances in experimental techniques. As researchers around the world try to expand the current understanding of the behavior of materials at the atomistic scale, the limited resolution of equipment, both in time and space, acts as a roadblock to a comprehensive study. Numerical methods in general, and molecular dynamics in particular, act as an able complement to experiments in our quest to understand material behavior. In this research work, large-scale molecular dynamics simulations are performed to gain insight into the mechano-chemical behavior under extreme conditions of a variety of systems with many real-world applications. The body of this work is divided into three parts, each covering a particular system: 1) Aggregates of aluminum nanoparticles are good solid fuels due to high flame propagation rates. Multi-million-atom molecular dynamics simulations reveal the mechanism underlying the higher reaction rate in a chain of aluminum nanoparticles as compared to an isolated nanoparticle. This is due to the penetration of hot atoms from reacting nanoparticles into an adjacent, unreacted nanoparticle, which brings in external heat and initiates exothermic oxidation reactions. 2) Cavitation bubbles readily occur in fluids subjected to rapid changes in pressure. We use billion-atom reactive molecular dynamics simulations on a 163,840-processor BlueGene/P supercomputer to investigate chemical and mechanical damage caused by shock-induced collapse of nanobubbles in water near amorphous silica. Collapse of an empty nanobubble generates a high-speed nanojet, resulting in the formation of a pit on the surface. The pit contains a large number of silanol groups, and its volume is found to be directly proportional to the volume of the nanobubble. Gas-filled bubbles undergo partial collapse, and consequently the damage to the silica surface is mitigated. 3) The structure and dynamics of water confined in

  17. Combining historical eyewitness accounts on tsunami-induced waves and numerical simulations for getting insights in uncertainty of source parameters

    Science.gov (United States)

    Rohmer, Jeremy; Rousseau, Marie; Lemoine, Anne; Pedreros, Rodrigo; Lambert, Jerome; Benki, Aalae

    2017-04-01

    Recent tsunami events, including the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami, have caused many casualties and much damage to structures. Advances in numerical simulation of tsunami-induced wave processes have tremendously improved forecasting, hazard and risk assessment, and the design of early warning for tsunamis. Among the major challenges, several studies have underlined uncertainties in earthquake slip distributions and rupture processes as major contributors to tsunami wave height and inundation extent. Constraining these uncertainties can be performed by taking advantage of observations either of tsunami waves (using networks of water level gauges) or of inundation characteristics (using field evidence and eyewitness accounts). Despite these successful applications, combining tsunami observations and simulations still faces several limitations when the problem is addressed for past tsunami events like 1755 Lisbon. 1) While recent inversion studies can benefit from modern networks (e.g., tide gauges, sea bottom pressure gauges, GPS-mounted buoys), the number of tide gauges can be very scarce and testimonies on tsunami observations can be limited, incomplete and imprecise for past tsunami events. These observations are often restricted to eyewitness accounts of wave heights (e.g., maximum reached wave height at the coast) instead of the full observed waveforms; 2) Tsunami phenomena involve a large span of spatial scales (from ocean basin scales to local coastal wave interactions), which can make the modelling very demanding: the computation time of tsunami simulation can be prohibitive, often reaching several hours. This often limits the number of allowable long-running simulations for performing the inversion, especially when the problem is addressed from a Bayesian inference perspective. The objective of the present study is to overcome both afore-described difficulties with a view to combining historical observations of past tsunami-induced waves
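
    One common way to overcome the prohibitive cost of the solver in such inversions is a surrogate model (emulator) trained on a handful of full simulations. The sketch below, using made-up slip values and a toy response in place of the tsunami solver, fits a Gaussian-process emulator with scikit-learn and returns predictions with uncertainty at negligible cost.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(3)

# Hypothetical training set: a few expensive tsunami runs, each mapping a
# source parameter (fault slip, m) to simulated coastal wave height (m).
slip = rng.uniform(2.0, 12.0, size=12).reshape(-1, 1)
wave_height = (0.4 * slip.ravel() + 0.3 * np.sin(slip.ravel())
               + rng.normal(0, 0.05, 12))   # toy response, not a real solver

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(slip, wave_height)

# The emulator predicts, with uncertainty, in milliseconds instead of hours,
# so it can stand in for the solver inside a Bayesian inversion loop.
query = np.linspace(2, 12, 5).reshape(-1, 1)
mean, std = gp.predict(query, return_std=True)
for s, m, sd in zip(query.ravel(), mean, std):
    print(f"slip {s:5.2f} m -> height {m:5.2f} +/- {sd:4.2f} m")
```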

  18. Improving Simulations of Extreme Flows by Coupling a Physically-based Hydrologic Model with a Machine Learning Model

    Science.gov (United States)

    Mohammed, K.; Islam, A. S.; Khan, M. J. U.; Das, M. K.

    2017-12-01

    With the large number of hydrologic models presently available along with the global weather and geographic datasets, streamflows of almost any river in the world can be easily modeled. And if a reasonable amount of observed data from that river is available, then simulations of high accuracy can sometimes be performed after calibrating the model parameters against those observed data through inverse modeling. Although such calibrated models can succeed in simulating the general trend or mean of the observed flows very well, more often than not they fail to adequately simulate the extreme flows. This causes difficulty in tasks such as generating reliable projections of future changes in extreme flows due to climate change, which is obviously an important task due to floods and droughts being closely connected to people's lives and livelihoods. We propose an approach where the outputs of a physically-based hydrologic model are used as an input to a machine learning model to try and better simulate the extreme flows. To demonstrate this offline-coupling approach, the Soil and Water Assessment Tool (SWAT) was selected as the physically-based hydrologic model, the Artificial Neural Network (ANN) as the machine learning model and the Ganges-Brahmaputra-Meghna (GBM) river system as the study area. The GBM river system, located in South Asia, is the third largest in the world in terms of freshwater generated and forms the largest delta in the world. The flows of the GBM rivers were simulated separately in order to test the performance of this proposed approach in accurately simulating the extreme flows generated by different basins that vary in size, climate, hydrology and anthropogenic intervention on stream networks. Results show that by post-processing the simulated flows of the SWAT models with ANN models, simulations of extreme flows can be significantly improved. The mean absolute errors in simulating annual maximum/minimum daily flows were minimized from 4967
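
    A minimal sketch of the offline coupling described above, with synthetic data standing in for SWAT output and observations: a small scikit-learn neural network is trained to map the process-model flows (plus a one-step lag) to observed flows, and the correction is judged by the error in the simulated peak flow.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Toy stand-in data: "SWAT" flows that track the mean well but damp extremes.
observed = rng.gamma(shape=2.0, scale=500.0, size=2000)
swat = 0.7 * observed + 0.3 * observed.mean() + rng.normal(0, 50, observed.size)

# Train the ANN to map simulated flow (and a simple one-step lag) to
# observed flow; the first row is dropped because np.roll wraps around.
X = np.column_stack([swat, np.roll(swat, 1)])[1:]
y = observed[1:]
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
ann.fit(X[:1500], y[:1500])

corrected = ann.predict(X[1500:])
for name, series in [("SWAT alone", swat[1501:]), ("SWAT + ANN", corrected)]:
    err = abs(series.max() - y[1500:].max())
    print(f"{name}: error in peak flow = {err:.0f}")
```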

  19. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    Science.gov (United States)

    Genser, Krzysztof; Hatcher, Robert; Kelsey, Michael; Perdue, Gabriel; Wenzel, Hans; Wright, Dennis H.; Yarba, Julia

    2017-10-01

    The Geant4 simulation toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models rely on measured cross-sections and phenomenological models with the physically motivated parameters that are tuned to cover many application domains. To study what uncertainties are associated with the Geant4 physics models we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Based on modern event-processing infrastructure software, the toolkit offers a variety of attractive features, e.g. flexible run-time configurable workflow, comprehensive bookkeeping, easy to expand collection of analytical components. Design, implementation technology, and key functionalities of the toolkit are presented in this paper and illustrated with selected results.

  20. A simulation study of organizational decision making under conditions of uncertainty and ambiguity.

    OpenAIRE

    Athens, Arthur J.

    1983-01-01

    Approved for public release; distribution is unlimited. The usual frameworks applied to the analysis of military decision making describe the decision process according to the rational model. The assumptions inherent in this model, however, are not consistent with the reality of warfare's inherent uncertainty and complexity. A better model is needed to address the ambiguity actually confronting the combat commander. The garbage can model of organizational choice, a nonrational approach to...

  1. Evaluation of simulated corn yields and associated uncertainty in different climate zones of China using Daycent Model

    Science.gov (United States)

    Fu, A.; Xue, Y.

    2017-12-01

    Corn is one of the most important agricultural products in China. Research on the simulation of corn yields and the impacts of climate change and agricultural management practices on corn yields is important for maintaining stable corn production. After climatic data (daily temperature, precipitation, solar radiation, relative humidity, and wind speed from 1948 to 2010), soil properties, observed corn yields, and farmland management information were collected, corn yields grown in a humid and hot environment (Sichuan Province) and a cold and dry environment (Hebei Province) in China over the past 63 years were simulated by Daycent, and the results were evaluated against published yield records. The relationships between regional climate change, global warming and corn yields were analyzed, and the uncertainties of the simulation derived from agricultural management practices, explored by changing fertilization levels, land fertilizer maintenance and tillage methods, are reported. The results showed that: (1) the Daycent model is capable of simulating corn yields under the different climatic backgrounds in China; (2) when studying the relationship between regional climate change and corn yields, it was found that observed and simulated corn yields increased along with total regional climate change; (3) when studying the relationship between global warming and corn yields, it was discovered that corn yields simulated after removing the global warming trend from the original temperature data were lower than before.

  2. Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Witkowski, Walter R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-04-13

    The importance of credible, trustworthy numerical simulations is obvious, especially when using the results for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation and uncertainty quantification, which are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification and validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific application's intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner of performing such an assessment. Ideally, all stakeholders should be represented and contribute to an accurate credibility assessment. PIRTs and PCMMs are both described in brief detail below, and the resulting assessments for an example project are given.

  3. Northern peatland Collembola communities unaffected by three summers of simulated extreme precipitation

    NARCIS (Netherlands)

    Krab, E.J.; Aerts, R.; Berg, M.P.; van Hal, J.R.; Keuper, F.

    2014-01-01

    Extreme climate events are observed and predicted to increase in frequency and duration in high-latitude ecosystems as a result of global climate change. This includes extreme precipitation events, which may directly impact on belowground food webs and ecosystem functioning by their physical impacts

  4. 48th Annual meeting on nuclear technology (AMNT 2017). Key topic / Enhanced safety and operation excellence. Focus session: Uncertainty analyses in reactor core simulations

    Energy Technology Data Exchange (ETDEWEB)

    Zwermann, Winfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany). Forschungszentrum]

    2017-12-15

    The supplementation of reactor simulations by uncertainty analyses is becoming increasingly important internationally, because quantifying uncertainties can significantly increase the reliability of simulation calculations in comparison with the use of so-called conservative methods (BEPU, "Best-Estimate plus Uncertainties"). While systematic uncertainty analyses for thermo-hydraulic calculations have been performed routinely for a long time, methods for taking into account uncertainties in nuclear data, which are the basis for neutron transport calculations, are under development. The Focus Session Uncertainty Analyses in Reactor Core Simulations was intended to provide an overview of international research and development on supplementing reactor core simulations with uncertainty and sensitivity analyses, in research institutes as well as within the nuclear industry. The presented analyses focused not only on light water reactors, but also on advanced reactor systems. Particular emphasis was put on international benchmarks in the field. The session was chaired by Winfried Zwermann (Gesellschaft fuer Anlagen- und Reaktorsicherheit).

  5. Uncertainty and sensitivity analysis in building performance simulation for decision support and design optimization

    NARCIS (Netherlands)

    Hopfe, C.J.

    2009-01-01

    Building performance simulation (BPS) uses computer-based models that cover performance aspects such as energy consumption and thermal comfort in buildings. The uptake of BPS in current building design projects is limited. Although there is a large number of building simulation tools available, the

  6. Return period estimates of extreme sea level along the east coast of India from numerical simulations

    Digital Repository Service at National Institute of Oceanography (India)

    Sindhu, B.; Unnikrishnan, A.S.

    The simulated total sea level and the surge component were obtained for each event. The simulated peak levels showed good agreement with the observations available at a few stations. The annual maxima of sea levels, extracted from the simulations, were fitted...

  7. The Parallel C++ Statistical Library ‘QUESO’: Quantification of Uncertainty for Estimation, Simulation and Optimization

    KAUST Repository

    Prudencio, Ernesto E.

    2012-01-01

    QUESO is a collection of statistical algorithms and programming constructs supporting research into the uncertainty quantification (UQ) of models and their predictions. It has been designed with three objectives: it should (a) be sufficiently abstract in order to handle a large spectrum of models, (b) be algorithmically extensible, allowing an easy insertion of new and improved algorithms, and (c) take advantage of parallel computing, in order to handle realistic models. Such objectives demand a combination of an object-oriented design with robust software engineering practices. QUESO is written in C++, uses MPI, and leverages libraries already available to the scientific community. We describe some UQ concepts, present QUESO, and list planned enhancements.

  8. Merging Methods to Manage Uncertainty: Combining Simulation Modeling and Scenario Planning to Inform Resource Management Under Climate Change

    Science.gov (United States)

    Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.

    2017-12-01

    Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is 'actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the

  9. Uncertainty versus variability in Monte Carlo simulations of human exposure through food pathways

    International Nuclear Information System (INIS)

    McKone, T.E.

    1994-01-01

    An important issue in both the risk characterization and subsequent risk management of contaminated soil is how precisely we can characterize the distribution among individuals of potential doses associated with chemical contaminants in soil, and whether this level of precision favors the use of population distributions of exposure over the use of single scenario representations. For lipophilic contaminants, such as dioxins, furans, polychlorinated biphenyls, pesticides, and for metals such as lead and mercury, exposures through food have been demonstrated to be dominant contributors to total dose within non-occupationally exposed populations. However, overall uncertainties in estimating potential doses through food chains are much larger than uncertainties associated with other exposure pathways. A general model is described here for estimating the ratio of potential dose to contaminant concentration in soil for homegrown foods contaminated by lipophilic, nonionic organic chemicals. This model includes parameters describing homegrown food consumption rates, exposure duration, biotransfer factors, and partition factors. For the parameters needed in this model, the mean and variance are often the only moments of the parameter distribution available. Parameters are divided into three categories: uncertain parameters, variable parameters, and mixed uncertain/variable parameters. Using soils contaminated by hexachlorobenzene (HCB) and benzo(a)pyrene (BaP) as case studies, a stepwise Monte Carlo analysis is used to develop a histogram that apportions variance in the outcome (ratio of potential dose by food pathways to soil concentration) to variance in each of the three input categories. The results represent potential doses in households consuming homegrown foods
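
    The stepwise variance apportionment can be illustrated with a toy version of the dose-to-concentration ratio (all distributions and parameter values below are invented for illustration): freeze one input category at a nominal value, re-run the Monte Carlo, and attribute the drop in output variance to that category.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

# Toy dose-to-concentration ratio: product of a consumption rate (variable
# across individuals), a biotransfer factor (uncertain), and a partition
# factor (mixed). All distributions are hypothetical.
def sample(fix_uncertain=False, fix_variable=False):
    intake = 0.3 if fix_variable else rng.lognormal(-1.2, 0.5, n)   # kg/day
    btf = 0.05 if fix_uncertain else rng.lognormal(-3.0, 0.8, n)    # d/kg
    kp = rng.lognormal(0.0, 0.3, n)                                  # unitless
    return intake * btf * kp

total_var = np.var(np.log(sample()))
for label, kwargs in [("uncertainty (biotransfer)", dict(fix_uncertain=True)),
                      ("variability (consumption)", dict(fix_variable=True))]:
    reduced = np.var(np.log(sample(**kwargs)))
    print(f"freezing {label}: log-variance drops "
          f"{100 * (1 - reduced / total_var):.0f}%")
```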

  10. Stochastic reservoir simulation for the modeling of uncertainty in coal seam degasification

    Science.gov (United States)

    Karacan, C. Özgen; Olea, Ricardo A.

    2015-01-01

    Coal seam degasification improves coal mine safety by reducing the gas content of coal seams and also by generating added value as an energy source. Coal seam reservoir simulation is one of the most effective ways to help with these two main objectives. As in all modeling and simulation studies, how the reservoir is defined and whether observed productions can be predicted are important considerations.

  11. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    related to the numerical structuring of a problem, such as cell size, domain extent, and system orientation. Depth of penetration of a threat into a... system in the simulation codes is tied to the domain structure, with coordinate axes aligned with cell edges. However, the position of the coordinate... physical systems are generally described by sets of equations involving continuous variables, such as time and position. Computational simulations

  12. Simulation and Measurement of Through-the-Earth, Extremely Low-Frequency Signals Using Copper-Clad Steel Ground Rods

    OpenAIRE

    Damiano, Nicholas William; Yan, Lincan; Whisner, Bruce; Zhou, Chenming

    2017-01-01

    The underground mining environment can greatly affect radio signal propagation. Understanding how the earth affects signal propagation is a key to evaluating communications systems used during a mine emergency. One type of communication system is through-the-earth, which can utilize extremely low frequencies (ELF). This paper presents the simulation and measurement results of recent National Institute for Occupational Safety and Health (NIOSH) research aimed at investigating current injection...

  13. Assessment of the Weather Research and Forecasting (WRF) model for simulation of extreme rainfall events in the upper Ganga Basin

    Science.gov (United States)

    Chawla, Ila; Osuri, Krishna K.; Mujumdar, Pradeep P.; Niyogi, Dev

    2018-02-01

    Reliable estimates of extreme rainfall events are necessary for an accurate prediction of floods. Most of the global rainfall products are available at a coarse resolution, rendering them less desirable for extreme rainfall analysis. Therefore, regional mesoscale models such as the advanced research version of the Weather Research and Forecasting (WRF) model are often used to provide rainfall estimates at fine grid spacing. Modelling heavy rainfall events is an enduring challenge, as such events depend on multi-scale interactions and on model configuration choices such as grid spacing, physical parameterization and initialization. With this background, the WRF model is implemented in this study to investigate the impact of different processes on extreme rainfall simulation, by considering a representative event that occurred during 15-18 June 2013 over the Ganga Basin in India, which is located at the foothills of the Himalayas. This event is simulated with ensembles involving four different microphysics (MP) schemes, two cumulus (CU) parameterizations, two planetary boundary layers (PBLs) and two land surface physics options, as well as different resolutions (grid spacing) within the WRF model. The simulated rainfall is evaluated against the observations from 18 rain gauges and the Tropical Rainfall Measuring Mission Multi-Satellite Precipitation Analysis (TMPA) 3B42RT version 7 data. The analysis shows that the choice of MP scheme influences the spatial pattern of rainfall, while the choice of PBL and CU parameterizations influences the magnitude of rainfall in the model simulations. Further, the WRF run with the Goddard MP, Mellor-Yamada-Janjic PBL and Betts-Miller-Janjic CU scheme is found to perform best in simulating this heavy rain event. The selected configuration is evaluated for several heavy to extremely heavy rainfall events that occurred across different months of the monsoon season in the region. The model performance improved through incorporation

  14. Probing dark energy models with extreme pairwise velocities of galaxy clusters from the DEUS-FUR simulations

    Science.gov (United States)

    Bouillot, Vincent R.; Alimi, Jean-Michel; Corasaniti, Pier-Stefano; Rasera, Yann

    2015-06-01

    Observations of colliding galaxy clusters with high relative velocity probe the tail of the halo pairwise velocity distribution, with the potential of providing a powerful test of cosmology. As an example, it has been argued that the discovery of the Bullet Cluster challenges standard Λ cold dark matter (ΛCDM) model predictions. Halo catalogues from N-body simulations have been used to estimate the probability of Bullet-like clusters. However, due to simulation volume effects, previous studies had to rely on a Gaussian extrapolation of the pairwise velocity distribution to high velocities. Here, we perform a detailed analysis using the halo catalogues from the Dark Energy Universe Simulation Full Universe Runs (DEUS-FUR), which enables us to resolve the high-velocity tail of the distribution and study its dependence on the halo mass definition, redshift and cosmology. Building upon these results, we estimate the probability of Bullet-like systems in the framework of Extreme Value Statistics. We show that the tail of extreme pairwise velocities significantly deviates from a Gaussian; moreover, it carries an imprint of the underlying cosmology. We find the Bullet Cluster probability to be two orders of magnitude larger than previous estimates, thus easing the tension with the ΛCDM model. Finally, the comparison of the inferred probabilities for the different DEUS-FUR cosmologies suggests that observations of extreme interacting clusters can provide constraints on dark energy models complementary to standard cosmological tests.

  15. Simulation of extreme rainfall event of November 2009 over Jeddah, Saudi Arabia: the explicit role of topography and surface heating

    Science.gov (United States)

    Almazroui, Mansour; Raju, P. V. S.; Yusef, A.; Hussein, M. A. A.; Omar, M.

    2018-04-01

    In this paper, a nonhydrostatic Weather Research and Forecasting (WRF) model has been used to simulate the extreme precipitation event of 25 November 2009 over Jeddah, Saudi Arabia. The model is integrated in three nested (27, 9, and 3 km) domains with the initial and boundary forcing derived from the NCEP reanalysis datasets. As a control experiment, the model was integrated for 48 h, initialized at 0000 UTC on 24 November 2009. The simulated rainfall in the control experiment is in good agreement with Tropical Rainfall Measuring Mission rainfall estimates in terms of intensity as well as spatio-temporal distribution. Results indicate that a strong low-level (850 hPa) wind over Jeddah and surrounding regions enhanced the moisture and temperature gradients and created a conditionally unstable atmosphere that favored the development of the mesoscale system. To investigate the influence of topography and surface heat exchange on the development of the extreme precipitation event, two sensitivity experiments were carried out: one without topography and another without exchange of surface heating to the atmosphere. The results show that both surface heating and topography played a crucial role in determining the spatial distribution and intensity of the extreme rainfall over Jeddah. The topography favored enhanced uplift motion that further strengthened the low-level jet and hence the rainfall over Jeddah and adjacent areas. On the other hand, the absence of surface heating considerably reduced the simulated rainfall, by 30% as compared to the observations.

  16. Coupled large-eddy simulation and morphodynamics of a large-scale river under extreme flood conditions

    Science.gov (United States)

    Khosronejad, Ali; Sotiropoulos, Fotis; Stony Brook University Team

    2016-11-01

    We present coupled flow and morphodynamic simulations of extreme flooding in a 3 km long and 300 m wide reach of the Mississippi River in Minnesota, which includes three islands and hydraulic structures. We employ the large-eddy simulation (LES) and bed-morphodynamic modules of the VFS-Geophysics model to investigate the flow and bed evolution of the river during a 500-year flood. The coupling of the two modules is carried out via a fluid-structure interaction approach, using a nested domain to enhance the resolution of bridge scour predictions. The geometrical data of the river, islands and structures are obtained from LiDAR, sub-aqueous sonar and in-situ surveying to construct a digital map of the river bathymetry. Our simulation results for the bed evolution of the river reveal complex sediment dynamics near the hydraulic structures. The numerically captured scour depth near some of the structures reaches a maximum of about 10 m. The data-driven simulation strategy we present in this work exemplifies a practical simulation-based engineering approach to investigate the resilience of infrastructure to extreme flood events in intricate field-scale riverine systems. This work was funded by a grant from the Minnesota Dept. of Transportation.

  17. Estimation of numerical uncertainty in computational fluid dynamics simulations of a passively controlled wave energy converter

    DEFF Research Database (Denmark)

    Wang, Weizhi; Wu, Minghao; Palm, Johannes

    2018-01-01

    The wave loads and the resulting motions of floating wave energy converters are traditionally computed using linear radiation–diffraction methods. Yet for certain cases such as survival conditions, phase control and wave energy converters operating in the resonance region, more complete... dynamics simulations have largely been overlooked in the wave energy sector. In this article, we apply formal verification and validation techniques to computational fluid dynamics simulations of a passively controlled point absorber. The phase control causes the motion response to be highly nonlinear even for almost linear incident waves. First, we show that the computational fluid dynamics simulations have acceptable agreement to experimental data. We then present a verification and validation study focusing on the solution verification covering spatial and temporal discretization, iterative and domain...

  18. Simulating range-wide population and breeding habitat dynamics for an endangered woodland warbler in the face of uncertainty

    Science.gov (United States)

    Duarte, Adam; Hatfield, Jeffrey; Swannack, Todd M.; Forstner, Michael R. J.; Green, M. Clay; Weckerly, Floyd W.

    2015-01-01

    Population viability analyses provide a quantitative approach that seeks to predict the possible future status of a species of interest under different scenarios and, therefore, can be important components of large-scale species’ conservation programs. We created a model and simulated range-wide population and breeding habitat dynamics for an endangered woodland warbler, the golden-cheeked warbler (Setophaga chrysoparia). Habitat-transition probabilities were estimated across the warbler's breeding range by combining National Land Cover Database imagery with multistate modeling. Using these estimates, along with recently published demographic estimates, we examined whether the species can remain viable into the future given the current conditions. Lastly, we evaluated whether protecting a greater amount of habitat would increase the number of warblers that can be supported in the future, by systematically increasing the amount of protected habitat and comparing the estimated terminal carrying capacity at the end of 50 years of simulated habitat change. The estimated habitat-transition probabilities supported the hypothesis that habitat transitions are unidirectional, whereby habitat is more likely to diminish than regenerate. The model results indicated population viability could be achieved under current conditions, depending on dispersal. However, there is considerable uncertainty associated with the population projections due to parametric uncertainty. Model results suggested that increasing the amount of protected lands would have a substantial impact on terminal carrying capacities at the end of a 50-year simulation. Notably, this study identifies the need for collecting the data required to estimate demographic parameters in relation to changes in habitat metrics and population density in multiple regions, and highlights the importance of establishing a common definition of what constitutes protected habitat, what management goals are suitable within those protected

  19. Reduction methods and uncertainty analysis: application to a Chemistry-Transport Model for modeling and simulation of impacts

    International Nuclear Information System (INIS)

    Boutahar, Jaouad

    2004-01-01

    In an integrated impact assessment, one has to test several scenarios of the model inputs and/or to identify the effects of model input uncertainties on the model outputs. In both cases, a large number of simulations of the model is necessary. That is of course not feasible with a comprehensive Chemistry-Transport Model, due to the huge CPU times required. Two approaches may be used in order to circumvent these difficulties. The first approach consists in reducing the computational cost of the original model by building a reduced model. Two reduction techniques are used: the first, POD, is related to the statistical behaviour of the system and is based on a proper orthogonal decomposition of the solutions; the second is an efficient representation of the input/output behaviour through look-up tables. It describes the model output as an expansion of finite hierarchical correlated functions in terms of the input variables. The second approach is based on reducing the number of model runs required by standard Monte Carlo methods. It characterizes the probabilistic response of the uncertain model output as an expansion of orthogonal polynomials in terms of the model input uncertainties. Then classical Monte Carlo simulation can easily be used to compute the probability density of the uncertain output. Another key point in an integrated impact assessment is to develop strategies for the reduction of emissions by computing Source/Receptor matrices for several years of simulations. We propose here an efficient method to calculate these matrices by using the adjoint model and in particular by defining the 'representative chemical day'. All of these methods are applied to POLAIR3D, a Chemistry-Transport model developed in this thesis. (author) [fr
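
    A minimal sketch of the second approach (a polynomial chaos surrogate) for a single standard-normal input, using probabilists' Hermite polynomials from NumPy. The toy model is invented, but the regression-based fit and the cheap Monte Carlo propagation through the surrogate follow the scheme described above.

```python
import numpy as np
from numpy.polynomial import hermite_e as He

rng = np.random.default_rng(6)

# Toy model output as a function of one standard-normal uncertain input xi.
model = lambda xi: np.exp(0.3 * xi) + 0.1 * xi**2

# Fit a degree-4 Hermite (probabilists') expansion by regression on samples;
# np.eye(5)[k] is the coefficient vector selecting basis polynomial He_k.
xi = rng.standard_normal(400)
V = np.column_stack([He.hermeval(xi, np.eye(5)[k]) for k in range(5)])
coeffs, *_ = np.linalg.lstsq(V, model(xi), rcond=None)

# The surrogate is now cheap: propagate a large Monte Carlo sample through it.
xi_big = rng.standard_normal(1_000_000)
surrogate = He.hermeval(xi_big, coeffs)
print(f"surrogate mean {surrogate.mean():.4f}, std {surrogate.std():.4f}")
print(f"direct    mean {model(xi_big).mean():.4f}, std {model(xi_big).std():.4f}")
```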

  20. GLOBAL RANDOM WALK SIMULATIONS FOR SENSITIVITY AND UNCERTAINTY ANALYSIS OF PASSIVE TRANSPORT MODELS

    Directory of Open Access Journals (Sweden)

    Nicolae Suciu

    2011-07-01

    Full Text Available The Global Random Walk algorithm (GRW) performs a simultaneous tracking on a fixed grid of huge numbers of particles at costs comparable to those of a single-trajectory simulation by the traditional Particle Tracking (PT) approach. Statistical ensembles of GRW simulations of a typical advection-dispersion process in groundwater systems with randomly distributed spatial parameters are used to obtain reliable estimations of the input parameters for the upscaled transport model and of their correlations, input-output correlations, as well as full probability distributions of the input and output parameters.
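
    The essence of the GRW idea - moving binomially split particle counts between grid cells rather than tracking individual trajectories - fits in a short sketch. The jump probabilities below are made-up values for a dispersion-only toy problem; ten million particles cost little more than one.

```python
import numpy as np

rng = np.random.default_rng(7)

# Grid of particle counts instead of individual trajectories.
nx, steps = 200, 400
n = np.zeros(nx, dtype=np.int64)
n[nx // 2] = 10_000_000             # ten million particles in one cell

p_left, p_stay, p_right = 0.25, 0.5, 0.25   # toy dispersion, no advection
for _ in range(steps):
    # Multinomial split of each cell's count into left/stay/right movers.
    left = rng.binomial(n, p_left)
    right = rng.binomial(n - left, p_right / (1.0 - p_left))
    n = n - left - right            # the stayers
    n[:-1] += left[1:]              # shift the left-movers
    n[1:] += right[:-1]             # shift the right-movers

x = np.arange(nx) - nx // 2
mean = (x * n).sum() / n.sum()
var = ((x - mean) ** 2 * n).sum() / n.sum()
print(f"plume mean {mean:.2f}, variance {var:.1f} (theory {0.5 * steps:.1f})")
```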

  1. Research on Multi Hydrological Models Applicability and Modelling Data Uncertainty Analysis for Flash Flood Simulation in Hilly Area

    Science.gov (United States)

    Ye, L.; Wu, J.; Wang, L.; Song, T.; Ji, R.

    2017-12-01

    Flooding in small-scale watersheds in hilly areas is characterized by short time periods and rapid rise and recession, due to complex underlying surfaces, various climate types and the strong effect of human activities. It is almost impossible for a single hydrological model to describe the variation of flooding in both time and space accurately for all the catchments in hilly areas, because hydrological characteristics can vary significantly among catchments. In this study, we compare the performance of 5 hydrological models of varying degrees of complexity in simulating flash floods for 14 small-scale watersheds in China, in order to find the relationship between the applicability of the hydrological models and the catchment characteristics. Meanwhile, given that hydrological data are sparse in hilly areas, the effects of precipitation data, DEM resolution and their interaction on the uncertainty of flood simulation are also illustrated. In general, the results showed that the distributed hydrological model (HEC-HMS in this study) performed better than the lumped hydrological models. The Xinanjiang and API models performed well for the humid catchments when long-term and continuous rainfall data are provided. The Dahuofang model simulates the flood peak well, while its runoff generation module is relatively poor. In addition, the effects of the different modelling data on the simulations are not simply superposed; there is a complex interaction effect among the different modelling data. Overall, both the catchment hydrological characteristics and the modelling data situation should be taken into consideration in order to choose a suitable hydrological model for flood simulation for small-scale catchments in hilly areas.

  2. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change approach applied at the grid scale. Flux and state hydrological outputs, which integrate responses over time and space, showed more sensitivity to precipitation mean spatial biases and less to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  3. Confronting uncertainty in model-based geostatistics using Markov Chain Monte Carlo simulation

    NARCIS (Netherlands)

    Minasny, B.; Vrugt, J.A.; McBratney, A.B.

    2011-01-01

    This paper demonstrates for the first time the use of Markov Chain Monte Carlo (MCMC) simulation for parameter inference in model-based soil geostatistics. We implemented the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to jointly summarize the posterior
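
    For readers unfamiliar with the machinery, the sketch below shows the skeleton that DREAM builds on: a plain single-chain random-walk Metropolis sampler with a deliberately simple synthetic likelihood. DREAM adds adaptive, multi-chain differential-evolution proposals; nothing here is the DREAM algorithm itself, and all data and priors are invented.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic "observations" with known noise, standing in for a geostatistical
# likelihood; theta is the single unknown parameter to be inferred.
data = rng.normal(2.0, 0.5, size=30)

def log_post(theta):
    if theta <= 0:
        return -np.inf                      # flat prior on theta > 0
    return -0.5 * np.sum((data - theta) ** 2 / 0.25)   # Gaussian, sigma = 0.5

theta, lp, chain = 1.0, log_post(1.0), []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.1)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop           # Metropolis accept
    chain.append(theta)

burned = np.array(chain[5000:])             # discard burn-in
print(f"posterior mean {burned.mean():.3f}, 95% CI "
      f"[{np.percentile(burned, 2.5):.3f}, {np.percentile(burned, 97.5):.3f}]")
```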

  4. Improving Chemical EOR Simulations and Reducing the Subsurface Uncertainty Using Downscaling Conditioned to Tracer Data

    KAUST Repository

    Torrealba, Victor A.; Hoteit, Hussein; Chawathe, Adwait

    2017-01-01

    and thermodynamic phase split, the impact of grid downscaling on CEOR simulations is not well understood. In this work, we introduce a geostatistical downscaling method conditioned to tracer data to refine a coarse history-matched WF model. This downscaling process

  5. Treatment of input uncertainty in hydrologic modeling: Doing hydrology backward with Markov chain Monte Carlo simulation

    NARCIS (Netherlands)

    Vrugt, J.A.; Braak, ter C.J.F.; Clark, M.P.; Hyman, J.M.; Robinson, B.A.

    2008-01-01

    There is increasing consensus in the hydrologic literature that an appropriate framework for streamflow forecasting and simulation should include explicit recognition of forcing and parameter and model structural error. This paper presents a novel Markov chain Monte Carlo (MCMC) sampler, entitled

  6. Structural Uncertainty in Model-Simulated Trends of Global Gross Primary Production

    Directory of Open Access Journals (Sweden)

    Zaichun Zhu

    2013-03-01

    Full Text Available Projected changes in the frequency and severity of droughts as a result of increases in greenhouse gases have a significant impact on the role of vegetation in regulating the global carbon cycle. The drought effect on vegetation Gross Primary Production (GPP) is usually modeled as a function of Vapor Pressure Deficit (VPD) and/or soil moisture. Climate projections suggest a strong likelihood of an increasing trend in VPD, while regional changes in precipitation are less certain. This difference in projections between VPD and precipitation can cause considerable discrepancies in the predictions of vegetation behavior, depending on how ecosystem models represent the drought effect. In this study, we scrutinized the model responses to drought using the 30-year record of the Global Inventory Modeling and Mapping Studies (GIMMS) 3g Normalized Difference Vegetation Index (NDVI) dataset. A diagnostic ecosystem model, the Terrestrial Observation and Prediction System (TOPS), was used to estimate global GPP from 1982 to 2009 under nine different experimental simulations. The control run of global GPP increased until 2000, but stayed constant after 2000. Among the simulations with a single climate constraint (temperature, VPD, rainfall or solar radiation), only the VPD-driven simulation showed a decrease in the 2000s, while the other scenarios simulated an increase in GPP. The diverging responses in the 2000s can be attributed to the difference in the representation of the impact of water stress on vegetation in models, i.e., using VPD and/or precipitation. The spatial map of trends in simulated GPP using GIMMS 3g data is more consistent with the GPP driven by soil moisture than with the GPP driven by VPD, confirming the need for a soil moisture constraint in modeling global GPP.

  7. Uncertainty Quantification Analysis of Both Experimental and CFD Simulation Data of a Bench-scale Fluidized Bed Gasifier

    Energy Technology Data Exchange (ETDEWEB)

    Shahnam, Mehrdad [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate]; Gel, Aytekin [ALPEMI Consulting, LLC, Phoenix, AZ (United States)]; Subramaniyan, Arun K. [GE Global Research Center, Niskayuna, NY (United States)]; Musser, Jordan [National Energy Technology Lab. (NETL), Morgantown, WV (United States). Research and Innovation Center, Energy Conversion Engineering Directorate]; Dietiker, Jean-Francois [West Virginia Univ. Research Corporation, Morgantown, WV (United States)]

    2017-10-02

    Adequate assessment of the uncertainties in modeling and simulation is becoming an integral part of simulation-based engineering design. The goal of this study is to demonstrate the application of a non-intrusive Bayesian uncertainty quantification (UQ) methodology in multiphase (gas-solid) flows with experimental and simulation data, as part of our research efforts to determine the approach best suited for UQ of a bench-scale fluidized bed gasifier. UQ analysis was first performed on the available experimental data. A global sensitivity analysis performed as part of the UQ analysis shows that, among the three operating factors, the steam-to-oxygen ratio has the most influence on syngas composition in the bench-scale gasifier experiments. An analysis for forward propagation of uncertainties was performed, and the results show that an increase in the steam-to-oxygen ratio leads to an increase in H2 mole fraction and a decrease in CO mole fraction. These findings are in agreement with the ANOVA analysis performed in the reference experimental study. Another contribution, in addition to the UQ analysis, is an optimization-based approach to identify the next best set of experimental samples, should the possibility of additional experiments arise. Hence, the surrogate models constructed as part of the UQ analysis are employed to improve the information gain and make incremental recommendations. In the second step, a series of simulations was carried out with the open-source computational fluid dynamics software MFiX to reproduce the experimental conditions, where three operating factors, i.e., coal flow rate, coal particle diameter, and steam-to-oxygen ratio, were systematically varied to understand their effect on the syngas composition. Bayesian UQ analysis was performed on the numerical results. As part of the Bayesian UQ analysis, a global sensitivity analysis was performed based on the simulation results, which shows
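
    The global sensitivity analysis step can be sketched with a variance-based (Sobol) first-order estimator. The toy response below is invented to mimic the reported dominance of the steam-to-oxygen ratio, and the estimator shown is the standard Saltelli formulation, not the Bayesian machinery used in the study.

```python
import numpy as np

rng = np.random.default_rng(9)

def model(x):
    # Toy gasifier response: H2 fraction dominated by steam/O2 ratio (x[:, 2]).
    return 0.1 * x[:, 0] + 0.2 * x[:, 1] + 0.9 * x[:, 2] + 0.1 * x[:, 1] * x[:, 2]

n, d = 100_000, 3             # factors: coal rate, particle diameter, steam/O2
A, B = rng.uniform(size=(n, d)), rng.uniform(size=(n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

names = ["coal flow rate", "particle diameter", "steam/O2 ratio"]
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]       # swap column i only
    Si = np.mean(fB * (model(ABi) - fA)) / var     # Saltelli (2010) estimator
    print(f"first-order index of {names[i]}: {Si:.2f}")
```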

  8. Addressing the mischaracterization of extreme rainfall in regional climate model simulations - A synoptic pattern based bias correction approach

    Science.gov (United States)

    Li, Jingwan; Sharma, Ashish; Evans, Jason; Johnson, Fiona

    2018-01-01

    Addressing systematic biases in regional climate model simulations of extreme rainfall is a necessary first step before assessing changes in future rainfall extremes. Commonly used bias correction methods are designed to match the statistics of the overall simulated rainfall with observations. This assumes that a change in the mix of different types of extreme rainfall events (i.e. convective and non-convective) in a warmer climate is of little relevance to the estimation of overall change, an assumption that is not supported by empirical or physical evidence. This study proposes an alternative approach to account for the potential change of alternate rainfall types, characterized here by synoptic weather patterns (SPs) identified using self-organizing map classification. The objective of this study is to evaluate the added influence of SPs on the bias correction, which is achieved by comparing the corrected distribution of future extreme rainfall with that obtained using conventional quantile mapping. A comprehensive synthetic experiment is first defined to investigate the conditions under which the additional information of SPs makes a significant difference to the bias correction. Using over 600,000 synthetic cases, statistically significant differences are found to be present in 46% of cases. This is followed by a case study over the Sydney region using a high-resolution run of the Weather Research and Forecasting (WRF) regional climate model, which indicates a small change in the proportions of the SPs and a statistically significant change in the extreme rainfall over the region, although the differences between the changes obtained from the two bias correction methods are not statistically significant.
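
    The difference between pooled and SP-conditioned correction can be seen in a toy sketch. All distributions and biases below are invented, and the method shown is plain empirical quantile mapping applied separately per SP class, not the paper's full procedure.

```python
import numpy as np

rng = np.random.default_rng(10)

def quantile_map(value, obs_ref, model_ref):
    # Map a model value to the observed value at the same empirical quantile.
    q = np.searchsorted(np.sort(model_ref), value) / len(model_ref)
    return np.quantile(obs_ref, np.clip(q, 0.0, 1.0))

# Toy historical rainfall split by synoptic pattern (SP): convective events
# (SP 1) carry a different model bias than stratiform ones (SP 0).
sp = rng.integers(0, 2, 5000)
obs = np.where(sp == 1, rng.gamma(2, 20, 5000), rng.gamma(2, 8, 5000))
mod = np.where(sp == 1, 0.7 * obs + 5, 1.2 * obs)   # pattern-dependent bias

future_sp = rng.integers(0, 2, 1000)
future_mod = np.where(future_sp == 1, rng.gamma(2, 16, 1000),
                      rng.gamma(2, 10, 1000))

# Correct each future value against the reference pair for its own SP class,
# instead of pooling all events into a single transfer function.
corrected = np.array([quantile_map(v, obs[sp == s], mod[sp == s])
                      for v, s in zip(future_mod, future_sp)])
print(f"raw 99th pct: {np.quantile(future_mod, 0.99):.1f}, "
      f"SP-conditioned corrected: {np.quantile(corrected, 0.99):.1f}")
```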

  9. A Numerical Approach for Hybrid Simulation of Power System Dynamics Considering Extreme Icing Events

    DEFF Research Database (Denmark)

    Chen, Lizheng; Zhang, Hengxu; Wu, Qiuwei

    2017-01-01

    A hybrid numerical simulation scheme integrating icing weather events with power system dynamics is proposed to extend power system numerical simulation. A technique is developed to efficiently simulate the interaction of the slow dynamics of weather events and the fast dynamics of power systems. An extended package for PSS

  10. Accreditation of a system of extremity dosimetry: validation and uncertainty of method; Acreditacion de un sistema de dosimetria de extremidades: validacion e incertidumbre del metodo

    Energy Technology Data Exchange (ETDEWEB)

    Romero Gutierrez, A. M.; Rodriguez Jimenez, R.; Lopez Moyano, J. L.

    2013-07-01

    The authors' goal is to share the practical experience gained during the accreditation process, paying special attention to the method validation process and the estimation of the uncertainty of the dosimetry system. (Author)

  11. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with the soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation, and the stochastic characteristics of nutrient loading can be investigated, providing the inputs for decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries, and the associated system risk, through incorporating the concepts of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results not only facilitate identification of optimal effluent-trading schemes, but also provide insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that the decision maker's preference towards risk affects the decision alternatives on the trading scheme as well as the system benefit. Compared with conventional optimization methods, BESMA is advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties in nutrient transport behaviors to improve the accuracy of water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision

  12. SIMULATION OF CARS ACCUMULATION PROCESSES FOR SOLVING TASKS OF OPERATIONAL PLANNING IN CONDITIONS OF INITIAL INFORMATION UNCERTAINTY

    Directory of Open Access Journals (Sweden)

    О. A. Tereshchenko

    2017-06-01

    Purpose. The article develops a methodological basis for simulating car accumulation processes when solving operational planning problems under uncertainty of the initial information, for assessing the sustainability of an adopted planning scenario and calculating the associated technological risks. Methodology. The solution of the problem under investigation is based on general scientific approaches, the apparatus of probability theory and the theory of fuzzy sets. To achieve this purpose, the factors influencing the entropy of operational plans are systematized. It is established that when planning the operational work of railway stations, sections and nodes, the most significant factors that cause uncertainty in the initial information are: (a) conditions external to the railway facility in question, expressed by uncertainty in the timing of car arrivals; (b) external, hard-to-identify goals of other participants in the logistics chain (primarily customers), expressed by uncertainty in the time at which freight cars are released. These factors are suggested to be taken into account in automated planning through statistical analysis - the establishment and study of the remaining time (prediction errors). As a result, analytical dependencies are proposed for rational representation of the probability density functions of the time-residual distribution in the form of point, piecewise-defined and continuous analytic models. The developed models of car accumulation, whose application depends on the identified states of the predicted incoming car flow to the accumulation system, are then presented. The last proposed model is a general case of models of accumulation processes with an arbitrary level of reliability of the initial information for any structure of the incoming flow of cars. In conclusion, a technique for estimating the results of

  13. Economic analysis of hydrocarbon exploration by simulation with geological uncertainties (exploratory wells)

    International Nuclear Information System (INIS)

    Chungcharoen, E.

    1997-01-01

    A model was developed to help determine the future development of hydrocarbon reserves. The uncertainties of geological parameters were incorporated into the model to provide an assessment of the distribution of total hydrocarbon discoveries expected to be recovered as a result of exploration activity. Economic parameters were also incorporated to determine the economic worth of multiple-well exploration activity. The first part of this study included the geological parameters in the initial field size distribution and the number-of-fields distribution. Dry-hole data were also considered to reflect the exploration risk. The distribution of total hydrocarbon discoveries for a selected number of exploratory wells was determined. The second part of the study included economic parameters such as the price of oil and gas and the costs of exploration, development and production. The distribution of the number of discoveries and the distribution of total hydrocarbon discoveries were combined to produce a probability distribution of the net present value of a proposed exploration program. The offshore Nova Scotia Shelf basin was chosen for testing the methodology. Several scenarios involving changes in economic parameters were shown. This methodology could help in determining future development programs for hydrocarbon reserves. The methodology can also help governments in policy making decisions regarding taxes and royalty regimes for exploration programs

  14. Observation-based Quantitative Uncertainty Estimation for Realtime Tsunami Inundation Forecast using ABIC and Ensemble Simulation

    Science.gov (United States)

    Takagawa, T.

    2016-12-01

    An ensemble forecasting scheme for tsunami inundation is presented. The scheme consists of three elemental methods. The first is a hierarchical Bayesian inversion using Akaike's Bayesian Information Criterion (ABIC). The second is Monte Carlo sampling from a probability density function of a multidimensional normal distribution. The third is ensemble analysis of tsunami inundation simulations with multiple tsunami sources. Simulation-based validation of the model was conducted. A tsunami scenario of an M9.1 Nankai earthquake was chosen as the target of validation. Tsunami inundation around Nagoya Port was estimated using synthetic tsunami waveforms at offshore GPS buoys. The error in the estimated tsunami inundation area was about 10% even when only ten minutes of observation data were used. The estimation accuracy of waveforms on/off land and the spatial distribution of maximum tsunami inundation depth are also demonstrated.
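
    A minimal sketch of the second and third elemental methods (Monte Carlo sampling from a multivariate normal, followed by ensemble statistics); the mean vector, covariance and linear forward model below are illustrative stand-ins for the ABIC inversion output and the inundation simulator, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Stand-ins for the inversion output: posterior mean slip on three
    # subfaults (m) and its covariance.
    mean = np.array([2.0, 1.5, 3.2])
    cov = np.array([[0.20, 0.05, 0.00],
                    [0.05, 0.15, 0.04],
                    [0.00, 0.04, 0.30]])

    sources = rng.multivariate_normal(mean, cov, size=500)  # ensemble of sources

    # Placeholder linear forward model: inundation depth response to slip.
    depths = sources @ np.array([0.4, 0.7, 0.2])

    print(f"mean depth {depths.mean():.2f} m, 95% interval "
          f"[{np.percentile(depths, 2.5):.2f}, {np.percentile(depths, 97.5):.2f}] m")
    ```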

  15. Energy planning of a hospital using Mathematical Programming and Monte Carlo simulation for dealing with uncertainty in the economic parameters

    International Nuclear Information System (INIS)

    Mavrotas, George; Florios, Kostas; Vlachou, Dimitra

    2010-01-01

    For more than 40 years, Mathematical Programming has been the traditional tool for energy planning at the national or regional level, aiming at cost minimization subject to specific technological, political and demand-satisfaction constraints. The liberalization of the energy market, along with ongoing technical progress, increased the level of competition and forced energy consumers, even at the unit level, to make their choices among a large number of alternative or complementary energy technologies, fuels and/or suppliers. In the present work we develop a modelling framework for energy planning in units of the tertiary sector, giving special emphasis to model reduction and to the uncertainty of the economic parameters. In the case study, the energy rehabilitation of a hospital in Athens is examined, considering the installation of cogeneration, absorption and compression units for the supply of the electricity, heating and cooling loads. The basic innovation of the energy model lies in the uncertainty modelling through the combined use of Mathematical Programming (namely, Mixed Integer Linear Programming, MILP) and Monte Carlo simulation, which permits risk management for the most volatile parameters of the objective function, such as fuel costs and the interest rate. The results come in the form of probability distributions that provide fruitful information to the decision maker. The effect of model reduction through appropriate compression of the load data is also addressed.
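
    The Monte Carlo layer described above can be sketched as follows, assuming illustrative distributions for the two volatile parameters (fuel cost and interest rate) and a simple annualised-cost objective standing in for the full MILP model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    fuel_cost = rng.normal(0.045, 0.008, n)   # EUR/kWh of fuel (illustrative)
    interest = rng.uniform(0.03, 0.08, n)     # discount rate (illustrative)

    capex, life, annual_fuel_kwh = 1.2e6, 20, 4.0e6
    # Capital recovery factor converts CAPEX into an equivalent annual cost.
    crf = interest * (1 + interest) ** life / ((1 + interest) ** life - 1)
    annual_cost = capex * crf + annual_fuel_kwh * fuel_cost

    print(f"median {np.median(annual_cost):,.0f} EUR/yr, 5-95% range "
          f"[{np.percentile(annual_cost, 5):,.0f}, {np.percentile(annual_cost, 95):,.0f}]")
    ```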

  16. Distribution network design under demand uncertainty using genetic algorithm and Monte Carlo simulation approach: a case study in pharmaceutical industry

    Science.gov (United States)

    Izadi, Arman; Kimiagari, Ali Mohammad

    2014-05-01

    Distribution network design, as a strategic decision, has a long-term effect on tactical and operational supply chain management. In this research, the location-allocation problem is studied under demand uncertainty. The purposes of this study were to specify the optimal number and location of distribution centers and to determine the allocation of customer demands to distribution centers. The main feature of this research is solving the model with an unknown demand function, which is suitable for real-world problems. To consider the uncertainty, a set of possible scenarios for customer demands is created based on Monte Carlo simulation. The coefficient of variation of costs is used as a measure of risk, and the most stable structure for the firm's distribution network is defined based on the concept of robust optimization. The best structure is identified using genetic algorithms, with a 14% reduction in total supply chain costs as the outcome. Moreover, it imposes the least cost variation created by fluctuations in customer demand (such as epidemic disease outbreaks in some areas of the country) on the logistical system. It is noteworthy that this research was done in one of the largest pharmaceutical distribution firms in Iran.
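
    A minimal sketch of the risk measure described above: Monte Carlo demand scenarios are scored by the coefficient of variation (CV) of total cost, with a hypothetical cost function standing in for the full location-allocation evaluation.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def total_cost(network, demands):
        # Hypothetical: fixed cost of open DCs plus transport cost that
        # scales with total demand.
        return network["fixed"] + network["unit_transport"] * demands.sum(axis=1)

    demands = rng.poisson(lam=[120, 80, 200], size=(5000, 3))  # 5000 scenarios
    candidate = {"fixed": 5.0e5, "unit_transport": 310.0}

    costs = total_cost(candidate, demands)
    cv = costs.std() / costs.mean()   # lower CV -> more robust network
    print(f"mean cost {costs.mean():,.0f}, CV {cv:.3f}")
    ```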

  17. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step - Application of CFD Uncertainty Analysis

    Science.gov (United States)

    Groves, Curtis E.; Ilie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis applying different turbulence models and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow over a backward facing step.

  18. Identifying climate analogues for precipitation extremes for Denmark based on RCM simulations from the ENSEMBLES database.

    Science.gov (United States)

    Arnbjerg-Nielsen, K; Funder, S G; Madsen, H

    2015-01-01

    Climate analogues, also denoted Space-For-Time, may be used to identify regions where the present climatic conditions resemble conditions of a past or future state of another location or region, based on robust climate variable statistics in combination with projections of how these statistics change over time. The study focuses on assessing climate analogues for Denmark based on the current climate observational data set (E-OBS) as well as the ENSEMBLES database of future climates, with the aim of projecting future precipitation extremes. The local present precipitation extremes are assessed by means of intensity-duration-frequency curves for urban drainage design for the relevant locations: France, the Netherlands, Belgium, Germany, the United Kingdom, and Denmark. Based on this approach, projected increases in extreme precipitation by 2100 of 9 and 21% are expected for 2 and 10 year return periods, respectively. The results should be interpreted with caution, as the best region to represent future conditions for Denmark is the coastal area of Northern France, for which only little information is available with respect to present precipitation extremes.

  19. Simulating future uncertainty to guide the selection of survey designs for long-term monitoring

    Science.gov (United States)

    Garman, Steven L.; Schweiger, E. William; Manier, Daniel J.; Gitzen, Robert A.; Millspaugh, Joshua J.; Cooper, Andrew B.; Licht, Daniel S.

    2012-01-01

    A goal of environmental monitoring is to provide sound information on the status and trends of natural resources (Messer et al. 1991, Theobald et al. 2007, Fancy et al. 2009). When monitoring observations are acquired by measuring a subset of the population of interest, probability sampling as part of a well-constructed survey design provides the most reliable and legally defensible approach to achieve this goal (Cochran 1977, Olsen et al. 1999, Schreuder et al. 2004; see Chapters 2, 5, 6, 7). Previous works have described the fundamentals of sample surveys (e.g. Hansen et al. 1953, Kish 1965). Interest in survey designs and monitoring over the past 15 years has led to extensive evaluations and new developments of sample selection methods (Stevens and Olsen 2004), of strategies for allocating sample units in space and time (Urquhart et al. 1993, Overton and Stehman 1996, Urquhart and Kincaid 1999), and of estimation (Lesser and Overton 1994, Overton and Stehman 1995) and variance properties (Larsen et al. 1995, Stevens and Olsen 2003) of survey designs. Carefully planned, “scientific” (Chapter 5) survey designs have become a standard in contemporary monitoring of natural resources. Based on our experience with the long-term monitoring program of the US National Park Service (NPS; Fancy et al. 2009; Chapters 16, 22), operational survey designs tend to be selected using the following procedures. For a monitoring indicator (i.e. variable or response), a minimum detectable trend requirement is specified, based on the minimum level of change that would result in meaningful change (e.g. degradation). A probability of detecting this trend (statistical power) and an acceptable level of uncertainty (Type I error; see Chapter 2) within a specified time frame (e.g. 10 years) are specified to ensure timely detection. Explicit statements of the minimum detectable trend, the time frame for detecting the minimum trend, power, and acceptable probability of Type I error (

  20. Uncertainty estimates of a GRACE inversion modelling technique over Greenland using a simulation

    Science.gov (United States)

    Bonin, Jennifer; Chambers, Don

    2013-07-01

    The low spatial resolution of GRACE causes leakage, where signals in one location spread out into nearby regions. Because of this leakage, using simple techniques such as basin averages may result in an incorrect estimate of the true mass change in a region. A fairly simple least squares inversion technique can be used to more specifically localize mass changes into a pre-determined set of basins of uniform internal mass distribution. However, the accuracy of these higher resolution basin mass amplitudes has not been determined, nor is it known how the distribution of the chosen basins affects the results. We use a simple 'truth' model over Greenland as an example case, to estimate the uncertainties of this inversion method and expose those design parameters which may result in an incorrect high-resolution mass distribution. We determine that an appropriate level of smoothing (300-400 km) and process noise (0.30 cm2 of water) yields the best results. The trends of the Greenland internal basins and Iceland can be reasonably estimated with this method, with average systematic errors of 3.5 cm yr-1 per basin. The largest mass losses found from GRACE RL04 occur in the coastal northwest (-19.9 and -33.0 cm yr-1) and southeast (-24.2 and -27.9 cm yr-1), with small mass gains (+1.4 to +7.7 cm yr-1) found across the northern interior. Acceleration of mass change is measurable at the 95 per cent confidence level in four northwestern basins, but not elsewhere in Greenland. Due to an insufficiently detailed distribution of basins across internal Canada, the trend estimates of Baffin and Ellesmere Islands are expected to be incorrect due to systematic errors caused by the inversion technique.
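
    The basin-localization idea can be illustrated with a toy regularized least squares inversion, as below; the leakage operator, basin geometry, regularisation strength and noise level are invented for illustration and are unrelated to the actual GRACE processing chain.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    G = rng.normal(0, 1, (40, 6))   # smoothing/leakage operator: basins -> observations
    m_true = np.array([3.0, -2.0, 0.5, 1.5, -1.0, 0.0])  # basin mass amplitudes
    d = G @ m_true + rng.normal(0, 0.3, 40)              # noisy observations

    lam = 0.5   # plays the role of the process-noise / smoothing trade-off
    A = np.vstack([G, np.sqrt(lam) * np.eye(6)])
    b = np.concatenate([d, np.zeros(6)])
    m_est = np.linalg.lstsq(A, b, rcond=None)[0]
    print(np.round(m_est, 2))   # recovered basin amplitudes
    ```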

  1. Assessing the Impact of Model Parameter Uncertainty in Simulating Grass Biomass Using a Hybrid Carbon Allocation Strategy

    Science.gov (United States)

    Reyes, J. J.; Adam, J. C.; Tague, C.

    2016-12-01

    Grasslands play an important role in agricultural production as forage for livestock; they also provide a diverse set of ecosystem services including soil carbon (C) storage. The partitioning of C between above and belowground plant compartments (i.e. allocation) is influenced by both plant characteristics and environmental conditions. The objectives of this study are to 1) develop and evaluate a hybrid C allocation strategy suitable for grasslands, and 2) apply this strategy to examine the importance of various parameters related to biogeochemical cycling, photosynthesis, allocation, and soil water drainage on above and belowground biomass. We include allocation as an important process in quantifying the model parameter uncertainty, which identifies the most influential parameters and what processes may require further refinement. For this, we use the Regional Hydro-ecologic Simulation System, a mechanistic model that simulates coupled water and biogeochemical processes. A Latin hypercube sampling scheme was used to develop parameter sets for calibration and evaluation of allocation strategies, as well as parameter uncertainty analysis. We developed the hybrid allocation strategy to integrate both growth-based and resource-limited allocation mechanisms. When evaluating the new strategy simultaneously for above and belowground biomass, it produced a larger number of less biased parameter sets: 16% more compared to resource-limited and 9% more compared to growth-based. This also demonstrates its flexible application across diverse plant types and environmental conditions. We found that higher parameter importance corresponded to sub- or supra-optimal resource availability (i.e. water, nutrients) and temperature ranges (i.e. too hot or cold). For example, photosynthesis-related parameters were more important at sites warmer than the theoretical optimal growth temperature. Therefore, larger values of parameter importance indicate greater relative sensitivity in

  2. Novel patch modelling method for efficient simulation and prediction uncertainty analysis of multi-scale groundwater flow and transport processes

    Science.gov (United States)

    Sreekanth, J.; Moore, Catherine

    2018-04-01

    The application of global sensitivity and uncertainty analysis techniques to groundwater models of deep sedimentary basins is typically challenged by large computational burdens combined with associated numerical stability issues. The highly parameterized approaches required for exploring the predictive uncertainty associated with the heterogeneous hydraulic characteristics of multiple aquifers and aquitards in these sedimentary basins exacerbate these issues. A novel Patch Modelling Methodology is proposed for improving the computational feasibility of stochastic modelling analysis of large-scale and complex groundwater models. The method incorporates a nested groundwater modelling framework that enables efficient simulation of groundwater flow and transport across multiple spatial and temporal scales. The method also allows different processes to be simulated within different model scales. Existing nested model methodologies are extended by employing 'joining predictions' for extrapolating prediction-salient information from one model scale to the next. This establishes a feedback mechanism supporting the transfer of information from child models to parent models as well as from parent models to child models in a computationally efficient manner. This feedback mechanism is simple and flexible and ensures that while the salient small-scale features influencing larger-scale prediction are transferred back to the larger scale, this does not require the live coupling of models. This method allows the modelling of multiple groundwater flow and transport processes using separate groundwater models that are built for the appropriate spatial and temporal scales, within a stochastic framework, while also removing the computational burden associated with live model coupling. The utility of the method is demonstrated by application to an actual large-scale aquifer injection scheme in Australia.

  3. Selecting the Most Economic Project under Uncertainty Using Bootstrap Technique and Fuzzy Simulation

    Directory of Open Access Journals (Sweden)

    Kamran Shahanaghi

    2012-01-01

    This article, departing from the pre-determined membership function of a fuzzy set, which is a basic assumption for such subjects, proposes a hybrid technique to select the most economic project among alternative projects under fuzzy interest-rate conditions. The net present worth (NPW) is used as the economic indicator. The article challenges the assumption that large sample sizes are available for membership function determination and shows that some other techniques may have less accuracy. To give a robust solution, bootstrapping and fuzzy simulation are suggested, and a numerical example is given and analyzed.
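
    A minimal sketch of the bootstrapping idea, assuming a small hypothetical sample of observed interest rates and a fixed cash-flow series; the fuzzy-simulation layer of the article is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    observed_rates = np.array([0.04, 0.05, 0.045, 0.06, 0.055])   # small sample
    cash_flows = np.array([-1000.0, 300.0, 350.0, 400.0, 450.0])  # years 0..4

    def npw(rate, flows):
        years = np.arange(len(flows))
        return np.sum(flows / (1 + rate) ** years)

    # Resample the rates with replacement and recompute NPW each time.
    boot = np.array([npw(rng.choice(observed_rates, 5).mean(), cash_flows)
                     for _ in range(2000)])
    print(f"bootstrap NPW: mean {boot.mean():.1f}, 90% interval "
          f"[{np.percentile(boot, 5):.1f}, {np.percentile(boot, 95):.1f}]")
    ```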

  4. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    Science.gov (United States)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  5. Investigating the Effects of Simulated Space conditions on Novel Extremely Halophilic Archaea: Halovarius Luteus gen. nov., sp. nov.

    Science.gov (United States)

    Feshangsaz, Niloofar; Van Loon, Jack J. W. A.; Nazmi, Kamran; Semsarha, Farid

    2016-07-01

    Studying halophiles from different environments of Earth provides new insights into our search for life in the universe. Haloarchaea show some unique characteristics and physiological adaptations, such as acidic proteins, against harsh environments such as natural brines with salt concentrations approaching saturation (5 M) and regions with low water activity. These properties make haloarchaea interesting candidates for astrobiological studies. Halovarius luteus gen. nov., sp. nov., a novel extremely halophilic archaeon from Urmia salt lake in Iran, has been chosen to explore its resistance to a series of extreme conditions. The aim of this study is to assess the resistance of strain DA50T to simulated space conditions such as simulated microgravity, hypergravity, and desiccation. In this paper we discuss the results of these studies, focusing specifically on changes in carotenoid pigment production and the whole-cell proteome. This is the first report on the response of this novel Iranian archaeon to simulated extreme space conditions. The pigments were extracted with acetone and methanol, analyzed by scanning the absorbance spectrum in a UV-VIS spectrophotometer, and separated by TLC. Whole protein from the cell lysate supernatant was extracted after lysis with Bacterial Protein Extraction Reagent and fractionated by RP-HPLC using a C18 column. The proteome was analyzed by electrophoresis (SDS-PAGE) and MALDI-TOF. Carotenoid pigments are formed under different extreme conditions, such as dry environments and gravitational changes, and the protein composition also exhibits alterations after exposure to the same conditions. Our conclusion is that pigment and protein formation depends on the growth conditions. Halophiles use this as an adaptation to survive under different environmental conditions.

  6. A Simulation Based Approach to Optimize Berth Throughput Under Uncertainty at Marine Container Terminals

    Science.gov (United States)

    Golias, Mihalis M.

    2011-01-01

    Berth scheduling is a critical function at marine container terminals and determining the best berth schedule depends on several factors including the type and function of the port, size of the port, location, nearby competition, and type of contractual agreement between the terminal and the carriers. In this paper we formulate the berth scheduling problem as a bi-objective mixed-integer problem with the objective to maximize customer satisfaction and reliability of the berth schedule under the assumption that vessel handling times are stochastic parameters following a discrete and known probability distribution. A combination of an exact algorithm, a Genetic Algorithms based heuristic and a simulation post-Pareto analysis is proposed as the solution approach to the resulting problem. Based on a number of experiments it is concluded that the proposed berth scheduling policy outperforms the berth scheduling policy where reliability is not considered.

  7. Process simulation and uncertainty analysis of plasma arc mixed waste treatment

    International Nuclear Information System (INIS)

    Ferrada, J.J.; Welch, T.D.

    1994-01-01

    Innovative mixed waste treatment subsystems have been analyzed for performance, risk, and life-cycle cost as part of the U.S. Department of Energy's (DOE's) Mixed Waste Integrated Program (MWIP) treatment alternatives development and evaluation process. This paper concerns the analysis of mixed waste treatment system performance. Performance systems analysis includes approximate material and energy balances and assessments of operability, effectiveness, and reliability. Preliminary material and energy balances of innovative processes have been analyzed using FLOW, an object-oriented process simulator for waste management systems under development at Oak Ridge National Laboratory. The preliminary models developed for FLOW provide rough order-of-magnitude calculations useful for sensitivity analysis. The insight gained from early approximate modeling of these technologies will ease the transition to more sophisticated simulators as adequate performance and property data become available. Such models are being developed in ASPEN by DOE's Mixed Waste Treatment Project (MWTP) for baseline and alternative flow sheets based on commercial technologies. One alternative to the baseline developed by the MWIP support groups is plasma arc treatment. This process offers a noticeable reduction in the number of process operations compared to the baseline process, because a plasma arc melter is capable of accepting a wide variety of waste streams as direct inputs (without sorting or preprocessing). This innovative process for treating mixed waste replaces several units from the baseline process and thus promises an economic advantage. The performance of the plasma arc furnace will directly affect the quality of the waste form and the requirements of the off-gas treatment units. The ultimate objective of MWIP is to reduce the amount of final waste produced, the cost, and the environmental impact

  8. Propagation of uncertainty by Monte Carlo simulations in case of basic geodetic computations

    Directory of Open Access Journals (Sweden)

    Wyszkowska Patrycja

    2017-12-01

    The determination of the accuracy of functions of measured or adjusted values may be a problem in geodetic computations. The general law of covariance propagation or, in the case of uncorrelated observations, the propagation of variance (or the Gaussian formula) is commonly used for that purpose. That approach is theoretically justified for linear functions. In the case of non-linear functions, the first-order Taylor series expansion is usually used, but that solution is affected by the expansion error. The aim of the study is to determine the applicability of the general variance propagation law in the case of the non-linear functions used in basic geodetic computations. The paper presents the errors which result from neglecting the higher-order terms and determines the range of such simplification. The basis of the analysis is a comparison of the results obtained by the law of propagation of variance and by the probabilistic approach, namely Monte Carlo simulations. Both methods are used to determine the accuracy of the following geodetic computations: the Cartesian coordinates of an unknown point in the three-point resection problem, azimuths and distances derived from Cartesian coordinates, and height differences in trigonometric and geometric levelling. These simulations and the analysis of the results confirm the possibility of applying the general law of variance propagation in basic geodetic computations even if the functions are non-linear. The only condition is the accuracy of the observations, which cannot be too low. Generally, this is not a problem with present geodetic instruments.
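
    The comparison made in the paper can be sketched for one of the listed functions, the distance computed from Cartesian coordinate differences; the values and standard deviation below are illustrative, and the two estimates are expected to agree closely when the observation accuracy is adequate.

    ```python
    import numpy as np

    dx, dy = 100.0, 40.0   # coordinate differences (m)
    sigma = 0.01           # standard deviation of each difference (m)

    # First-order (Gaussian) propagation for d = sqrt(dx^2 + dy^2).
    d = np.hypot(dx, dy)
    sd_linear = sigma * np.sqrt((dx / d) ** 2 + (dy / d) ** 2)

    # Monte Carlo propagation.
    rng = np.random.default_rng(0)
    samples = np.hypot(dx + rng.normal(0, sigma, 1_000_000),
                       dy + rng.normal(0, sigma, 1_000_000))
    print(f"linear: {sd_linear:.6f} m, Monte Carlo: {samples.std():.6f} m")
    ```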

  10. Characteristics of sub-daily precipitation extremes in observed data and regional climate model simulations

    Czech Academy of Sciences Publication Activity Database

    Beranová, Romana; Kyselý, Jan; Hanel, M.

    2018-01-01

    Roč. 132, 1-2 (2018), s. 515-527 ISSN 0177-798X R&D Projects: GA ČR(CZ) GA14-18675S Institutional support: RVO:68378289 Keywords : sub-daily precipitation * regional climate models * extremes * Czech Republic Subject RIV: DG - Athmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences Impact factor: 2.640, year: 2016 https://link.springer.com/article/10.1007/s00704-017-2102-0

  11. Extreme Mississippi River Floods in the Late Holocene: Reconstructions and Simulations

    Science.gov (United States)

    Munoz, S. E.; Giosan, L.; Donnelly, J. P.; Dee, S.

    2016-12-01

    Extreme flooding of the Mississippi River is costly in both economic and social terms. Despite ambitious engineering projects conceived in the early 20th century to mitigate damage from extreme floods, economic losses due to flooding have increased over recent years. Forecasting extreme flood occurrence over seasonal or longer time-scales remains a major challenge - especially in light of shifts in hydroclimatic conditions expected in response to continued greenhouse forcing. Here, we present findings from a series of paleoflood records that span the late Holocene derived from laminated sediments deposited in abandoned channels of the Mississippi River. These sedimentary archives record individual overbank floods as unique event beds with upward fining that we identify using grain-size analysis, bulk geochemistry, and radiography. We use sedimentological characteristics to reconstruct flood magnitude by calibrating our records against instrumental streamflow data from nearby gauging stations. We also use the Last Millennium Experiments of the Community Earth System Model (CESM-LME) and historical reanalysis data to examine the state of the climate system around river discharge extremes. Our paleoflood records exhibit strong non-stationarities in flood frequency and magnitude that are associated with fluctuations in the frequency of the El Niño-Southern Oscillation (ENSO), because the warm ENSO phase is associated with increased surface water storage of the lower Mississippi basin that leads to enhanced runoff delivery to the main channel. We also show that the early 20th century was a period of anomalously high flood frequency and magnitude due to the combined effects of river engineering and natural climate variability. Our findings imply that flood risk along the lower Mississippi River is tightly coupled to the frequency of ENSO, highlighting the need for robust projections of ENSO variability under greenhouse warming.

  12. Simulations of Sulfate-Nitrate-Ammonium (SNA) aerosols during the extreme haze events over Northern China in 2014

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Dan; Liu, Zhiquan; Fast, Jerome D.; Ban, Junmei

    2016-08-30

    Extreme haze events have occurred frequently over China in recent years. Although many studies have investigated the formation mechanisms associated with PM2.5 for heavily polluted regions in China based on observational data, adequately predicting peak PM2.5 concentrations is still challenging for regional air quality models. In this study, we evaluate the performance of one configuration of the Weather Research and Forecasting model coupled with chemistry (WRF-Chem) and use the model to investigate the sensitivity of heterogeneous reactions on simulated peak sulfate, nitrate, and ammonium concentrations in the vicinity of Beijing during four extreme haze episodes in October 2014 over the North China Plain. The highest observed PM2.5 concentration of 469 μg m-3 occurred in Beijing. Comparisons with observations show that the model reproduced the temporal variability in PM2.5, with the highest PM2.5 values on polluted days (defined as days on which observed PM2.5 is greater than 75 μg m-3), but predictions of sulfate, nitrate, and ammonium were too low on days with the highest observed concentrations. Observational data indicate that the sulfur/nitric oxidation rates are strongly correlated with relative humidity during periods of peak PM2.5; however, the model failed to reproduce the highest PM2.5 concentrations due to missing heterogeneous reactions. As the parameterizations of those reactions are not yet well established, estimates of SO2-to-H2SO4 and NO2/NO3-to-HNO3 reaction rates that depend on relative humidity were applied, which improved the simulation of sulfate, nitrate, and ammonium enhancement on polluted days in terms of both concentrations and partitioning among those species. Sensitivity simulations showed that the extremely high heterogeneous reaction rates and also higher emission rates than those reported in the emission inventory

  13. 3-D simulations of M9 earthquakes on the Cascadia Megathrust: Key parameters and uncertainty

    Science.gov (United States)

    Wirth, Erin; Frankel, Arthur; Vidale, John; Marafi, Nasser A.; Stephenson, William J.

    2017-01-01

    Geologic and historical records indicate that the Cascadia subduction zone is capable of generating large, megathrust earthquakes up to magnitude 9. The last great Cascadia earthquake occurred in 1700, and thus there is no direct measure on the intensity of ground shaking or specific rupture parameters from seismic recordings. We use 3-D numerical simulations to generate broadband (0-10 Hz) synthetic seismograms for 50 M9 rupture scenarios on the Cascadia megathrust. Slip consists of multiple high-stress drop subevents (~M8) with short rise times on the deeper portion of the fault, superimposed on a background slip distribution with longer rise times. We find a >4x variation in the intensity of ground shaking depending upon several key parameters, including the down-dip limit of rupture, the slip distribution and location of strong-motion-generating subevents, and the hypocenter location. We find that extending the down-dip limit of rupture to the top of the non-volcanic tremor zone results in a ~2-3x increase in peak ground acceleration for the inland city of Seattle, Washington, compared to a completely offshore rupture. However, our simulations show that allowing the rupture to extend to the up-dip limit of tremor (i.e., the deepest rupture extent in the National Seismic Hazard Maps), even when tapering the slip to zero at the down-dip edge, results in multiple areas of coseismic coastal uplift. This is inconsistent with coastal geologic evidence (e.g., buried soils, submerged forests), which suggests predominantly coastal subsidence for the 1700 earthquake and previous events. Defining the down-dip limit of rupture as the 1 cm/yr locking contour (i.e., mostly offshore) results in primarily coseismic subsidence at coastal sites. We also find that the presence of deep subevents can produce along-strike variations in subsidence and ground shaking along the coast. Our results demonstrate the wide range of possible ground motions from an M9 megathrust earthquake in

  14. The effect of geological uncertainty on achieving short-term targets : A quantitative approach using Stochastic process simulation

    NARCIS (Netherlands)

    Soleymani Shishvan, M.; Benndorf, J.

    Continuous mining systems containing multiple excavators producing multiple products of raw materials are highly complex, exhibiting strong interdependency between constituents. Furthermore, random variables govern the system, which causes uncertainty in the supply of raw materials: uncertainty in

  15. Uncertainty quantification of cinematic imaging for development of predictive simulations of turbulent combustion.

    Energy Technology Data Exchange (ETDEWEB)

    Lawson, Matthew; Debusschere, Bert J.; Najm, Habib N.; Sargsyan, Khachik; Frank, Jonathan H.

    2010-09-01

    Recent advances in high frame rate complementary metal-oxide-semiconductor (CMOS) cameras coupled with high repetition rate lasers have enabled laser-based imaging measurements of the temporal evolution of turbulent reacting flows. This measurement capability provides new opportunities for understanding the dynamics of turbulence-chemistry interactions, which is necessary for developing predictive simulations of turbulent combustion. However, quantitative imaging measurements using high frame rate CMOS cameras require careful characterization of their noise, non-linear response, and variations in this response from pixel to pixel. We develop a noise model and calibration tools to mitigate these problems and to enable quantitative use of CMOS cameras. We have demonstrated proof of principle for image de-noising using both wavelet methods and Bayesian inference. The results offer new approaches for quantitative interpretation of imaging measurements from noisy data acquired with non-linear detectors. These approaches are potentially useful in many areas of scientific research that rely on quantitative imaging measurements.

  16. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    Science.gov (United States)

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiment (DOE) models in the first part of this study, with an initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard the initial assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that the error estimates based on Monte Carlo simulation result in smaller model coefficient standard deviations than those from regression methods. This suggests that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution to understand the propagation of uncertainty in complex DOE models so that design space can be specified with statistically meaningful confidence levels.
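
    A minimal sketch of this double perturbation, assuming a hypothetical 2^2 factorial design with a linear response model: both the factor settings and the responses are resampled before each refit, and the spread of the refitted coefficients approximates their uncertainty.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    X = np.array([[-1.0, -1.0], [1.0, -1.0], [-1.0, 1.0], [1.0, 1.0]])  # 2^2 design
    beta_true = np.array([50.0, 4.0, -2.5])   # intercept + two main effects
    y = beta_true[0] + X @ beta_true[1:]

    coefs = []
    for _ in range(5000):
        Xp = X + rng.normal(0, 0.05, X.shape)   # input-variable uncertainty
        yp = y + rng.normal(0, 1.0, y.shape)    # response measurement uncertainty
        A = np.column_stack([np.ones(len(Xp)), Xp])
        coefs.append(np.linalg.lstsq(A, yp, rcond=None)[0])

    print(np.std(coefs, axis=0))   # Monte Carlo spread of each model coefficient
    ```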

  17. A non-linear and stochastic response surface method for Bayesian estimation of uncertainty in soil moisture simulation from a land surface model

    Directory of Open Access Journals (Sweden)

    F. Hossain

    2004-01-01

    This study presents a simple and efficient scheme for Bayesian estimation of uncertainty in soil moisture simulation by a Land Surface Model (LSM). The scheme is assessed within a Monte Carlo (MC) simulation framework based on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. A primary limitation of using the GLUE method is the prohibitive computational burden imposed by uniform random sampling of the model's parameter distributions. Sampling is improved in the proposed scheme by stochastic modeling of the parameters' response surface that recognizes the non-linear deterministic behavior between soil moisture and land surface parameters. Uncertainty in soil moisture simulation (model output) is approximated through a Hermite polynomial chaos expansion of normal random variables that represent the model's parameter (model input) uncertainty. The unknown coefficients of the polynomial are calculated using a limited number of model simulation runs. The calibrated polynomial is then used as a fast-running proxy to the slower-running LSM to predict the degree of representativeness of a randomly sampled model parameter set. An evaluation of the scheme's sampling efficiency is made through comparison with fully random MC sampling (the norm for GLUE) and the nearest-neighborhood sampling technique. The scheme was able to reduce the computational burden of random MC sampling for GLUE by 10-70%. The scheme was also found to be about 10% more efficient than the nearest-neighborhood sampling method in predicting a sampled parameter set's degree of representativeness. The GLUE based on the proposed sampling scheme did not alter the essential features of the uncertainty structure in soil moisture simulation. The scheme can potentially make GLUE uncertainty estimation for any LSM more efficient, as it does not impose any additional structural or distributional assumptions.
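
    A minimal one-dimensional sketch of the surrogate idea, assuming a hypothetical stand-in for the slow-running LSM and probabilists' Hermite polynomials of a standard normal input; the actual scheme expands a multi-parameter response surface.

    ```python
    import numpy as np
    from numpy.polynomial import hermite_e as H   # probabilists' Hermite basis

    def slow_model(z):
        # Placeholder for the LSM response to a (standardised) parameter.
        return np.exp(0.3 * z) + 0.1 * z ** 2

    rng = np.random.default_rng(2)
    z_train = rng.standard_normal(50)             # limited number of model runs
    coeffs = H.hermefit(z_train, slow_model(z_train), deg=4)

    z_test = rng.standard_normal(5)
    print(H.hermeval(z_test, coeffs))             # fast-running proxy
    print(slow_model(z_test))                     # slow model, for comparison
    ```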

  18. Simulation of CO2 Sequestration at Rock Spring Uplift, Wyoming: Heterogeneity and Uncertainties in Storage Capacity, Injectivity and Leakage

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Hailin [Los Alamos National Laboratory]; Dai, Zhenxue [Los Alamos National Laboratory]; Jiao, Zunsheng [Wyoming State Geological Survey]; Stauffer, Philip H. [Los Alamos National Laboratory]; Surdam, Ronald C. [Wyoming State Geological Survey]

    2011-01-01

    Many geological, geochemical, geomechanical and hydrogeological factors control CO2 storage in the subsurface. Among them, heterogeneity in the saline aquifer can seriously influence the design of injection wells, the CO2 injection rate, CO2 plume migration, storage capacity, and potential leakage and risk assessment. This study applies indicator geostatistics, transition probability and a Markov chain model at the Rock Springs Uplift, Wyoming, generating facies-based heterogeneous fields for porosity and permeability in the target saline aquifer (Pennsylvanian Weber sandstone) and surrounding rocks (Phosphoria, Madison and cap-rock Chugwater). A multiphase flow simulator, FEHM, is then used to model injection of CO2 into the target saline aquifer involving field-scale heterogeneity. The results reveal that (1) CO2 injection rates in different injection wells change significantly with local permeability distributions; (2) brine production rates in different pumping wells are also significantly impacted by the spatial heterogeneity in permeability; (3) liquid pressure evolution during and after CO2 injection in the saline aquifer varies greatly for different realizations of random permeability fields, and this has potentially important effects on hydraulic fracturing of the reservoir rock, reactivation of pre-existing faults and the integrity of the cap-rock; (4) the CO2 storage capacity estimate for the Rock Springs Uplift is 6614 ± 256 Mt at the 95% confidence interval, which is about 36% of a previous estimate based on a homogeneous and isotropic storage formation; (5) density profiles show that the density of injected CO2 below 3 km is close to that of the ambient brine for the given geothermal gradient and brine concentration, which indicates the CO2 plume can sink deep before reaching thermal equilibrium with the brine. Finally, we present an uncertainty analysis of CO2 leakage into overlying formations due to heterogeneity in both the target saline

  19. Impact of biogenic emission uncertainties on the simulated response of ozone and fine particulate matter to anthropogenic emission reductions.

    Science.gov (United States)

    Hogrefe, Christian; Isukapalli, Sastry S; Tang, Xiaogang; Georgopoulos, Panos G; He, Shan; Zalewsky, Eric E; Hao, Winston; Ku, Jia-Yeong; Key, Tonalee; Sistla, Gopal

    2011-01-01

    The role of emissions of volatile organic compounds and nitric oxide from biogenic sources is becoming increasingly important in regulatory air quality modeling as levels of anthropogenic emissions continue to decrease and stricter health-based air quality standards are being adopted. However, considerable uncertainties still exist in the current estimation methodologies for biogenic emissions. The impact of these uncertainties on ozone and fine particulate matter (PM2.5) levels for the eastern United States was studied, focusing on biogenic emissions estimates from two commonly used biogenic emission models, the Model of Emissions of Gases and Aerosols from Nature (MEGAN) and the Biogenic Emissions Inventory System (BEIS). Photochemical grid modeling simulations were performed for two scenarios: one reflecting present day conditions and the other reflecting a hypothetical future year with reductions in emissions of anthropogenic oxides of nitrogen (NOx). For ozone, the use of MEGAN emissions resulted in a higher ozone response to hypothetical anthropogenic NOx emission reductions compared with BEIS. Applying the current U.S. Environmental Protection Agency guidance on regulatory air quality modeling in conjunction with typical maximum ozone concentrations, the differences in estimated future year ozone design values (DVF) stemming from differences in biogenic emissions estimates were on the order of 4 parts per billion (ppb), corresponding to approximately 5% of the daily maximum 8-hr ozone National Ambient Air Quality Standard (NAAQS) of 75 ppb. For PM2.5, the differences were 0.1-0.25 microg/m3 in the summer total organic mass component of DVFs, corresponding to approximately 1-2% of the value of the annual PM2.5 NAAQS of 15 microg/m3. Spatial variations in the ozone and PM2.5 differences also reveal that the impacts of different biogenic emission estimates on ozone and PM2.5 levels are dependent on ambient levels of anthropogenic emissions.

  20. A formal statistical approach to representing uncertainty in rainfall-runoff modelling with focus on residual analysis and probabilistic output evaluation - Distinguishing simulation and prediction

    DEFF Research Database (Denmark)

    Breinholt, Anders; Møller, Jan Kloppenborg; Madsen, Henrik

    2012-01-01

    While there seems to be consensus that hydrological model outputs should be accompanied by an uncertainty estimate, the appropriate method for uncertainty estimation is not agreed upon, and a debate is ongoing between advocates of formal statistical methods, who consider errors as stochastic, ... and GLUE advocates, who consider errors as epistemic, arguing that the basis of formal statistical approaches, which requires the residuals to be stationary and conform to a statistical distribution, is unrealistic. In this paper we take a formal frequentist approach to parameter estimation and uncertainty ... necessary but the statistical assumptions were nevertheless not 100% justified. The residual analysis showed that significant autocorrelation was present for all simulation models. We believe users of formal approaches to uncertainty evaluation within hydrology and within environmental modelling in general

  1. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    Science.gov (United States)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

    Climate change impact and adaptation assessments have traditionally adopted a 'top-down' scenario-based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Due to the trade-offs between computational cost and the need to include a wide range of GCMs for fuller characterization of uncertainties, scenarios are better used for sensitivity testing and adaptation options appraisal. One common approach to adaptation that has been defined as robust is the use of safety margins. In this work, the sensitivity of safety margins that have been adopted by the agency responsible for flood risk management in Ireland to the uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both uncertainty within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is a considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, flood defences, whose designs are normally
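
    The flood frequency step can be sketched as below; scipy fits the GEV by maximum likelihood rather than the probability weighted moments used in the study, and the annual maxima series is synthetic.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(11)
    annual_maxima = genextreme.rvs(c=-0.1, loc=100, scale=25,
                                   size=60, random_state=rng)

    shape, loc, scale = genextreme.fit(annual_maxima)
    q100 = genextreme.ppf(1 - 1 / 100, shape, loc=loc, scale=scale)  # 100-yr flood
    print(f"100-year flood: {q100:.1f}; with a +20% margin: {1.2 * q100:.1f}")
    ```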

  2. Characteristic changes in heat extremes over India in response to global warming using CMIP5 model simulations

    Science.gov (United States)

    Kundeti, K.; Chang, H. H.; T V, L. K.; Desamsetti, S.; Dandi, A. R.

    2017-12-01

    A critical aspect of human-induced climate change is how it will affect climatological means and extremes around the world. The summer season surface climate of the Indian subcontinent is characterized by hot and humid conditions. Global warming can have a profound impact on the mean climate as well as on extreme weather events over India that may significantly affect both natural and human systems. In this study we examine a very direct measure of the impact of climate change on human health and comfort. The heat stress index is a measure of the combined effects of temperature and atmospheric moisture on the ability of the human body to dissipate heat. Besides assessing future changes in the seasonal mean heat stress index, it is also desirable to know what the future holds for temperature extremes in a country like India, where much outdoor activity takes place in the onshore/offshore energy sectors and in extensive construction. This study assesses the performance of the Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations in the present climate and develops future climate scenarios. The changes in heat extremes are assessed for three future periods, 2016-2035, 2046-2065 and 2080-2099, with respect to 1986-2005 (baseline) under two RCPs (Representative Concentration Pathways), RCP4.5 and RCP8.5. In view of this, we provide the expected future changes in the seasonal mean heat stress indices and also the frequency of heat stress exceeding a certain threshold relevant to India. Besides, we provide spatial maps of expected future changes in the heat stress index, derived as a function of daily mean temperature and relative humidity, which is representative of human comfort and has a direct bearing on human activities. The observations show an increase in heat extremes over many parts of this region that is generally well captured by the models. The results indicate a significant change in frequency and intensity of heat extremes
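
    The study does not state which heat stress formulation it uses; purely to illustrate an index driven by temperature and relative humidity, the sketch below uses one widely cited option, the NOAA (Rothfusz) heat index regression in degrees Fahrenheit, valid roughly above 80 F.

    ```python
    def heat_index_f(t_f, rh):
        """NOAA heat index (deg F) from temperature (deg F) and relative humidity (%)."""
        return (-42.379 + 2.04901523 * t_f + 10.14333127 * rh
                - 0.22475541 * t_f * rh - 0.00683783 * t_f ** 2
                - 0.05481717 * rh ** 2 + 0.00122874 * t_f ** 2 * rh
                + 0.00085282 * t_f * rh ** 2 - 0.00000199 * t_f ** 2 * rh ** 2)

    print(f"{heat_index_f(95.0, 60.0):.1f} F")   # a hot, humid day
    ```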

  3. Properties of a planar electric double layer under extreme conditions investigated by classical density functional theory and Monte Carlo simulations.

    Science.gov (United States)

    Zhou, Shiqi; Lamperski, Stanisław; Zydorczak, Maria

    2014-08-14

    Monte Carlo (MC) simulation and classical density functional theory (DFT) results are reported for the structural and electrostatic properties of a planar electric double layer containing ions with highly asymmetric diameters or valencies under extreme concentration conditions. In the applied DFT, the excess free energy contribution due to hard sphere repulsion is described by a recently elaborated extended form of the fundamental measure functional, and the coupling of Coulombic and short-range hard-sphere repulsion is described by a traditional second-order functional perturbation expansion approximation. Comparison between the MC and DFT results indicates that the validity interval of the traditional DFT approximation extends to ion valences as high as 3 and size asymmetries up to a diameter ratio of 4, whether the high-valence ions or the large ions are co- or counter-ions, and to bulk electrolyte concentrations close to the upper limit of what the MC simulation can treat well. The dependence of the DFT accuracy on the ion parameters can be self-consistently explained using arguments from liquid state theory, and new EDL phenomena, such as an overscreening effect due to monovalent counter-ions, an extreme layering effect of counter-ions, and the appearance of a depletion layer with almost no counter- or co-ions, are observed.

  4. The effect of asymmetrical body orientation during simulated forward falls on the distal upper extremity impact response of healthy people.

    Science.gov (United States)

    Burkhart, Timothy A; Brydges, Evan; Stefanczyk, Jennifer; Andrews, David M

    2017-04-01

    The occurrence of distal upper extremity injuries resulting from forward falls (approximately 165,000 per year) has remained relatively constant for over 20 years. Previous work has provided valuable insight into fall arrest strategies, but only symmetric falls, in body postures that do not closely represent actual fall scenarios, have been evaluated. This study quantified the effect of asymmetric loading and body postures on the distal upper extremity response to simulated forward falls. Twenty participants were suspended from the Propelled Upper Limb fall ARrest Impact System (PULARIS) in different torso and leg postures relative to the ground and to the sagittal plane (0°, 30° and 45°). When released from PULARIS (hands 10 cm above the surface, velocity 1 m/s), participants landed on two force platforms, one for each hand. Right forearm impact response was measured with distal (radial styloid) and proximal (olecranon) tri-axial accelerometers and bipolar EMG from seven muscles. Overall, the relative height of the torso and legs had little effect on the forces or forearm response variables. Muscle activation consistently increased from the start to the peak activation level after impact for all muscles, followed by a rapid decline after the peak. The impact forces and accelerations suggest that the distal upper extremity is loaded more medial-laterally during asymmetric falls than during symmetric falls. Altering the direction of the impact force in this way (volar-dorsal to medial-lateral) may help reduce distal extremity injuries caused when landing occurs symmetrically in the sagittal plane, as volar-dorsal forces have been shown to increase the risk of injury.

  5. Extreme Value Predictions using Monte Carlo Simulations with Artificially Increased Load Spectrum

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2011-01-01

    In the analysis of structures subjected to stationary stochastic load processes, the mean out-crossing rate plays an important role, as it can be used to determine the extreme value distribution of any response, usually assuming that the sequence of mean out-crossings can be modelled as a Poisson process. … be scaled down to its actual value. In the present paper the usefulness of this approach is investigated, considering problems related to wave loads on marine structures. Here the load scale parameter is conveniently taken as the square of the significant wave height. … be found using the First Order Reliability Method (FORM). The FORM analysis also shows that the reliability index is strictly inversely proportional to the square root of the magnitude of the load spectrum, irrespective of the non-linearity in the system. However, the FORM analysis only gives …
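    The record's scaling idea can be sketched numerically: under the Poisson assumption, the extreme value distribution follows from the mean out-crossing rate, and the quoted FORM result lets a rate estimated at an artificially inflated load spectrum be scaled back down. All numbers below are hypothetical placeholders.

```python
import numpy as np
from scipy.stats import norm

# Poisson assumption: P(max response over duration T <= x) = exp(-nu(x) * T).
nu0 = 0.5         # response zero-crossing rate (1/s), assumed known
nu_scaled = 1e-3  # out-crossing rate estimated by MC at the inflated spectrum
s = 2.0           # spectrum scale factor, e.g. (Hs_scaled / Hs_actual) ** 2

# Invert nu = nu0 * Phi(-beta) for the reliability index at the inflated load,
# then apply the quoted FORM result: beta ~ 1 / sqrt(load spectrum magnitude).
beta_scaled = -norm.ppf(nu_scaled / nu0)
beta_actual = beta_scaled * np.sqrt(s)
nu_actual = nu0 * norm.sf(beta_actual)

T = 3.0 * 3600.0  # a 3-hour stationary sea state
print(f"scaled-down rate: {nu_actual:.2e} 1/s, "
      f"P(no exceedance in 3 h) = {np.exp(-nu_actual * T):.3f}")
```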

  6. Evaluation of the HadGEM3-A simulations in view of detection and attribution of human influence on extreme events in Europe

    Science.gov (United States)

    Vautard, Robert; Christidis, Nikolaos; Ciavarella, Andrew; Alvarez-Castro, Carmen; Bellprat, Omar; Christiansen, Bo; Colfescu, Ioana; Cowan, Tim; Doblas-Reyes, Francisco; Eden, Jonathan; Hauser, Mathias; Hegerl, Gabriele; Hempelmann, Nils; Klehmet, Katharina; Lott, Fraser; Nangini, Cathy; Orth, René; Radanovics, Sabine; Seneviratne, Sonia I.; van Oldenborgh, Geert Jan; Stott, Peter; Tett, Simon; Wilcox, Laura; Yiou, Pascal

    2018-04-01

    A detailed analysis is carried out to assess the HadGEM3-A global atmospheric model's skill in simulating extreme temperatures, precipitation and storm surges in Europe, in view of their attribution to human influence. The analysis is based on an ensemble of 15 atmospheric simulations forced with observed sea surface temperatures for the 54-year period 1960-2013. These simulations, together with dual simulations without human influence in the forcing, are intended for use in weather and climate event attribution. The analysis investigates the main processes leading to extreme events, including atmospheric circulation patterns, their links with temperature extremes, and land-atmosphere and troposphere-stratosphere interactions. It also compares observed and simulated variability, trends and generalized extreme value parameters for temperature and precipitation. One of the most striking findings is the ability of the model to capture North Atlantic atmospheric weather regimes, as obtained from a cluster analysis of sea level pressure fields. The model also reproduces the main observed weather patterns responsible for temperature and precipitation extremes. However, biases are found in many physical processes. Slightly excessive drying may be the cause of an overestimated summer interannual variability and too-intense heat waves, especially in central/northern Europe; this does not, however, seem to hinder the proper simulation of summer temperature trends. Cold extremes appear well simulated, as do the underlying blocking frequency and stratosphere-troposphere interactions. Extreme precipitation amounts are overestimated and too variable. The atmospheric conditions leading to storm surges were also examined in the Baltic region: simulated weather conditions do not appear to produce strong enough storm surges, but winds were found to be in very good agreement with reanalyses. The performance in reproducing atmospheric weather patterns …

  7. Orographic precipitation at global and regional scales: Observational uncertainty and evaluation of 25-km global model simulations

    Science.gov (United States)

    Schiemann, Reinhard; Roberts, Charles J.; Bush, Stephanie; Demory, Marie-Estelle; Strachan, Jane; Vidale, Pier Luigi; Mizielinski, Matthew S.; Roberts, Malcolm J.

    2015-04-01

    Precipitation over land exhibits a high degree of variability due to the complex interaction of the precipitation-generating atmospheric processes with coastlines, the heterogeneous land surface, and orography. Global general circulation models (GCMs) have traditionally had very limited ability to capture this variability on the mesoscale (here ~50-500 km) due to their low resolution. This has changed with recent investments in resolution, and ensembles of multidecadal climate simulations of atmospheric GCMs (AGCMs) with ~25 km grid spacing are becoming increasingly available. Here, we evaluate the mesoscale precipitation distribution in one such set of simulations, obtained in the UPSCALE (UK on PRACE - weather-resolving Simulations of Climate for globAL Environmental risk) modelling campaign with the HadGEM3-GA3 AGCM. Increased model resolution also poses new challenges to the observational datasets used to evaluate models. Global gridded data products such as those provided by the Global Precipitation Climatology Project (GPCP) are invaluable for assessing large-scale features of the precipitation distribution but may not sufficiently resolve mesoscale structures. In the absence of independent estimates, the intercomparison of different observational datasets may be the only way to gain insight into the uncertainties associated with these observations. Here, we focus on mid-latitude continental regions where observations based on higher-density gauge networks are available in addition to the global data sets: Europe/the Alps, South and East Asia, and the continental US. The ability of GCMs to represent mesoscale variability is of interest in its own right, as climate information on this scale is required by impact studies. An additional motivation arises from continuing efforts to quantify the components of the global radiation budget and water cycle. Recent estimates based on radiation measurements suggest that the global mean …

  8. Sensitivity analysis of local uncertainties in large break loss-of-coolant accident (LB-LOCA) thermo-mechanical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Ikonen, Timo

    2016-08-15

    Highlights: • A sensitivity analysis using data from EPR LB-LOCA simulations is performed. • A procedure to analyze such complex data is outlined. • Both visual and quantitative methods are used. • Input factors related to core design are identified as the most significant. - Abstract: In this paper, a sensitivity analysis of the data originating from a large break loss-of-coolant accident (LB-LOCA) analysis of an EPR-type nuclear power plant is presented. In the preceding LOCA analysis, the number of fuel rods failing in the accident was established (Arkoma et al., 2015); however, the underlying causes of rod failure were not addressed. It is essential to identify which input parameters and boundary conditions are significant for the outcome of the analysis, i.e. the ballooning and burst of the rods. Due to the complexity of the existing data, the first part of the analysis consists of defining the input parameters relevant for the sensitivity analysis. Selected sensitivity measures are then calculated between the chosen input and output parameters. The ultimate goal is to develop a systematic procedure for the sensitivity analysis of statistical LOCA simulation that takes into account the various sources of uncertainty in the calculation chain. In the current analysis, the most relevant parameters with respect to cladding integrity are the decay heat power during the transient, the thermal-hydraulic conditions at the rod's location in the reactor, and the steady-state irradiation history of the rod. Meanwhile, the tolerances in fuel manufacturing parameters were found to have a negligible effect on cladding deformation.

  9. Simulating extreme low-discharge events for the Rhine using a stochastic model

    Science.gov (United States)

    Macian-Sorribes, Hector; Mens, Marjolein; Schasfoort, Femke; Diermanse, Ferdinand; Pulido-Velazquez, Manuel

    2017-04-01

    The specific features of hydrological droughts make them more difficult to analyse than other water-related phenomena: time scales are longer (months to several years), so fewer historical events are available, and drought severity and the associated damage depend on a combination of variables with no clear prevalence (e.g., total water deficit, maximum deficit and duration). As part of drought risk analysis, which aims to provide insight into the variability of hydrological conditions and the associated socio-economic impacts, long synthetic time series should therefore be developed. In this contribution, we increase the length of the available inflow time series using stochastic autoregressive modelling. This enhancement can improve the characterization of the extreme range and can define extreme droughts with similar return periods but different patterns that can lead to distinctly different damages. The methodology consists of: 1) fitting an autoregressive model (AR, ARMA, …) to the available records; 2) generating extended time series (thousands of years); 3) performing a frequency analysis with different characteristic variables (total deficit, maximum deficit and so on); and 4) selecting extreme drought events associated with different characteristic variables and return periods. The methodology was applied to the Rhine river discharge at Lobith, where the Rhine enters the Netherlands. A monthly ARMA(1,1) autoregressive model with seasonally varying parameters was fitted and successfully validated against the historical records available since 1901. The maximum monthly deficit with respect to a threshold value of 1800 m3/s and the average discharge over a given time span (in m3/s) were chosen as indicators to identify drought periods. A synthetic series of 10,000 years of discharges was generated using the validated ARMA model. Two time spans were considered in the analysis: the whole calendar year and the half-year period between April and September.
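    A minimal sketch of steps 1)-4), assuming synthetic stand-in data and ignoring the seasonally varying parameters the study fits; the 1800 m3/s threshold and the 10,000-year series length come from the record, everything else is illustrative.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)

# Stand-in for the historical monthly discharge record at Lobith (the study
# uses observations since 1901): an AR(1)-like series around 2200 m3/s.
noise = rng.normal(0.0, 300.0, size=12 * 116)
hist = np.empty_like(noise)
x = 0.0
for i, eps in enumerate(noise):
    x = 0.6 * x + eps
    hist[i] = 2200.0 + x

# 1) Fit ARMA(1,1) (= ARIMA with d = 0); the study additionally lets the
#    parameters vary by calendar month, which is omitted here.
res = ARIMA(hist, order=(1, 0, 1)).fit()

# 2) Generate a long synthetic series (the study generates 10,000 years).
n_years = 10_000
synth = res.simulate(nsimulations=12 * n_years)

# 3) Frequency analysis of a drought indicator: the annual maximum monthly
#    deficit below the 1800 m3/s threshold used in the record.
deficit = np.maximum(1800.0 - synth, 0.0).reshape(n_years, 12)
annual_max = deficit.max(axis=1)

# 4) Empirical return period of a given deficit level.
level = np.quantile(annual_max, 0.99)
print(f"deficit {level:.0f} m3/s exceeded ~once per "
      f"{1.0 / (annual_max > level).mean():.0f} years")
```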

  10. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    Science.gov (United States)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible heat fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. Uncertainties in the surface fluxes affect the derived large-scale forcing data and, further, the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for shallow cumulus clouds, which are mainly driven by surface forcing. This study aims to quantify the uncertainties in the large-scale forcing caused by surface turbulence flux measurements and to investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  11. ARIANNE. Analytical uncertainties. Influential simulation factors in the final isotopic inventory; ARIANNE. Incertidumbres analiticas. Factores de simulacion influyentes en el inventario de la isotopia final

    Energy Technology Data Exchange (ETDEWEB)

    Morales Prieto, M.; Ortega Saiz, P.

    2011-07-01

    Analysis of the analytical uncertainties of the methodology used to simulate the processes that yield the final isotopic inventory of spent fuel; within the ARIANE experiment, this work explores the burnup simulation part.

  12. Development and comparison of Bayesian modularization method in uncertainty assessment of hydrological models

    Science.gov (United States)

    Li, L.; Xu, C.-Y.; Engeland, K.

    2012-04-01

    With respect to model calibration, parameter estimation and the analysis of uncertainty sources, different approaches have been used in hydrological modelling. The Bayesian method is one of the most widely used for the uncertainty assessment of hydrological models, as it incorporates different sources of information into a single analysis through Bayes' theorem. However, existing applications do not treat the uncertainty in the extreme flows of hydrological model simulations well. This study proposes a Bayesian modularization approach for the uncertainty assessment of conceptual hydrological models that takes the extreme flows into account. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and by traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used with the traditional Bayesian models: an AR(1) plus Normal, time-period-independent model (Model 1); an AR(1) plus Normal, time-period-dependent model (Model 2); and an AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimation over the entire flow range as well as in application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian inference. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
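    The record names the Metropolis-Hastings algorithm as the sampler; the sketch below implements a generic random-walk MH loop for a toy rainfall-runoff stand-in with an AR(1)-plus-Normal error model (cf. Model 1). The toy model, error parameters and flat prior are assumptions, not WASMOD.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the hydrological model: discharge = a * rain + b
rain = rng.gamma(2.0, 2.0, size=200)
q_obs = 1.5 * rain + 3.0 + rng.normal(0.0, 1.0, size=200)

def log_likelihood(theta):
    """AR(1)-plus-Normal error model on the residuals (cf. Model 1);
    rho and sigma are fixed here for brevity."""
    resid = q_obs - (theta[0] * rain + theta[1])
    rho, sigma = 0.3, 1.0
    innov = resid[1:] - rho * resid[:-1]
    return -0.5 * np.sum(innov ** 2) / sigma ** 2

def metropolis_hastings(log_like, theta0, n_iter=5000, step=0.05):
    """Random-walk Metropolis-Hastings over the model parameters."""
    theta = np.asarray(theta0, dtype=float)
    ll = log_like(theta)
    chain = [theta]
    for _ in range(n_iter):
        prop = chain[-1] + rng.normal(0.0, step, size=theta.size)
        ll_prop = log_like(prop)
        if np.log(rng.uniform()) < ll_prop - ll:  # accept with prob min(1, ratio)
            chain.append(prop)
            ll = ll_prop
        else:
            chain.append(chain[-1])
    return np.array(chain)

chain = metropolis_hastings(log_likelihood, theta0=[1.0, 0.0])
print("posterior means after burn-in:", chain[1000:].mean(axis=0).round(2))
```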

  13. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
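    The core assimilation step the record describes is an iterative ensemble Kalman update of the Reynolds-stress perturbation parameters against sparse observations. A minimal sketch of one such update on a toy linear forward map; the map, ensemble size and noise levels are assumptions, not the paper's RANS setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def enkf_update(params, pred_obs, obs, obs_var):
    """One ensemble Kalman analysis step: shift each parameter member using
    the cross-covariance between parameters and predicted observations."""
    n = params.shape[0]
    P = params - params.mean(axis=0)      # parameter anomalies (n_ens, n_par)
    Y = pred_obs - pred_obs.mean(axis=0)  # predicted-obs anomalies (n_ens, n_obs)
    K = (P.T @ Y / (n - 1)) @ np.linalg.inv(Y.T @ Y / (n - 1) + np.diag(obs_var))
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=pred_obs.shape)
    return params + (perturbed - pred_obs) @ K.T

# Toy linear forward map standing in for "RANS solve -> sparse velocity obs"
def forward(theta):
    return np.stack([2.0 * theta[:, 0] + theta[:, 1], theta[:, 0] - theta[:, 1]], axis=1)

truth = np.array([1.0, -0.5])
obs = forward(truth[None, :])[0] + rng.normal(0.0, 0.05, size=2)
ens = rng.normal(0.0, 1.0, size=(100, 2))  # prior ensemble of perturbation params
for _ in range(5):                          # iterate: re-run model, update again
    ens = enkf_update(ens, forward(ens), obs, obs_var=np.full(2, 0.05 ** 2))
print("posterior mean:", ens.mean(axis=0).round(2), "truth:", truth)
```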

  14. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach

  15. Simulating extreme environments: Ergonomic evaluation of Chinese pilot performance and heat stress tolerance.

    Science.gov (United States)

    Li, Jing; Tian, Yinsheng; Ding, Li; Zou, Huijuan; Ren, Zhaosheng; Shi, Liyong; Feathers, David; Wang, Ning

    2015-06-05

    High temperatures in the cockpit environment can adversely influence pilot behavior and performance. This study investigated the impact of high thermal environments on Chinese pilot performance in a simulated cockpit environment. Ten subjects volunteered to participate in tests under 40°C and 45°C high-temperature simulations in an environmentally controlled chamber. Measures such as grip strength, perception, dexterity, somatic sense reaction, and analytical reasoning were taken, and the results were compared to the Combined Index of Heat Stress (CIHS). At 40°C, CIHS exceeded the heat stress safety limit after 45 min; grip strength decreased by 12% and somatic perception became 2.89 times larger than its initial value. At 45°C, CIHS exceeded the safety limit after only 20 min, while grip strength decreased by just 3.2% and somatic perception increased to 4.36 times its initial value. Reaction time and finger dexterity were not statistically different from baseline measurements, but the error rate in the analytical reasoning test rose markedly. Somatic perception was the most sensitive index to high temperature, followed by grip strength. The results of this paper may help improve the environmental control design of new fighter cockpits and support research on pilot physiology and cockpit ergonomics for Chinese pilots.

  16. Future Precipitation Extremes in China Under Climate Change and Their Possible Mechanisms by Regional Climate Model and Earth System Model Simulations

    Science.gov (United States)

    Qin, P.; Xie, Z.

    2017-12-01

    Future precipitation extremes in China for the middle and end of the 21st century are examined with six simulations using the regional climate model RegCM4 (RCM) and 17 global climate models (GCMs) that participated in the Coupled Model Intercomparison Project Phase 5 (CMIP5). Before examining the future changes, we reviewed the performance of the simulated precipitation extremes and found that both the CMIP5 models and the RCM capture the temporal and spatial patterns of historical precipitation extremes in China. In the mid-future period 2039-2058 (MF) and the far-future period 2079-2098 (FF), more wet precipitation extremes will occur over most of China relative to the present period 1982-2001 (RF). We quantified the rates of change in precipitation extremes in China against the changes in surface air temperature (T2M) for the MF and FF periods. Changes in the extreme precipitation index R95p were found to be around 5% per K for the MF period and 10% per K for the FF period, and changes in maximum 5-day precipitation (Rx5day) around 4% per K for the MF period and 7% per K for the FF period, respectively. Finally, the possible physical mechanisms behind the changes in precipitation extremes in China are discussed through the changes in specific humidity and vertical wind.
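    The two indices quoted, R95p and Rx5day, are standard ETCCDI-style extremes indices; a minimal sketch with synthetic daily data, where the gamma-distributed rainfall and the 20-year base period are placeholders.

```python
import numpy as np

def r95p(daily_pr, base_pr):
    """R95p: precipitation total from days exceeding the 95th percentile of
    wet days (>= 1 mm) in the base period."""
    p95 = np.percentile(base_pr[base_pr >= 1.0], 95)
    return daily_pr[daily_pr > p95].sum()

def rx5day(daily_pr):
    """Rx5day: maximum consecutive 5-day precipitation amount."""
    return np.convolve(daily_pr, np.ones(5), mode="valid").max()

rng = np.random.default_rng(0)
base = rng.gamma(0.5, 8.0, size=365 * 20)  # stand-in for 1982-2001 daily data
year = rng.gamma(0.5, 9.0, size=365)       # stand-in for one future year
print(f"R95p = {r95p(year, base):.0f} mm, Rx5day = {rx5day(year):.0f} mm")
```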

  17. Robots integrated with virtual reality simulations for customized motor training in a person with upper extremity hemiparesis: a case report

    Science.gov (United States)

    Fluet, Gerard G.; Merians, Alma S.; Qiu, Qinyin; Lafond, Ian; Saleh, Soha; Ruano, Viviana; Delmonico, Andrea R.; Adamovich, Sergei V.

    2014-01-01

    Background and Purpose: A majority of studies examining robot-facilitated repetitive task practice for the treatment of upper extremity paresis utilize standardized protocols applied to large groups. Others utilize interventions tailored to patients but do not describe the clinical decision-making process used to develop and modify them. This case report describes a robot-based intervention customized to match the goals and clinical presentation of a man with upper extremity hemiparesis secondary to stroke. Methods: PM is an 85-year-old man with left hemiparesis secondary to an intracerebral hemorrhage five years prior to examination. Outcomes were measured before and after a one-month period of home therapy and after a one-month robotic intervention. The intervention was designed to address specific impairments identified during his physical therapy examination; when necessary, activities were modified based on the patient's response to his first week of treatment. Outcomes: PM trained for twelve sessions using six virtually simulated activities. Modifications to the original configurations of these activities resulted in performance improvements in five of them. PM demonstrated a 35-second improvement in Jebsen Test of Hand Function time and a 44-second improvement in Wolf Motor Function Test time subsequent to the robotic training intervention. Reaching kinematics, 24-hour activity measurement and the Hand and Activities of Daily Living scales of the Stroke Impact Scale all improved as well. Discussion: A customized program of robotically facilitated rehabilitation resulted in large short-term improvements in several measures of upper extremity function in a patient with chronic hemiparesis. PMID:22592063

  18. Effect of linear and non-linear blade modelling techniques on simulated fatigue and extreme loads using Bladed

    Science.gov (United States)

    Beardsell, Alec; Collier, William; Han, Tao

    2016-09-01

    There is a trend in the wind industry towards ever larger and more flexible turbine blades. Blade tip deflections in modern blades now commonly exceed 10% of blade length. Historically, the dynamic response of wind turbine blades has been analysed using linear models of blade deflection, which assume small deflections. For modern flexible blades this assumption is becoming less valid. In order to continue to simulate dynamic turbine performance accurately, routine use of non-linear models of blade deflection may be required. This can be achieved by representing the blade as a connected series of individual flexible linear bodies - referred to in this paper as the multi-part approach. In this paper, Bladed is used to compare load predictions from single-part and multi-part blade models for several turbines. The study examines the impact on fatigue and extreme loads and blade deflection through reduced sets of load calculations based on IEC 61400-1 ed. 3. Damage equivalent load changes of up to 16% and extreme load changes of up to 29% are observed at some turbine load locations. No general pattern is found in the loading differences between single-part and multi-part blade models; rather, changes in fatigue and extreme loads with a multi-part blade model depend on the characteristics of the individual turbine and blade. Key underlying causes of damage equivalent load change are identified as differences in edgewise-torsional coupling between the multi-part and single-part models, and increased edgewise rotor mode damping in the multi-part model. Similarly, a causal link is identified between torsional blade dynamics and changes in ultimate load results.
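    The damage equivalent load (DEL) quoted in the record is conventionally defined through Miner's rule as the constant-amplitude load range causing the same fatigue damage as the full spectrum; a sketch with a hypothetical cycle spectrum (the S-N exponent, cycle counts and the +16% case are all illustrative).

```python
import numpy as np

def damage_equivalent_load(ranges, counts, m=10.0, n_eq=1e7):
    """Constant-amplitude load range that, applied n_eq times, gives the same
    Miner's-rule damage as the spectrum; m is the S-N curve exponent."""
    ranges = np.asarray(ranges, dtype=float)
    counts = np.asarray(counts, dtype=float)
    return (np.sum(counts * ranges ** m) / n_eq) ** (1.0 / m)

# Hypothetical blade-root load-range spectrum (e.g. from rainflow counting)
ranges = [1.0, 2.0, 3.5, 5.0]   # load ranges (MNm)
counts = [1e6, 2e5, 1e4, 5e2]   # lifetime cycle counts
base = damage_equivalent_load(ranges, counts)
multi = damage_equivalent_load(np.multiply(ranges, 1.16), counts)  # a +16% case
print(f"DEL: {base:.2f} MNm (baseline) vs {multi:.2f} MNm (+16% ranges)")
```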

  19. Calculating Remote Sensing Reflectance Uncertainties Using an Instrument Model Propagated Through Atmospheric Correction via Monte Carlo Simulations

    Science.gov (United States)

    Karakoylu, E.; Franz, B.

    2016-01-01

    This work presents a first attempt at quantifying uncertainties in satellite measurements of ocean remote sensing reflectance. It is based on 1000 Monte Carlo iterations, with a SeaWiFS 4-day composite from 2003 as the data source. The uncertainty is derived for remote sensing reflectance (Rrs) at 443 nm.
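    The record gives only the outline (an instrument noise model propagated through atmospheric correction by Monte Carlo), so the sketch below uses a toy correction chain; none of the symbols or values correspond to the operational SeaWiFS processing.

```python
import numpy as np

rng = np.random.default_rng(0)

def rrs(lt, lr, la, t):
    """Toy correction chain: Rrs = (Lt - Lr - La) / (t * F0). Illustrative
    only; not the operational SeaWiFS atmospheric correction."""
    F0 = 190.0  # nominal solar irradiance scale (illustrative)
    return (lt - lr - la) / (t * F0)

n = 1000                           # the record uses 1000 MC iterations
lt = rng.normal(9.00, 0.05, n)     # top-of-atmosphere radiance + sensor noise
lr = rng.normal(6.50, 0.02, n)     # Rayleigh term uncertainty
la = rng.normal(1.20, 0.04, n)     # aerosol term uncertainty
t = rng.normal(0.92, 0.005, n)     # diffuse transmittance uncertainty

samples = rrs(lt, lr, la, t)
print(f"Rrs(443) = {samples.mean():.5f} +/- {samples.std():.5f} sr^-1")
```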

  20. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  1. Simulation and Measurement of Through-the-Earth, Extremely Low-Frequency Signals Using Copper-Clad Steel Ground Rods.

    Science.gov (United States)

    Damiano, Nicholas William; Yan, Lincan; Whisner, Bruce; Zhou, Chenming

    2017-01-01

    The underground mining environment can greatly affect radio signal propagation. Understanding how the earth affects signal propagation is a key to evaluating communications systems used during a mine emergency. One type of communication system is through-the-earth, which can utilize extremely low frequencies (ELF). This paper presents the simulation and measurement results of recent National Institute for Occupational Safety and Health (NIOSH) research aimed at investigating current injection at ELF, and in particular, ground contact impedance. Measurements were taken at an outside surface testing location. The results obtained from modeling and measurement are characterized by electrode impedance, and the voltage received between two distant electrodes. This paper concludes with a discussion of design considerations found to affect low-frequency communication systems utilizing ground rods to inject a current into the earth.

  2. Future changes in summer mean and extreme precipitation frequency in Japan by d4PDF regional climate simulations

    Science.gov (United States)

    Okada, Y.; Ishii, M.; Endo, H.; Kawase, H.; Sasaki, H.; Takayabu, I.; Watanabe, S.; Fujita, M.; Sugimoto, S.; Kawazoe, S.

    2017-12-01

    Precipitation in summer plays a vital role in sustaining life across East Asia, but the heavy rain often generated during this period can also cause serious damage. Developing a better understanding of the features and occurrence frequency of this heavy rain is an important element of disaster prevention. We investigated future changes in summer mean and extreme precipitation frequency in Japan using a large ensemble of simulations from the Non-Hydrostatic Regional Climate Model with a horizontal resolution of 20 km (NHRCM20). This dataset, the database for Policy Decision making for Future climate change (d4PDF), is intended to be used for impact assessment studies and adaptation planning for global warming. The future climate experiments assume global mean surface air temperature rises of 2 K and 4 K above the pre-industrial period. Using this dataset, we investigated future changes in summer precipitation over the Japanese archipelago at observational locations. For mean precipitation in the present-day climate, the bias of the rainfall in each month is within 25% even when all 30 members are considered. The bias at individual locations exceeds 50% on the Pacific Ocean side of eastern Japan and at interior locations of western Japan; the result in western Japan depends on the representation of elevation in the model. The future changes in mean precipitation show a contrast between northern and southern Japan, with the north showing a slight increase but the south a decrease. The frequency of extreme precipitation in the national average for Japan increases in both the 2 K and 4 K simulations compared with the present-day climate. The authors were supported by the Social Implementation Program on Climate Change Adaptation Technology (SI-CAT) of the Ministry of Education, Culture, Sports, Science, and Technology (MEXT), Japan.

  3. Microwave tomography of extremities: 2. Functional fused imaging of flow reduction and simulated compartment syndrome

    International Nuclear Information System (INIS)

    Semenov, Serguei; Nair, Bindu; Kellam, James; Williams, Thomas; Quinn, Michael; Sizov, Yuri; Nazarov, Alexei; Pavlovsky, Andrey

    2011-01-01

    Medical imaging has recently expanded into the dual- or multi-modality fusion of anatomical and functional imaging modalities. This significantly improves diagnostic power while simultaneously increasing the cost of already expensive medical devices or investigations and decreasing their mobility. We introduce a novel imaging concept of four-dimensional (4D) microwave tomographic (MWT) functional imaging: three-dimensional (3D) in the spatial domain plus one-dimensional (1D) in the time, or functional dynamic, domain. Instead of fusing images obtained by different imaging modalities, 4D MWT fuses absolute anatomical images with dynamic, differential images from the same imaging technology. The approach was successfully validated in animal experiments with short-term arterial flow reduction and a simulated compartment syndrome, in an initial simplified experimental setting using a dedicated MWT system. The presented fused images are not perfect, as MWT is a novel imaging modality at an early stage of development, and ways of reading reconstructed MWT images need to be further studied and understood. However, the reconstructed fused images present clear evidence that microwave tomography is an emerging imaging modality with great potential for functional imaging.

  4. Simulated stability tests of a small articulated tractor designed for extreme-sloped vineyards

    Directory of Open Access Journals (Sweden)

    F. Mazzetto

    2013-09-01

    A new reversible wheeled articulated tractor, designed to work in terraced vineyards trained with the "pergola" system common in mountain areas, is described here in its latest version and analysed through numerical simulations. This tractor has the small dimensions necessary to operate in that environment, and its central articulation has two rotational degrees of freedom. These features are strong design points but could be critical for the vehicle's stability, as they affect the dimensions and shape of the supporting base. The tractor was therefore equipped with a new automatic safety system: a self-locking articulation activated by contact sensors on the wheels. This device makes the vehicle partially rigid in case of lateral unbalancing, so that rollover can happen only by overcoming the whole vehicle mass. A mathematical description of vehicle-ground interactions was implemented to investigate the tractor's behaviour in depth in different configurations (straight, angled) at increasing values of ground slope; roll and pitch stability indexes were then computed and used for comparison with conventional tractors. Thanks to the low centre of gravity, the resulting rollover angle with the vehicle in the straight configuration is promising: 43.8°, i.e. a 96% slope, greater than the maximum lateral (20°, a 36% slope) and frontal (38°, a 78% slope) slope angles ever recorded on terraced vineyards. The rollover angle is lower when the tractor turns.
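    The degree-to-percent pairs in the record are just tan(angle) x 100, and the quoted rollover limit can be reproduced with a quasi-static rigid-vehicle model; the track width and centre-of-gravity height below are hypothetical values chosen to match the reported 43.8°.

```python
import numpy as np

def rollover_angle_deg(track_width, cg_height):
    """Quasi-static lateral rollover limit of a rigid vehicle: the slope at
    which the weight vector leaves the support base, atan(0.5*track / h_cg)."""
    return np.degrees(np.arctan(0.5 * track_width / cg_height))

def slope_pct(angle_deg):
    """Slope angle to percent grade: 100 * tan(angle)."""
    return 100.0 * np.tan(np.radians(angle_deg))

track, h_cg = 1.15, 0.60  # hypothetical geometry matching the reported limit
phi = rollover_angle_deg(track, h_cg)
print(f"rollover at {phi:.1f} deg = {slope_pct(phi):.0f}% slope")
for a in (43.8, 20.0, 38.0):  # the record's angle/percent pairs
    print(f"{a:4.1f} deg -> {slope_pct(a):.0f}%")
```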

  5. Flooding Simulation of Extreme Event on Barnegat Bay by High-Resolution Two Dimensional Hydrodynamic Model

    Science.gov (United States)

    Wang, Y.; Ramaswamy, V.; Saleh, F.

    2017-12-01

    Barnegat Bay, located on the east coast of New Jersey, United States, is separated from the Atlantic Ocean by the narrow Barnegat Peninsula, which acts as a barrier island. The bay is fed by several rivers that empty through small estuaries along the inner shore. In terms of vulnerability to flooding, the Barnegat Peninsula is under the influence of both coastal storm surge and riverine flooding. Barnegat Bay was hit by Hurricane Sandy, which caused flood damage with extensive cross-island flow at many streets perpendicular to the shoreline. The objective of this work is to identify and quantify the sources of flooding using a two-dimensional inland hydrodynamic model. The hydrodynamic model was forced by three observed coastal boundary conditions and one hydrologic boundary condition from the United States Geological Survey (USGS). Model reliability was evaluated against both the FEMA spatial flooding extent and USGS high water marks. The simulated flooding extent showed good agreement with the reanalysis spatial inundation extents. The results offer important perspectives on the flow of water into the bay and on the velocity and depth of the inundated areas. Such information can help emergency managers and decision makers plan evacuations and deploy flood defenses.

  6. Assessing uncertainty and sensitivity of model parameterizations and parameters in WRF affecting simulated surface fluxes and land-atmosphere coupling over the Amazon region

    Science.gov (United States)

    Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.

    2016-12-01

    This study aims to quantify the relative importance and uncertainties of different physical processes and parameters affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose land-atmosphere interaction and coupling strength. Observations made by the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign, together with satellite and reanalysis data, are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of WRF simulations using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multi-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multi-dimensional parameter space. Three uncertainty quantification approaches are applied for the sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. The results show consistent parameter sensitivity across the different SA methods. We find that 5 out of 20 parameters contribute more than 90% of the total variance, and that first-order effects dominate over interaction effects. The results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for …
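    A toy version of the variance attribution across scheme groups: build a factorial ensemble and, for each factor, take the variance of its group means as a fraction of the total variance (a first-order, main-effect measure). The factor counts mirror the record's design; the additive effects and the full-factorial layout are assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Factor levels mirroring the stratified design: 6 microphysics, 3 convection,
# 6 PBL/surface-layer, 3 land-surface schemes (effect sizes are invented).
levels = {"microphysics": 6, "convection": 3, "pbl": 6, "land_surface": 3}
spread = {"microphysics": 1.0, "convection": 0.5, "pbl": 0.8, "land_surface": 0.3}
effects = {k: rng.normal(0.0, spread[k], n) for k, n in levels.items()}

combos = list(itertools.product(*(range(n) for n in levels.values())))
y = np.array([sum(effects[k][c[j]] for j, k in enumerate(levels)) for c in combos])
idx = np.array(combos)

# Main-effect variance fraction per factor: variance of the group means
# over the total variance (the first-order term of a multi-way ANOVA).
for j, k in enumerate(levels):
    gm = [y[idx[:, j] == g].mean() for g in range(levels[k])]
    print(f"{k:13s} explains {np.var(gm) / np.var(y):5.1%} of total variance")
```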

  7. Assessment of WRF microphysics schemes to simulate extreme precipitation events from the perspective of GMI radiative signatures

    Science.gov (United States)

    Choi, Y.; Shin, D. B.; Joh, M.

    2015-12-01

    Numerical simulations of precipitation depend to a large degree on the assumed cloud microphysics schemes representing the formation, growth and fallout of cloud droplets and ice crystals. Recent studies show that the assumed cloud microphysics play a major role not only in forecasting precipitation, especially for extreme precipitation events, but also in the quality of passive microwave rainfall estimation. The evaluation of the various Weather Research and Forecasting (WRF) model microphysics schemes in this study is based on a method originally developed to construct a priori databases of precipitation profiles and associated brightness temperatures (TBs) for precipitation retrievals. This methodology generates three-dimensional (3D) precipitation fields by matching the GPM dual-frequency radar (DPR) reflectivity profiles with those calculated from cloud-resolving model (CRM)-derived hydrometeor profiles. The method eventually provides 3D simulated precipitation fields over the DPR scan swaths; that is, atmospheric and hydrometeor profiles can be generated at each DPR pixel based on the CRM and DPR reflectivity profiles. The generated raining systems over the DPR observation fields can be applied to any radiometer not accompanied by a radar for microwave radiative calculation, with consideration of each sensor's channels and field of view. An assessment of the WRF model microphysics schemes for several typhoon cases, in terms of the emission and scattering signals of GMI, will be discussed.

  8. Informal uncertainty analysis (GLUE) of continuous flow simulation in a hybrid sewer system with infiltration inflow – consistency of containment ratios in calibration and validation?

    Directory of Open Access Journals (Sweden)

    A. Breinholt

    2013-10-01

    Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, models are most often calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently the GLUE methodology has been criticised for generating prediction limits without statistical coherence and consistency, and for the subjectivity in the choice of a threshold value to distinguish "behavioural" from "non-behavioural" parameter sets. In this paper we examine how well the GLUE methodology performs when the behavioural parameter sets deduced from a calibration period are applied to generate prediction bounds in validation periods. By retaining an increasing number of parameter sets we aim at obtaining consistency between the GLUE-generated 90% prediction limits and the actual containment ratio (CR) in calibration. Due to the large uncertainties related to spatio-temporal rain variability during heavy convective rain events, flow measurement errors, possible model deficiencies and epistemic uncertainties, it was not possible to obtain an overall CR of more than 80%. However, the GLUE-generated prediction limits still proved rather consistent, since the overall CRs obtained in calibration corresponded well with the overall CRs obtained in validation periods for all proportions of retained parameter sets evaluated. When focusing on wet and dry weather periods separately, some inconsistencies were found between calibration and validation, and we address some of the reasons why we should not expect the coverage of the prediction limits to be identical in calibration and validation periods in real …
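    The GLUE recipe the record describes (Monte Carlo parameter sampling, an informal likelihood, a subjective behavioural threshold, prediction limits and a containment ratio) can be sketched end-to-end on a toy model; the linear-reservoir model, the NSE likelihood, the 0.6 threshold and the noise model are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the informal GLUE likelihood."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def model(k, rain):
    """Toy linear-reservoir stand-in for the lumped sewer model."""
    q, s = np.zeros_like(rain), 0.0
    for i, r in enumerate(rain):
        s += r
        q[i] = k * s  # outflow proportional to storage
        s -= q[i]
    return q

rain = rng.gamma(0.3, 5.0, size=365)
obs = model(0.35, rain) * rng.lognormal(0.0, 0.10, size=365)  # noisy "truth"

# GLUE: Monte Carlo parameter sampling; keep "behavioural" sets above a
# subjectively chosen likelihood threshold.
ks = rng.uniform(0.05, 0.95, size=1000)
scores = np.array([nse(model(k, rain), obs) for k in ks])
behavioural = scores > 0.6

# 90% prediction limits from the behavioural runs (GLUE usually weights by
# likelihood; plain quantiles keep the sketch short) and containment ratio.
sims = np.stack([model(k, rain) for k in ks[behavioural]])
lo, hi = np.quantile(sims, [0.05, 0.95], axis=0)
cr = np.mean((obs >= lo) & (obs <= hi))
print(f"{behavioural.sum()} behavioural sets, containment ratio = {cr:.2f}")
```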

  9. Monte Carlo simulation for uncertainty estimation on structural data in implicit 3-D geological modeling, a guide for disturbance distribution selection and parameterization

    Science.gov (United States)

    Pakyuz-Charrier, Evren; Lindsay, Mark; Ogarko, Vitaliy; Giraud, Jeremie; Jessell, Mark

    2018-04-01

    Three-dimensional (3-D) geological structural modeling aims to determine geological information in a 3-D space using structural data (foliations and interfaces) and topological rules as inputs. This is necessary in any project in which the properties of the subsurface matter, as such models express our understanding of geometries in depth. For that reason, 3-D geological models have a wide range of practical applications including, but not restricted to, civil engineering, the oil and gas industry, the mining industry, and water management. These models, however, are fraught with uncertainties originating from the inherent flaws of the modeling engines (working hypotheses, interpolator parameterization) and from the inherent lack of knowledge in areas where there are no observations, combined with input uncertainty (observational, conceptual and technical errors). Because 3-D geological models are often used for impactful decision-making, it is critical that they provide accurate estimates of uncertainty. This paper focuses on the effect of the propagation of structural input data measurement uncertainty in implicit 3-D geological modeling. This aim is achieved using Monte Carlo simulation for uncertainty estimation (MCUE), a stochastic method which samples from predefined disturbance probability distributions that represent the uncertainty of the original input data set. MCUE is used to produce hundreds to thousands of altered unique data sets, which are used as inputs to produce a range of plausible 3-D models. The plausible models are then combined into a single probabilistic model as a means of propagating uncertainty from the input data to the final model. In this paper, several improved methods for MCUE are proposed, pertaining to distribution selection for input uncertainty, sample analysis and statistical consistency of the sampled distribution. Pole vector sampling is proposed as a more rigorous alternative to dip vector …
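    A sketch of the disturbance step MCUE applies to orientation data: perturb a foliation's pole vector by sampling from a von Mises-Fisher distribution centred on it, the distribution family commonly recommended for unit-vector uncertainty. The dip convention and the concentration parameter kappa are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_vmf(mu, kappa, n):
    """Draw n unit vectors from a von Mises-Fisher distribution on the
    sphere centred on unit vector mu (exact inversion for dimension 3)."""
    mu = np.asarray(mu, float) / np.linalg.norm(mu)
    u = rng.uniform(size=n)
    w = 1.0 + np.log(u + (1.0 - u) * np.exp(-2.0 * kappa)) / kappa  # cos(error)
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
    a = np.array([1.0, 0.0, 0.0]) if abs(mu[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    e1 = np.cross(mu, a); e1 /= np.linalg.norm(e1)  # basis perpendicular to mu
    e2 = np.cross(mu, e1)
    s = np.sqrt(np.maximum(1.0 - w ** 2, 0.0))
    return (w[:, None] * mu
            + s[:, None] * (np.cos(phi)[:, None] * e1 + np.sin(phi)[:, None] * e2))

# Pole (normal) of a foliation dipping 30 deg towards the east, in an
# illustrative east-north-up convention.
dip = np.radians(30.0)
pole = np.array([np.sin(dip), 0.0, np.cos(dip)])
disturbed = sample_vmf(pole, kappa=100.0, n=500)  # kappa ~ measurement precision
err = np.degrees(np.arccos(np.clip(disturbed @ pole, -1.0, 1.0)))
print(f"mean angular disturbance: {err.mean():.1f} deg")
```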

  10. Monte Carlo simulation for uncertainty estimation on structural data in implicit 3-D geological modeling, a guide for disturbance distribution selection and parameterization

    Directory of Open Access Journals (Sweden)

    E. Pakyuz-Charrier

    2018-04-01

    Three-dimensional (3-D) geological structural modeling aims to determine geological information in a 3-D space using structural data (foliations and interfaces) and topological rules as inputs. This is necessary in any project in which the properties of the subsurface matter, as such models express our understanding of geometries in depth. For that reason, 3-D geological models have a wide range of practical applications including, but not restricted to, civil engineering, the oil and gas industry, the mining industry, and water management. These models, however, are fraught with uncertainties originating from the inherent flaws of the modeling engines (working hypotheses, interpolator parameterization) and from the inherent lack of knowledge in areas where there are no observations, combined with input uncertainty (observational, conceptual and technical errors). Because 3-D geological models are often used for impactful decision-making, it is critical that they provide accurate estimates of uncertainty. This paper focuses on the effect of the propagation of structural input data measurement uncertainty in implicit 3-D geological modeling. This aim is achieved using Monte Carlo simulation for uncertainty estimation (MCUE), a stochastic method which samples from predefined disturbance probability distributions that represent the uncertainty of the original input data set. MCUE is used to produce hundreds to thousands of altered unique data sets, which are used as inputs to produce a range of plausible 3-D models. The plausible models are then combined into a single probabilistic model as a means of propagating uncertainty from the input data to the final model. In this paper, several improved methods for MCUE are proposed, pertaining to distribution selection for input uncertainty, sample analysis and statistical consistency of the sampled distribution. Pole vector sampling is proposed as a more rigorous alternative to …

  11. Photometric Uncertainties

    Science.gov (United States)

    Zou, Xiao-Duan; Li, Jian-Yang; Clark, Beth Ellen; Golish, Dathon

    2018-01-01

    The OSIRIS-REx spacecraft, launched in September 2016, will study the asteroid Bennu and return a sample from its surface to Earth in 2023. Bennu is a near-Earth carbonaceous asteroid which will provide insight into the formation and evolution of the solar system. OSIRIS-REx will first approach Bennu in August 2018 and will study the asteroid for approximately two years before sampling. OSIRIS-REx will develop its photometric models of Bennu (including Lommel-Seeliger, ROLO, McEwen, Minnaert and Akimov) with OCAM and OVIRS during the Detailed Survey mission phase; the models developed during this phase will be used to photometrically correct the OCAM and OVIRS data. Here we present an analysis of the error in the photometric corrections. Based on our test data sets, we find: 1. Model uncertainties are only correct when computed using the covariance matrix, because the parameters are highly correlated. 2. There is no evidence that any single parameter dominates in any model. 3. Model error and data error contribute comparably to the final correction error. 4. Testing the uncertainty module on synthetic and real data sets shows that model performance depends on data coverage and data quality; these tests gave us a better understanding of how the different models behave in different cases. 5. The Lommel-Seeliger (L-S) model is more reliable than the others, possibly because the simulated data are based on the L-S model; the test on real data (SPDIF) also shows a slight advantage for L-S. ROLO is not reliable for calculating Bond albedo, the uncertainty of the McEwen model is large in most cases, and Akimov behaves unphysically on the SOPIE 1 data. 6. L-S is therefore the better default choice, a conclusion based mainly on our tests on the SOPIE and IPDIF data.
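    Finding 1 is the standard first-order propagation result: with correlated parameters, var(f) = g^T C g requires the full covariance matrix, and keeping only the diagonal misstates the uncertainty. A minimal sketch with hypothetical gradient and covariance values:

```python
import numpy as np

def model_variance(grad, cov):
    """First-order uncertainty propagation: var(f) = g^T C g, where g is the
    model gradient w.r.t. the parameters and C their covariance matrix."""
    g = np.asarray(grad, dtype=float)
    return float(g @ cov @ g)

g = np.array([0.8, -0.6])         # hypothetical df/dp at the fitted values
cov = np.array([[0.040, -0.035],  # strongly anti-correlated parameters
                [-0.035, 0.040]])
print("with full covariance:", model_variance(g, cov))
print("diagonal terms only: ", model_variance(g, np.diag(np.diag(cov))))
```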

  12. The Generic Containment SB-LOCA accident simulation: Comparison of the parameter uncertainties and user-effect

    International Nuclear Information System (INIS)

    Povilaitis, Mantas; Kelm, Stephan; Urbonavičius, Egidijus

    2017-01-01

    Highlights: • Uncertainty and sensitivity analysis for the Generic Containment severe accident. • Comparison of the analysis results with the uncertainties arising from the user effect. • Demonstration that reducing input uncertainties matters as much as reducing the user effect. - Abstract: Uncertainties in the safety assessment of nuclear power plants using computer codes come from several sources: the choice of computer code, the user effect (a strong impact of user choices on the simulation's outcome) and the uncertainty of various physical parameters. The "Generic Containment" activity was performed in the frame of the EU-FP7 project SARNET2 to investigate the influence of the user effect and of the choice of computer code on results at nuclear power plant scale. During this activity, a Generic Containment nodalisation was developed and used by the participants with various computer codes. Even though the model of the Generic Containment and the transient scenario were precisely and uniquely defined, considerably different results were obtained not only among different codes but also among participants using the same code, showing a significant user effect. This paper presents an analysis that extends the "Generic Containment" benchmark and investigates the effect of input parameter uncertainties in comparison to the user effect. Calculations were performed using the computer code ASTEC, and the uncertainty and sensitivity of the results were estimated using the GRS method and the SUSA tool. The results of the present analysis show that, while there are differences between the uncertainty bands of the parameters, the deviation bands caused by parameter uncertainties and by the user effect are in general comparable and of the same order. The properties of concrete and the surface areas may have more influence on containment pressure than the user effect and the choice of computer code identified in the SARNET2 Generic …

  13. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide to the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  14. Uncertainty analysis guide

    Energy Technology Data Exchange (ETDEWEB)

    Andres, T.H

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide to the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner in which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)

  15. Quantitative Analysis of Variability and Uncertainty in Environmental Data and Models. Volume 1. Theory and Methodology Based Upon Bootstrap Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H. Christopher [North Carolina State University, Raleigh, NC (United States); Rhodes, David S. [North Carolina State University, Raleigh, NC (United States)

    1999-04-30

    This is Volume 1 of a two-volume set of reports describing work conducted at North Carolina State University and sponsored by the U.S. Department of Energy under Grant Number DE-FG05-95ER30250. The title of the project is "Quantitative Analysis of Variability and Uncertainty in Acid Rain Assessments." The work conducted under this grant pertains primarily to two main topics: (1) development of new methods for the quantitative analysis of variability and uncertainty, applicable to any type of model; and (2) analysis of variability and uncertainty in the performance, emissions, and cost of electric power plant combustion-based NOx control technologies. These two topics are reported separately in Volumes 1 and 2.
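    The methodology named in the title is bootstrap simulation: resample the data with replacement and recompute the statistic to characterize variability and uncertainty. A generic sketch; the lognormal sample standing in for, e.g., an emission-factor data set is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05):
    """Percentile bootstrap: resample with replacement, recompute the
    statistic, and take the (alpha/2, 1-alpha/2) quantiles."""
    reps = np.array([stat(rng.choice(data, size=data.size, replace=True))
                     for _ in range(n_boot)])
    return np.quantile(reps, [alpha / 2.0, 1.0 - alpha / 2.0])

# Hypothetical small sample, e.g. measured emission factors
sample = rng.lognormal(mean=0.0, sigma=0.4, size=30)
lo, hi = bootstrap_ci(sample, np.mean)
print(f"mean = {sample.mean():.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
```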

  16. 360°-View of Quantum Theory and Ab Initio Simulation at Extreme Conditions: 2014 Sanibel Symposium

    International Nuclear Information System (INIS)

    Cheng, Hai-Ping

    2016-01-01

    The Sanibel Symposium 2014 was held February 16-21, 2014, at the King and Prince, St. Simons Island, GA. It was successful in bringing condensed-matter physicists and quantum chemists together productively to drive the emergence of those specialties. The Symposium had a significant role in preparing a whole generation of quantum theorists. The 54th Sanibel meeting looked to the future in two ways. We had 360°-View sessions to honor the exceptional contributions of Rodney Bartlett (70), Bill Butler (70), Yngve Öhrn (80), Fritz Schaefer (70), and Malcolm Stocks (70). The work of these five has greatly impacted several generations of quantum chemists and condensed matter physicists. The "360°" is the sum of their ages. More significantly, it symbolizes a panoramic view of critical developments and accomplishments in theoretical and computational chemistry and physics oriented toward the future. Thus, two of the eight 360°-View sessions focused specifically on younger scientists. The 360°-View program was the major component of the 2014 Sanibel meeting. Another four sessions included a sub-symposium on ab initio Simulations at Extreme Conditions, with focus on getting past the barriers of present-day Born-Oppenheimer molecular dynamics by advances in finite-temperature density functional theory, orbital-free DFT, and new all-numerical approaches.

  17. Geologic storage of carbon dioxide and enhanced oil recovery. I. Uncertainty quantification employing a streamline based proxy for reservoir flow simulation

    International Nuclear Information System (INIS)

    Kovscek, A.R.; Wang, Y.

    2005-01-01

    Carbon dioxide (CO2) is already injected into a limited class of reservoirs for oil recovery purposes; however, the engineering design question for simultaneous oil recovery and storage of anthropogenic CO2 is significantly different from that of oil recovery alone. Currently, the volumes of CO2 injected solely for oil recovery are minimized due to the purchase cost of CO2. If and when CO2 emissions to the atmosphere are managed, it will be necessary to maximize simultaneously both economic oil recovery and the volumes of CO2 emplaced in oil reservoirs. This process is coined 'cooptimization'. This paper proposes a work flow for cooptimization of oil recovery and geologic CO2 storage. An important component of the work flow is the assessment of uncertainty in predictions of performance. Typical methods for quantifying uncertainty employ exhaustive flow simulation of multiple stochastic realizations of the geologic architecture of a reservoir. Such approaches are computationally intensive and thereby time consuming. An analytic streamline based proxy for full reservoir simulation is proposed and tested. Streamline trajectories represent the three-dimensional velocity field during multiphase flow in porous media and so are useful for quantifying the similarity and differences among various reservoir models. The proxy allows rational selection of a representative subset of equi-probable reservoir models that encompass uncertainty with respect to true reservoir geology. The streamline approach is demonstrated to be thorough and rapid.

  18. Errors and uncertainties introduced by a regional climate model in climate impact assessments: example of crop yield simulations in West Africa

    International Nuclear Information System (INIS)

    Ramarohetra, Johanna; Pohl, Benjamin; Sultan, Benjamin

    2015-01-01

    The challenge of estimating the potential impacts of climate change has led to an increasing use of dynamical downscaling to produce fine spatial-scale climate projections for impact assessments. In this work, we analyze if and to what extent the bias in the simulated crop yield can be reduced by using the Weather Research and Forecasting (WRF) regional climate model to downscale ERA-Interim (European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis) rainfall and radiation data. Then, we evaluate the uncertainties resulting from both the choice of the physical parameterizations of the WRF model and its internal variability. Impact assessments were performed at two sites in Sub-Saharan Africa and by using two crop models to simulate Niger pearl millet and Benin maize yields. We find that the use of the WRF model to downscale ERA-Interim climate data generally reduces the bias in the simulated crop yield, yet this reduction in bias strongly depends on the choices in the model setup. Among the physical parameterizations considered, we show that the choice of the land surface model (LSM) is of primary importance. When there is no coupling with a LSM, the simulated precipitation and hence the simulated yield are null; when the LSM is too simplistic, they are very low. Coupling with a LSM is therefore necessary. The convective scheme is the second most influential scheme for yield simulation, followed by the shortwave radiation scheme. The uncertainties related to the internal variability of the WRF model are also significant and reach up to 30% of the simulated yields. These results suggest that regional models need to be used more carefully in order to improve the reliability of impact assessments. (letter)

  19. Uncertainties in repository performance from spatial variability of hydraulic conductivities - statistical estimation and stochastic simulation using PROPER

    International Nuclear Information System (INIS)

    Lovius, L.; Norman, S.; Kjellbert, N.

    1990-02-01

    An assessment has been made of the impact of spatial variability on the performance of a KBS-3 type repository. The uncertainties in geohydrologically related performance measures have been investigated using conductivity data from one of the Swedish study sites. The analysis was carried out with the PROPER code and the FSCF10 submodel. (authors)

  20. Developing an Agent-Based Simulation System for Post-Earthquake Operations in Uncertainty Conditions: A Proposed Method for Collaboration among Agents

    Directory of Open Access Journals (Sweden)

    Navid Hooshangi

    2018-01-01

    Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as during urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model for post-earthquake USAR operations using a geospatial information system and multi-agent systems (GIS and MAS, respectively). We also propose an approach for dynamic task allocation and establishing collaboration among agents based on the contract net protocol (CNP) and the interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which consider uncertainty in natural hazards information during agents' decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, an earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran's District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and complete vulnerability classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the CNP method in terms of USAR operational time (at least 13% decrease) and the number of human fatalities (at least 9% decrease). In an interval uncertainty analysis of the proposed simulated system, the lower and upper bounds of the uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in the disaster environment. Such systems can be used to manage and prepare for natural hazards.
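
    The AHP weighting step mentioned above reduces to the principal eigenvector of a pairwise-comparison matrix. The sketch below uses a hypothetical 3 x 3 comparison matrix for three unnamed decision criteria; it is not the matrix used by the authors.

        import numpy as np

        # Hypothetical pairwise-comparison matrix for three decision criteria
        # (a_ij = relative importance of criterion i over criterion j).
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # AHP weights: principal eigenvector of A, normalised to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        w = w / w.sum()

        # Consistency check (random index RI = 0.58 for 3 x 3 matrices).
        ci = (np.max(np.real(eigvals)) - 3) / (3 - 1)
        print("weights:", np.round(w, 3), " CR:", round(ci / 0.58, 3))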

  1. Simulation of climate characteristics and extremes of the Volta Basin using CCLM and RCA regional climate models

    Science.gov (United States)

    Darko, Deborah; Adjei, Kwaku A.; Appiah-Adjei, Emmanuel K.; Odai, Samuel N.; Obuobie, Emmanuel; Asmah, Ruby

    2018-06-01

    The extent to which statistical bias-adjusted outputs of two regional climate models alter the projected change signals for the mean (and extreme) rainfall and temperature over the Volta Basin is evaluated. The outputs from two regional climate models in the Coordinated Regional Climate Downscaling Experiment for Africa (CORDEX-Africa) are bias adjusted using the quantile mapping technique. Annual maxima of rainfall and temperature, with their 10- and 20-year return values for the present (1981-2010) and future (2051-2080) climates, are estimated using extreme value analyses. Moderate extremes are evaluated using extreme indices (viz. percentile-based, duration-based, and intensity-based). Bias adjustment of the original (bias-unadjusted) models improves the reproduction of mean rainfall and temperature for the present climate. However, the bias-adjusted models poorly reproduce the 10- and 20-year return values for rainfall and maximum temperature, whereas the extreme indices are reproduced satisfactorily for the present climate. Consequently, projected changes in rainfall and temperature extremes were weak. The bias adjustment results in a reduction of the change signals for the mean rainfall, while the mean temperature signals are rather magnified. The projected changes for the original mean climate and extremes are not conserved after bias adjustment, with the exception of duration-based extreme indices.
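
    The quantile mapping technique used here for bias adjustment can be sketched in a few lines: each raw model value is mapped onto the observed value at the same empirical quantile of the calibration period. The gamma-distributed arrays below are placeholders for the CORDEX-Africa model output and the reference observations.

        import numpy as np

        def quantile_map(model_cal, obs_cal, model_raw, n_q=100):
            """Empirical quantile mapping: adjust model_raw so that its
            calibration-period quantiles match the observed ones."""
            q = np.linspace(0.0, 1.0, n_q)
            mq = np.quantile(model_cal, q)   # model quantiles (calibration)
            oq = np.quantile(obs_cal, q)     # observed quantiles
            # Interpolate each raw value onto the observed quantile curve.
            return np.interp(model_raw, mq, oq)

        rng = np.random.default_rng(1)
        obs_cal = rng.gamma(2.0, 4.0, 10_950)    # stand-in daily rainfall, mm
        model_cal = rng.gamma(1.5, 6.0, 10_950)  # biased model counterpart
        model_fut = rng.gamma(1.5, 6.5, 10_950)  # future projection to adjust

        adjusted = quantile_map(model_cal, obs_cal, model_fut)
        print(obs_cal.mean(), model_fut.mean(), adjusted.mean())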

  2. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." - Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  3. Addressing uncertainty in adaptation planning for agriculture.

    Science.gov (United States)

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  4. Simulation of temperature extremes in the Tibetan Plateau from CMIP5 models and comparison with gridded observations

    Science.gov (United States)

    You, Qinglong; Jiang, Zhihong; Wang, Dai; Pepin, Nick; Kang, Shichang

    2017-09-01

    Understanding changes in temperature extremes in a warmer climate is of great importance for society and for ecosystem functioning due to potentially severe impacts of such extreme events. In this study, temperature extremes defined by the Expert Team on Climate Change Detection and Indices (ETCCDI) from CMIP5 models are evaluated by comparison with homogenized gridded observations at 0.5° resolution across the Tibetan Plateau (TP) for 1961-2005. Using statistical metrics, the models have been ranked in terms of their ability to reproduce similar patterns in extreme events to the observations. Four CMIP5 models have good performance (BNU-ESM, HadGEM2-ES, CCSM4, CanESM2) and are used to create an optimal model ensemble (OME). Most temperature extreme indices in the OME are closer to the observations than in an ensemble using all models. Best performance is achieved for threshold temperature indices, while extreme/absolute value indices are slightly less well modelled. Thus the choice of model in the OME seems to have more influence on temperature extreme indices based on thresholds. There is no significant correlation between elevation and modelled bias of the extreme indices for either the optimal or the all-model ensemble. Furthermore, the minimum temperature (Tmin) is significantly positively correlated with the longwave radiation and cloud variables, whereas the maximum temperature (Tmax) shows no such correlation with the shortwave radiation and cloud variables. This suggests that cloud-radiation differences influence Tmin in each CMIP5 model to some extent, and thereby affect the temperature extremes based on Tmin.

  5. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  6. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric Todd; Eldred, Michael S; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow for heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  7. Reducing uncertainty of Monte Carlo estimated fatigue damage in offshore wind turbines using FORM

    DEFF Research Database (Denmark)

    H. Horn, Jan-Tore; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue...

  8. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  9. Improving Streamflow Simulation in Gaged and Ungaged Areas Using a Multi-Model Synthesis Combined with Remotely-Sensed Data and Estimates of Uncertainty

    Science.gov (United States)

    Lafontaine, J.; Hay, L.

    2015-12-01

    The United States Geological Survey (USGS) has developed a National Hydrologic Model (NHM) to support coordinated, comprehensive and consistent hydrologic model development, and facilitate the application of hydrologic simulations within the conterminous United States (CONUS). More than 1,700 gaged watersheds across the CONUS were modeled to test the feasibility of improving streamflow simulations in gaged and ungaged watersheds by linking statistically- and physically-based hydrologic models with remotely-sensed data products (i.e., snow water equivalent) and estimates of uncertainty. Initially, the physically-based models were calibrated to measured streamflow data to provide a baseline for comparison. As many stream reaches in the CONUS are either not gaged, or are substantially impacted by water use or flow regulation, ancillary information must be used to determine reasonable parameter estimations for streamflow simulations. In addition, not all ancillary datasets are appropriate for application to all parts of the CONUS (e.g., snow water equivalent in the southeastern U.S., where snow is a rarity). As it is not expected that any one data product or model simulation will be sufficient for representing hydrologic behavior across the entire CONUS, a systematic evaluation of which data products improve simulations of streamflow for various regions across the CONUS was performed. The resulting portfolio of calibration strategies can be used to guide selection of an appropriate combination of simulated and measured information for model development and calibration at a given location of interest. In addition, these calibration strategies have been developed to be flexible so that new data products or simulated information can be assimilated. This analysis provides a foundation to understand how well models work when streamflow data is either not available or is limited and could be used to further inform hydrologic model parameter development for ungaged areas.

  10. Uncertainty and sensitivity analysis in the scenario simulation with RELAP/SCDAP and MELCOR codes; Analisis de incertidumbre y sensibilidad en la simulacion de escenarios con los codigos RELAP/SCDAP y MELCOR

    Energy Technology Data Exchange (ETDEWEB)

    Garcia J, T.; Cardenas V, J., E-mail: tonatiuh.garcia@cnsns.gob.mx [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan 779, Col. Narvarte, 03020 Ciudad de Mexico (Mexico)

    2015-09-15

    A methodology was implemented for the analysis of uncertainty in simulations of scenarios with the RELAP/SCDAP V-3.4 bi-7 and MELCOR V-2.1 codes, which are used to perform safety analyses in the Comision Nacional de Seguridad Nuclear y Salvaguardias (CNSNS). The uncertainty analysis methodology chosen is a probabilistic method of the propagation-of-uncertainty type, from the input parameters to the output parameters. Therefore, it began with the selection of the input parameters considered uncertain and of high importance in the scenario, for their direct effect on the output variable of interest. These parameters were randomly sampled according to intervals of variation or probability distribution functions assigned by expert judgment, to generate a set of input files that were run through the simulation code to propagate the uncertainty to the output parameters. Then, through the use of order statistics and the Wilks formula, it was determined that the minimum number of executions required to obtain uncertainty bands that include 95% of the population at a confidence level of 95% in the results is 93; it is important to mention that in this method the number of executions does not depend on the number of selected input parameters. Routines in Fortran 90 were implemented to automate the uncertainty analysis process for transients with the RELAP/SCDAP code. In the case of the MELCOR code for severe accident analysis, automation was carried out through the Dakota Uncertainty plug-in incorporated into the SNAP platform. To test the practical application of this methodology, two analyses were performed: the first simulated a closure transient of the main steam isolation valves using the RELAP/SCDAP code, obtaining the uncertainty band of the vessel dome pressure; the second simulated a station blackout (SBO) accident with the MELCOR code, obtaining the uncertainty band for the
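
    The Wilks sample sizes quoted in this and the following records (93 runs here, 59 in the VERA-CS analyses below) follow directly from order statistics. A minimal sketch of the search for the smallest number of runs meeting the first-order 95/95 criteria:

        # Smallest number of code runs N for a 95/95 statement (Wilks'
        # formula, first order): the probability that the sample extremes
        # bound at least 95% of the population must reach 95%.

        def wilks_one_sided(p=0.95, conf=0.95):
            n = 1
            while 1 - p**n < conf:    # P(max of n runs exceeds the p-quantile)
                n += 1
            return n

        def wilks_two_sided(p=0.95, conf=0.95):
            n = 2
            # P(interval [min, max] covers at least a fraction p of the population)
            while 1 - p**n - n * (1 - p) * p**(n - 1) < conf:
                n += 1
            return n

        print(wilks_one_sided())   # 59  (one-sided tolerance limit)
        print(wilks_two_sided())   # 93  (two-sided uncertainty band)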

  11. Estimating extremes in climate change simulations using the peaks-over-threshold method with a non-stationary threshold

    Czech Academy of Sciences Publication Activity Database

    Kyselý, Jan; Picek, J.; Beranová, Romana

    2010-01-01

    Roč. 72, 1-2 (2010), s. 55-68 ISSN 0921-8181 R&D Projects: GA ČR GA205/06/1535; GA ČR GAP209/10/2045 Grant - others:GA MŠk(CZ) LC06024 Institutional research plan: CEZ:AV0Z30420517 Keywords : climate change * extreme value analysis * global climate models * peaks-over-threshold method * peaks-over-quantile regression * quantile regression * Poisson process * extreme temperatures Subject RIV: DG - Athmosphere Sciences, Meteorology Impact factor: 3.351, year: 2010
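
    A stationary-threshold version of the peaks-over-threshold analysis studied in this paper can be sketched with scipy: exceedances over a fixed high quantile are fitted with a generalized Pareto distribution and converted to return levels. The non-stationary (regression-based) threshold that is the subject of the paper is not reproduced here, and the synthetic data are placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        tmax = rng.gumbel(30.0, 3.0, 40 * 365)   # stand-in daily maxima, deg C

        u = np.quantile(tmax, 0.98)              # fixed high threshold
        exc = tmax[tmax > u] - u                 # threshold exceedances

        # Fit a generalized Pareto distribution to the exceedances (loc fixed at 0).
        shape, _, scale = stats.genpareto.fit(exc, floc=0)

        # m-year return level, with lam = mean number of exceedances per year.
        lam = exc.size / 40
        m = 20
        rl = u + stats.genpareto.ppf(1 - 1 / (m * lam), shape, 0, scale)
        print(f"{m}-year return level: {rl:.1f} deg C")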

  12. Quantifying input uncertainty in an assemble-to-order system simulation with correlated input variables of mixed types

    NARCIS (Netherlands)

    Akçay, A.E.; Biller, B.

    2014-01-01

    We consider an assemble-to-order production system where the product demands and the time since the last customer arrival are not independent. The simulation of this system requires a multivariate input model that generates random input vectors with correlated discrete and continuous components. In

  13. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Brown, C.S., E-mail: csbrown3@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, 2500 Stinson Drive, Raleigh, NC 27695-7909 (United States); Zhang, H., E-mail: Hongbin.Zhang@inl.gov [Idaho National Laboratory, P.O. Box 1625, Idaho Falls, ID 83415-3870 (United States); Kucukboyaci, V., E-mail: kucukbvn@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States); Sung, Y., E-mail: sungy@westinghouse.com [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)

    2016-12-01

    Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core

  14. Production Planning with Respect to Uncertainties. Simulator Based Production Planning of Average Sized Combined Heat and Power Production Plants; Produktionsplanering under osaekerhet. Simulatorbaserad produktionsplanering av medelstora kraftvaermeanlaeggningar

    Energy Technology Data Exchange (ETDEWEB)

    Haeggstaahl, Daniel [Maelardalen Univ., Vaesteraas (Sweden); Dotzauer, Erik [AB Fortum, Stockholm (Sweden)

    2004-12-01

    temperature will affect the production of heat and power is performed. The conclusion is that the local energy company will benefit from using more sophisticated planning tools. The main sources of uncertainties in production planning are: weather, power price, fuel quality and availability of production units. Methodologies that handle uncertainties are discussed. The solution may be to use stochastic optimization or to do a scenario analysis. Simulator-based production planning seems very promising, because it is easy to maintain and change the process model in the graphical user interface. However, the prototype model presented has to be further developed to become a practical tool easy to use on daily basis in the control rooms.

  15. Best estimate plus uncertainty analysis of departure from nucleate boiling limiting case with CASL core simulator VERA-CS in response to PWR main steam line break event

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, H.; Kucukboyaci, V.; Sung, Y.

    2016-01-01

    Highlights: • Best estimate plus uncertainty (BEPU) analyses of PWR core responses under main steam line break (MSLB) accident. • CASL’s coupled neutron transport/subchannel code VERA-CS. • Wilks’ nonparametric statistical method. • MDNBR 95/95 tolerance limit. - Abstract: VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics subchannel code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). VERA-CS was applied to simulate core behavior of a typical Westinghouse-designed 4-loop pressurized water reactor (PWR) with 17 × 17 fuel assemblies in response to two main steam line break (MSLB) accident scenarios initiated at hot zero power (HZP) at the end of the first fuel cycle with the most reactive rod cluster control assembly stuck out of the core. The reactor core boundary conditions at the most DNB limiting time step were determined by a system analysis code. The core inlet flow and temperature distributions were obtained from computational fluid dynamics (CFD) simulations. The two MSLB scenarios consisted of the high and low flow situations, where reactor coolant pumps either continue to operate with offsite power or do not continue to operate since offsite power is unavailable. The best estimate plus uncertainty (BEPU) analysis method was applied using Wilks’ nonparametric statistical approach. In this demonstration of BEPU application, 59 full core simulations were performed for each accident scenario to provide the minimum departure from nucleate boiling ratio (MDNBR) at the 95/95 (95% probability with 95% confidence level) tolerance limit. A parametric goodness-of-fit approach was also applied to the results to obtain the MDNBR value at the 95/95 tolerance limit. Initial sensitivity analysis was performed with the 59 cases per accident scenario by use of Pearson correlation coefficients. The results show that this typical PWR core

  16. Effect of short-term subaerial exposure on the cauliflower coral, Pocillopora damicornis, during a simulated extreme low-tide event

    KAUST Repository

    Castrillón-Cifuentes, Ana Lucia; Lozano-Cortés, Diego; Zapata, Fernando A.

    2017-01-01

    There is increased interest in understanding how stress reduces coral resistance to disturbances and how acclimatization increases the ability of corals to resist future stress. Most extreme low tides at Gorgona Island, which expose reef flats to air, do not appear to negatively affect corals because corals usually do not undergo lethal bleaching during such events. However, coral physiology and fitness may be impacted by this phenomenon. The aim of this study was to evaluate whether corals exposed to air have modified biological functions to resist bleaching. To test this, an extreme low-tide event was simulated in the field. Colonies of Pocillopora damicornis were exposed to air for 15 or 40 min over the course of one, two, or three consecutive days. This procedure was repeated for one to three months. Colonies of P. damicornis exposed to air had reduced fecundity, decreased zooxanthellae density, and changed color from darker to lighter. However, the growth rate of exposed corals was similar to that of non-exposed colonies. We conclude that short periods of subaerial exposure during extreme low tides are not lethal to P. damicornis, but negatively affect sexual reproduction, which might have deleterious effects at the population level. The periodic occurrence of extreme low tides in the tropical eastern Pacific may be one factor responsible for the high rate of asexual reproduction (e.g., fragmentation) in pocilloporid corals of this region.

  17. Effect of short-term subaerial exposure on the cauliflower coral, Pocillopora damicornis, during a simulated extreme low-tide event

    KAUST Repository

    Castrillón-Cifuentes, Ana Lucia

    2017-02-06

    There is increased interest in understanding how stress reduces coral resistance to disturbances and how acclimatization increases the ability of corals to resist future stress. Most extreme low tides at Gorgona Island, which expose reef flats to air, do not appear to negatively affect corals because corals usually do not undergo lethal bleaching during such events. However, coral physiology and fitness may be impacted by this phenomenon. The aim of this study was to evaluate whether corals exposed to air have modified biological functions to resist bleaching. To test this, an extreme low-tide event was simulated in the field. Colonies of Pocillopora damicornis were exposed to air for 15 or 40 min over the course of one, two, or three consecutive days. This procedure was repeated for one to three months. Colonies of P. damicornis exposed to air had reduced fecundity, decreased zooxanthellae density, and changed color from darker to lighter. However, the growth rate of exposed corals was similar to that of non-exposed colonies. We conclude that short periods of subaerial exposure during extreme low tides are not lethal to P. damicornis, but negatively affect sexual reproduction, which might have deleterious effects at the population level. The periodic occurrence of extreme low tides in the tropical eastern Pacific may be one factor responsible for the high rate of asexual reproduction (e.g., fragmentation) in pocilloporid corals of this region.

  18. Effect of short-term subaerial exposure on the cauliflower coral, Pocillopora damicornis, during a simulated extreme low-tide event

    Science.gov (United States)

    Castrillón-Cifuentes, Ana Lucia; Lozano-Cortés, Diego F.; Zapata, Fernando A.

    2017-06-01

    There is increased interest in understanding how stress reduces coral resistance to disturbances and how acclimatization increases the ability of corals to resist future stress. Most extreme low tides at Gorgona Island, which expose reef flats to air, do not appear to negatively affect corals because corals usually do not undergo lethal bleaching during such events. However, coral physiology and fitness may be impacted by this phenomenon. The aim of this study was to evaluate whether corals exposed to air have modified biological functions to resist bleaching. To test this, an extreme low-tide event was simulated in the field. Colonies of Pocillopora damicornis were exposed to air for 15 or 40 min over the course of one, two, or three consecutive days. This procedure was repeated for one to three months. Colonies of P. damicornis exposed to air had reduced fecundity, decreased zooxanthellae density, and changed color from darker to lighter. However, the growth rate of exposed corals was similar to that of non-exposed colonies. We conclude that short periods of subaerial exposure during extreme low tides are not lethal to P. damicornis, but negatively affect sexual reproduction, which might have deleterious effects at the population level. The periodic occurrence of extreme low tides in the tropical eastern Pacific may be one factor responsible for the high rate of asexual reproduction (e.g., fragmentation) in pocilloporid corals of this region.

  19. Joint simulation of stationary grade and non-stationary rock type for quantifying geological uncertainty in a copper deposit

    Science.gov (United States)

    Maleki, Mohammad; Emery, Xavier

    2017-12-01

    In mineral resources evaluation, the joint simulation of a quantitative variable, such as a metal grade, and a categorical variable, such as a rock type, is challenging when one wants to reproduce spatial trends of the rock type domains, a feature that makes a stationarity assumption questionable. To address this problem, this work presents methodological and practical proposals for jointly simulating a grade and a rock type, when the former is represented by the transform of a stationary Gaussian random field and the latter is obtained by truncating an intrinsic random field of order k with Gaussian generalized increments. The proposals concern both the inference of the model parameters and the construction of realizations conditioned to existing data. The main difficulty is the identification of the spatial correlation structure, for which a semi-automated algorithm is designed, based on a least squares fitting of the data-to-data indicator covariances and grade-indicator cross-covariances. The proposed models and algorithms are applied to jointly simulate the copper grade and the rock type in a Chilean porphyry copper deposit. The results show their ability to reproduce the gradual transitions of the grade when crossing a rock type boundary, as well as the spatial zonation of the rock type.
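
    The truncation step that turns a Gaussian random field into rock-type categories is simple to illustrate in one dimension. The moving-average field and the two thresholds below are placeholders; the authors' intrinsic random field of order k and their fitted truncation rule are not reproduced.

        import numpy as np

        rng = np.random.default_rng(9)

        # Stand-in Gaussian random field on a 1-D transect (moving-average
        # smoothing gives spatial correlation; not the authors' IRF-k model).
        white = rng.normal(size=500)
        field = np.convolve(white, np.ones(25) / 25, mode="same")
        field /= field.std()

        # Truncation rule: two thresholds split the field into three rock types.
        thresholds = [-0.5, 0.7]                        # assumed values
        rock_type = np.digitize(field, thresholds)      # 0, 1, 2
        print(np.bincount(rock_type) / rock_type.size)  # rock-type proportions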

  20. Modeling the uncertainty of several VOC and its impact on simulated VOC and ozone in Houston, Texas

    Science.gov (United States)

    Pan, Shuai; Choi, Yunsoo; Roy, Anirban; Li, Xiangshang; Jeon, Wonbae; Souri, Amir Hossein

    2015-11-01

    A WRF-SMOKE-CMAQ modeling system was used to study Volatile Organic Compound (VOC) emissions and their impact on surface VOC and ozone concentrations in southeast Texas during September 2013. The model was evaluated against the ground-level Automated Gas Chromatograph (Auto-GC) measurement data from the Texas Commission on Environmental Quality (TCEQ). The comparisons indicated that the model over-predicted benzene, ethylene, toluene and xylene, while under-predicting isoprene and ethane. The mean biases between simulated and observed values of each VOC species showed clear daytime, nighttime, weekday and weekend variations. Adjusting the VOC emissions using simulated/observed ratios improved model performance of each VOC species, especially mitigating the mean bias substantially. Simulated monthly mean ozone showed a minor change: a 0.4 ppb or 1.2% increase; while a change of more than 5 ppb was seen in hourly ozone data on high ozone days, this change moved model predictions closer to observations. The CMAQ model run with the adjusted emissions better reproduced the variability in the National Aeronautics and Space Administration (NASA)'s Ozone Monitoring Instrument (OMI) formaldehyde (HCHO) columns. The adjusted model scenario also slightly better reproduced the aircraft HCHO concentrations from NASA's DISCOVER-AQ campaign conducted during the simulation episode period; Correlation, Mean Bias and RMSE improved from 0.34, 1.38 ppb and 2.15 ppb to 0.38, 1.33 ppb and 2.08 ppb respectively. A process analysis conducted for both industrial/urban and rural areas suggested that chemistry was the main process contributing to ozone production in both areas, while the impact of chemistry was smaller in rural areas than in industrial and urban areas. For both areas, the positive chemistry contribution increased in the sensitivity simulation largely due to the increase in emissions. Nudging VOC emissions to match the observed concentrations shifted the ozone hotspots

  1. On the estimation of stellar parameters with uncertainty prediction from Generative Artificial Neural Networks: application to Gaia RVS simulated spectra

    Science.gov (United States)

    Dafonte, C.; Fustes, D.; Manteiga, M.; Garabato, D.; Álvarez, M. A.; Ulla, A.; Allende Prieto, C.

    2016-10-01

    Aims: We present an innovative artificial neural network (ANN) architecture, called a Generative ANN (GANN), that computes the forward model; that is, it learns the function that relates the unknown outputs (stellar atmospheric parameters, in this case) to the given inputs (spectra). Such a model can be integrated in a Bayesian framework to estimate the posterior distribution of the outputs. Methods: The architecture of the GANN follows the same scheme as a normal ANN, but with the inputs and outputs inverted. We train the network with the set of atmospheric parameters (Teff, log g, [Fe/H] and [α/Fe]), obtaining the stellar spectra for such inputs. The residuals between the spectra in the grid and the estimated spectra are minimized using a validation dataset to keep solutions as general as possible. Results: The performance of both conventional ANNs and GANNs to estimate the stellar parameters as a function of the star brightness is presented and compared for different Galactic populations. GANNs provide significantly improved parameterizations for early and intermediate spectral types with rich and intermediate metallicities. The behaviour of both algorithms is very similar for our sample of late-type stars, obtaining residuals in the derivation of [Fe/H] and [α/Fe] below 0.1 dex for stars at the Grvs magnitudes observed by the satellite. Conclusions: Uncertainty estimation of computed astrophysical parameters is crucial for the validation of the parameterization itself and for the subsequent exploitation by the astronomical community. GANNs produce not only the parameters for a given spectrum, but also a goodness-of-fit between the observed spectrum and the predicted one for a given set of parameters. Moreover, they allow us to obtain the full posterior distribution over the astrophysical parameter space once a noise model is assumed. This can be used for novelty detection and quality assessment.
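
    The Bayesian use of a forward model described here can be sketched as follows: given a trained forward function that maps parameters to a spectrum, a Gaussian noise model yields a posterior over a parameter grid. The one-parameter analytic "emulator" below is a stand-in for the trained GANN, and all numbers are invented.

        import numpy as np

        def forward(teff, wl):
            # Stand-in for a trained generative network: maps one stellar
            # parameter to a toy "spectrum" (a single Gaussian feature).
            return np.exp(-((wl - 860.0) ** 2) / (2.0 * (teff / 5000.0) ** 2))

        wl = np.linspace(847.0, 871.0, 200)      # RVS-like wavelength grid, nm
        rng = np.random.default_rng(3)
        obs = forward(5500.0, wl) + rng.normal(0.0, 0.02, wl.size)

        # Posterior over a Teff grid under a Gaussian noise model, flat prior.
        teff_grid = np.linspace(4000.0, 7000.0, 301)
        chi2 = np.array([np.sum((obs - forward(t, wl)) ** 2) for t in teff_grid])
        chi2 -= chi2.min()                       # for numerical stability
        post = np.exp(-0.5 * chi2 / 0.02 ** 2)
        dx = teff_grid[1] - teff_grid[0]
        post /= post.sum() * dx                  # normalise the density

        mean = (teff_grid * post).sum() * dx
        std = np.sqrt(((teff_grid - mean) ** 2 * post).sum() * dx)
        print(f"Teff = {mean:.0f} +/- {std:.0f} K")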

  2. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  3. Shall we upgrade one-dimensional secondary settler models used in WWTP simulators? - An assessment of model structure uncertainty and its propagation.

    Science.gov (United States)

    Plósz, Benedek Gy; De Clercq, Jeriffa; Nopens, Ingmar; Benedetti, Lorenzo; Vanrolleghem, Peter A

    2011-01-01

    In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge on characterising particulate organics in wastewater and on bacteria growth is well-established, whereas 1-D SST models and their impact on biomass concentration predictions are still poorly understood. A rigorous assessment of two 1-D SST models is thus presented: one based on hyperbolic (the widely used Takács-model) and one based on parabolic (the more recently presented Plósz-model) partial differential equations. The former model, using numerical approximation to yield realistic behaviour, is currently the most widely used by wastewater treatment process modellers. The latter is a convection-dispersion model that is solved in a numerically sound way. First, the explicit dispersion in the convection-dispersion model and the numerical dispersion for both SST models are calculated. Second, simulation results of effluent suspended solids concentration (XTSS,Eff), sludge recirculation stream (XTSS,RAS) and sludge blanket height (SBH) are used to demonstrate the distinct behaviour of the models. A thorough scenario analysis is carried out using SST feed flow rate, solids concentration, and overflow rate as degrees of freedom, spanning a broad loading spectrum. A comparison between the measurements and the simulation results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer
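
    Both 1-D SST model families rest on a solids flux built from a settling-velocity function; the widely used double-exponential (Takács) form is compact enough to state in code. The parameter values below are order-of-magnitude textbook values, not those calibrated in this study.

        import numpy as np

        def takacs_velocity(X, v0=474.0, v0_max=250.0, r_h=5.76e-4,
                            r_p=2.86e-3, X_min=20.0):
            """Double-exponential settling velocity (m/d) as a function of
            solids concentration X (g/m^3). Parameter values illustrative."""
            v = v0 * (np.exp(-r_h * (X - X_min)) - np.exp(-r_p * (X - X_min)))
            return np.clip(v, 0.0, v0_max)

        X = np.array([50.0, 500.0, 3000.0, 8000.0])
        print(takacs_velocity(X))   # hindered settling slows at high X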

  4. Simulation Study of Performance of Active Ceilings with Phase Change Material in Office Buildings under Extreme Climate Conditions

    DEFF Research Database (Denmark)

    Stefansen, Casper; Farhan, Hajan; Bourdakis, Eleftherios

    2018-01-01

    simulations were run with building simulation software for eight climates. The chosen climates were Dubai – UAE, Istanbul – Turkey, Lima – Peru, Moscow – Russia, Nuuk – Greenland, Salvador – Brazil, Tokyo – Japan and Tromsø – Norway. Two models of a two-person office were made for each climate; one model...

  5. Wind and Wave Setup Contributions to Extreme Sea Levels at a Tropical High Island: A Stochastic Cyclone Simulation Study for Apia, Samoa

    Directory of Open Access Journals (Sweden)

    Ron Karl Hoeke

    2015-09-01

    Wind-wave contributions to tropical cyclone (TC)-induced extreme sea levels are known to be significant in areas with narrow littoral zones, particularly at oceanic islands. Despite this, little information exists in many of these locations to assess the likelihood of inundation, the relative contribution of wind and wave setup to this inundation, and how it may change with sea level rise (SLR), particularly at scales relevant to coastal infrastructure. In this study, we explore TC-induced extreme sea levels at spatial scales on the order of tens of meters at Apia, the capital of Samoa, a nation in the tropical South Pacific with typical high-island fringing reef morphology. Ensembles of stochastically generated TCs (based on historical information) are combined with numerical simulations of wind waves, storm surge, and wave setup to develop high-resolution statistical information on extreme sea levels and local contributions of wind setup and wave setup. The results indicate that storm track and local morphological details lead to local differences in extreme sea levels on the order of 1 m at spatial scales of less than 1 km. Wave setup is the overall largest contributor at most locations; however, wind setup may exceed wave setup in some sheltered bays. When an arbitrary SLR scenario (+1 m) is introduced, overall extreme sea levels are found to modestly decrease relative to SLR, but wave energy near the shoreline greatly increases, consistent with a number of other recent studies. These differences have implications for coastal adaptation strategies.

  6. Quantifying the role of climate variability on extreme total water level impacts: An application of a full simulation model to Ocean Beach, California

    Science.gov (United States)

    Serafin, K.; Ruggiero, P.; Stockdon, H. F.; Barnard, P.; Long, J.

    2014-12-01

    Many coastal communities worldwide are vulnerable to flooding and erosion driven by extreme total water levels (TWL), potentially dangerous events produced by the combination of large waves, high tides, and high non-tidal residuals. The West coast of the United States provides an especially challenging environment to model these processes due to its complex geological setting combined with uncertain forecasts for sea level rise (SLR), changes in storminess, and possible changes in the frequency of major El Niños. Our research therefore aims to develop an appropriate methodology to assess present-day and future storm-induced coastal hazards along the entire U.S. West coast, filling this information gap. We present the application of this framework in a pilot study at Ocean Beach, California, a National Park site within the Golden Gate National Recreation Area where existing event-scale coastal change data can be used for model calibration and verification. We use a probabilistic, full simulation TWL model (TWL-FSM; Serafin and Ruggiero, in press) that captures the seasonal and interannual climatic variability in extremes using functions of regional climate indices, such as the Multivariate ENSO index (MEI), to represent atmospheric patterns related to the El Niño-Southern Oscillation (ENSO). In order to characterize the effect of climate variability on TWL components, we refine the TWL-FSM by splitting non-tidal residuals into low (monthly mean sea level anomalies) and high frequency (storm surge) components. We also develop synthetic climate indices using Markov sequences to reproduce the autocorrelated nature of ENSO behavior. With the refined TWL-FSM, we simulate each TWL component, resulting in synthetic TWL records providing robust estimates of extreme return level events (e.g., the 100-yr event) and the ability to examine the relative contribution of each TWL component to these extreme events. Extreme return levels are then used to drive storm impact models
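
    The synthetic climate indices built from Markov sequences can be approximated by a first-order autoregressive process whose persistence mimics the autocorrelated behaviour of ENSO; the coefficient values below are placeholders rather than fitted MEI statistics.

        import numpy as np

        def synthetic_index(n_months, phi=0.95, sigma=0.3, seed=11):
            """AR(1) (first-order Markov) sequence mimicking an autocorrelated
            climate index such as the MEI. phi and sigma are assumed values."""
            rng = np.random.default_rng(seed)
            x = np.zeros(n_months)
            for t in range(1, n_months):
                x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
            return x

        mei_synth = synthetic_index(500 * 12)   # 500 years of monthly values
        print(np.corrcoef(mei_synth[:-1], mei_synth[1:])[0, 1])  # ~phi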

  7. Design of a Passive Exoskeleton for the Upper Extremity through Co-simulation with a Biomechanical Human Arm Model

    DEFF Research Database (Denmark)

    Zhou, Lelai; Bai, Shaoping; Rasmussen, John

    2013-01-01

    An approach of designing exoskeletons on the basis of simulation of the exoskeleton and a human body model is proposed in this paper. The new approach, addressing the problem of physical human-exoskeleton interactions, models and simulates the mechanics for both the exoskeleton and the human body, which allows designers to analyze and evaluate an exoskeleton for its functioning effectively. A simulation platform is developed by integrating a biomechanical model of the human body and the exoskeleton. With the proposed approach, two types of exoskeletons with gravity compensating capability...

  8. Simulation of an extreme heavy rainfall event over Chennai, India using WRF: Sensitivity to grid resolution and boundary layer physics

    KAUST Repository

    Srinivas, C.V.

    2018-05-04

    In this study, the heavy precipitation event on 01 December 2015 over Chennai, located on the southeast coast of India, was simulated using the Weather Research and Forecast (WRF) model. A series of simulations were conducted using explicit convection and varying the planetary boundary layer (PBL) parameterization schemes. The model results were compared with available surface, satellite and Doppler Weather Radar observations. Simulations indicate that strong, sustained moist convection, associated with the development of a mesoscale upper-air cyclonic circulation during the passage of a synoptic-scale low-pressure trough, caused heavy rainfall over Chennai and its surroundings. Results suggest that veering of wind with height, associated with strong wind shear in the 800–400 hPa layer, together with dry air advection facilitated the development of instability and the initiation of convection. The 1-km domain using explicit convection improved the prediction of rainfall intensity of about 450 mm and its distribution. The PBL physics strongly influenced the rainfall prediction by changing the location of the upper air circulation, energy transport, moisture convergence and intensity of convection in the YSU, MYJ and MYNN schemes. All the simulations underestimated the first spell of the heavy rainfall. While the YSU and MYJ schemes grossly underestimated the rainfall and misplaced the area of maximum rainfall, the higher-order MYNN scheme simulated the rainfall pattern in better agreement with observations. The MYNN showed less mixing and simulated a more humid boundary layer, higher convective available potential energy (CAPE) and stronger winds at mid-troposphere than did the other schemes. The MYNN also realistically simulated the location of the upper air cyclonic flow and various dynamic and thermodynamic features. Consequently it simulated stronger moisture convergence and higher precipitation.

  9. Simulation of an extreme heavy rainfall event over Chennai, India using WRF: Sensitivity to grid resolution and boundary layer physics

    KAUST Repository

    Srinivas, C.V.; Yesubabu, V.; Hari Prasad, D.; Hari Prasad, K.B.R.R.; Greeshma, M.M.; Baskaran, R.; Venkatraman, B.

    2018-01-01

    In this study, the heavy precipitation event on 01 December 2015 over Chennai, located on the southeast coast of India, was simulated using the Weather Research and Forecast (WRF) model. A series of simulations were conducted using explicit convection and varying the planetary boundary layer (PBL) parameterization schemes. The model results were compared with available surface, satellite and Doppler Weather Radar observations. Simulations indicate that strong, sustained moist convection, associated with the development of a mesoscale upper-air cyclonic circulation during the passage of a synoptic-scale low-pressure trough, caused heavy rainfall over Chennai and its surroundings. Results suggest that veering of wind with height, associated with strong wind shear in the 800–400 hPa layer, together with dry air advection facilitated the development of instability and the initiation of convection. The 1-km domain using explicit convection improved the prediction of rainfall intensity of about 450 mm and its distribution. The PBL physics strongly influenced the rainfall prediction by changing the location of the upper air circulation, energy transport, moisture convergence and intensity of convection in the YSU, MYJ and MYNN schemes. All the simulations underestimated the first spell of the heavy rainfall. While the YSU and MYJ schemes grossly underestimated the rainfall and misplaced the area of maximum rainfall, the higher-order MYNN scheme simulated the rainfall pattern in better agreement with observations. The MYNN showed less mixing and simulated a more humid boundary layer, higher convective available potential energy (CAPE) and stronger winds at mid-troposphere than did the other schemes. The MYNN also realistically simulated the location of the upper air cyclonic flow and various dynamic and thermodynamic features. Consequently it simulated stronger moisture convergence and higher precipitation.

  10. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    DEFF Research Database (Denmark)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-01-01

    ... uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver. ... There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled development of economic assessment of climate change adaptation ... to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from
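
    The risk-based economic core of such an analysis is typically an expected annual damage (EAD) estimate, and the uncertainty cascade amounts to resampling its uncertain drivers. A minimal Monte Carlo sketch, with an invented depth-damage curve and an assumed climate-change factor distribution:

        import numpy as np

        rng = np.random.default_rng(5)

        def damage(depth):
            # Invented depth-damage curve: damage in MEUR vs flood depth in m.
            return 2.0 * np.clip(depth, 0.0, None) ** 1.5

        n_years, n_mc = 10_000, 500
        ead = np.empty(n_mc)
        for i in range(n_mc):
            cc_factor = rng.normal(1.2, 0.15)   # uncertain climate driver
            depths = rng.gumbel(0.1, 0.25, n_years) * cc_factor  # annual maxima, m
            ead[i] = damage(depths).mean()      # expected annual damage

        print("EAD [MEUR/yr], 5/50/95%:", np.percentile(ead, [5, 50, 95]).round(2))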

  11. Accessing the capability of TRMM 3B42 V7 to simulate streamflow during extreme rain events: Case study for a Himalayan River Basin

    Science.gov (United States)

    Kumar, Brijesh; Lakshmi, Venkat

    2018-03-01

    The paper examines the quality of the Tropical Rainfall Monitoring Mission (TRMM) 3B42 V7 precipitation product for simulating streamflow using the Soil Water Assessment Tool (SWAT) model for various rainfall intensities over the Himalayan region. The SWAT model has been set up for the Gandak River Basin with 41 sub-basins and 420 HRUs. Five stream gauge locations are used to simulate the streamflow for a time span of 10 years (2000-2010). Daily streamflow for the simulation period is collected from the Central Water Commission (CWC), India and the Department of Hydrology and Meteorology (DHM), Nepal. The simulation results are found good in terms of Nash-Sutcliffe efficiency (NSE) > 0.65 and coefficient of determination (R2) > 0.67, and in terms of Percentage Bias (PBIAS), except at extremely heavy rainfall intensities (> 124.4 mm/d). The PBIAS and RSR show that the TRMM-simulated streamflow is suitable for moderate to heavy rainfall intensities. However, it does not perform well for light and extremely heavy rainfall intensities. The findings of the present work are useful for problems related to water resources management, irrigation planning and hazard analysis over the Himalayan region.
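
    The goodness-of-fit statistics used above are simple to compute; a sketch with the usual definitions of NSE, PBIAS and RSR follows, applied to made-up flow arrays.

        import numpy as np

        def nse(obs, sim):
            """Nash-Sutcliffe efficiency: 1 is perfect, < 0 worse than the mean."""
            return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def pbias(obs, sim):
            """Percentage bias: positive values indicate underestimation."""
            return 100 * np.sum(obs - sim) / np.sum(obs)

        def rsr(obs, sim):
            """RMSE normalised by the standard deviation of the observations."""
            return np.sqrt(np.mean((obs - sim) ** 2)) / obs.std(ddof=1)

        obs = np.array([120.0, 340.0, 510.0, 280.0, 95.0])  # made-up flows, m3/s
        sim = np.array([100.0, 360.0, 470.0, 300.0, 110.0])
        print(nse(obs, sim), pbias(obs, sim), rsr(obs, sim))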

  12. A EU simulation platform for nuclear reactor safety: multi-scale and multi-physics calculations, sensitivity and uncertainty analysis (NURESIM project)

    International Nuclear Information System (INIS)

    Chauliac, Christian; Bestion, Dominique; Crouzet, Nicolas; Aragones, Jose-Maria; Cacuci, Dan Gabriel; Weiss, Frank-Peter; Zimmermann, Martin A.

    2010-01-01

    The NURESIM numerical simulation platform is developed within the frame of the NURISP European Collaborative Project (FP7), which includes 22 organizations from 14 European countries. NURESIM is intended to be a reference platform providing high-quality software tools, physical models, generic functions and assessment results. The NURESIM platform provides an accurate representation of the physical phenomena by promoting and incorporating the latest advances in core physics, two-phase thermal-hydraulics and fuel modelling. It includes multi-scale and multi-physics features, especially for coupling core physics and thermal-hydraulics models for reactor safety. Easy coupling of the different codes and solvers is provided through the use of a common data structure and generic functions (e.g., for interpolation between non-conforming meshes). More generally, the platform includes generic pre-processing, post-processing and supervision functions through the open-source SALOME software, in order to make the codes more user-friendly. The platform also provides the informatics environment for testing and comparing different codes. The contribution summarizes the achievements and ongoing developments of the simulation platform in core physics, thermal-hydraulics, multi-physics, uncertainties and code integration.

  13. Uncertainties in United States agricultural N2O emissions: comparing forward model simulations to atmospheric N2O data.

    Science.gov (United States)

    Nevison, C. D.; Saikawa, E.; Dlugokencky, E. J.; Andrews, A. E.; Sweeney, C.

    2014-12-01

    Atmospheric N2O concentrations have increased from 275 ppb in the preindustrial to about 325 ppb in recent years, a ~20% increase with important implications for both anthropogenic greenhouse forcing and stratospheric ozone recovery. This increase has been driven largely by synthetic fertilizer production and other perturbations to the global nitrogen cycle associated with human agriculture. Several recent regional atmospheric inversion studies have quantified North American agricultural N2O emissions using top-down constraints based on atmospheric N2O data from the National Oceanic and Atmospheric Administration (NOAA) Global Greenhouse Gas Reference Network, including surface, aircraft and tall tower platforms. These studies have concluded that global N2O inventories such as EDGAR may be underestimating the true U.S. anthropogenic N2O source by a factor of 3 or more. However, simple back-of-the-envelope calculations show that emissions of this magnitude are difficult to reconcile with the basic constraints of the global N2O budget. Here, we explore some possible reasons why regional atmospheric inversions might overestimate the U.S. agricultural N2O source. First, the seasonality of N2O agricultural sources is not well known, but can have an important influence on inversion results, particularly when the inversions are based on data that are concentrated in the spring/summer growing season. Second, boundary conditions can strongly influence regional inversions but the boundary conditions used may not adequately account for remote influences on surface data such as the seasonal stratospheric influx of N2O-depleted air. We will present a set of forward model simulations, using the Community Land Model (CLM) and two atmospheric chemistry tracer transport models, MOZART and the Whole Atmosphere Community Climate Model (WACCM), that examine the influence of terrestrial emissions and atmospheric chemistry and dynamics on atmospheric variability in N2O at U.S. and

  14. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    Science.gov (United States)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  15. Adjoint-Based Uncertainty Quantification with MCNP

    Energy Technology Data Exchange (ETDEWEB)

    Seifried, Jeffrey E. [Univ. of California, Berkeley, CA (United States)

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations arising from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties with respect to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
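
    In practice, sensitivity-based uncertainty propagation of this kind typically reduces to the first-order "sandwich rule", var(R)/R^2 = S^T C S. A minimal sketch follows, with an invented sensitivity vector and relative covariance matrix rather than the LIFE-blanket data.

```python
import numpy as np

# Relative sensitivity vector S_i = (dR/R)/(dp_i/p_i) of a figure of merit R
# with respect to a few nuclear-data parameters (values are illustrative).
S = np.array([0.8, -0.3, 0.15])

# Relative covariance matrix C of those parameters (illustrative).
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-4]])

rel_var = S @ C @ S  # sandwich rule: var(R)/R^2 = S^T C S
print(f"relative uncertainty in R: {np.sqrt(rel_var):.3%}")
```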

  16. Teaching Uncertainties

    Science.gov (United States)

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  17. Calibration uncertainty

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Anglov, Thomas

    2002-01-01

    Methods recommended by the International Standardization Organisation and Eurachem are not satisfactory for the correct estimation of calibration uncertainty. A novel approach is introduced and tested on actual calibration data for the determination of Pb by ICP-AES. The improved calibration...

  18. Response to Extreme Temperatures of Mesoporous Silica MCM-41: Porous Structure Transformation Simulation and Modification of Gas Adsorption Properties.

    Science.gov (United States)

    Zhang, Shenli; Perez-Page, Maria; Guan, Kelly; Yu, Erick; Tringe, Joseph; Castro, Ricardo H R; Faller, Roland; Stroeve, Pieter

    2016-11-08

    Molecular dynamics (MD) and Monte Carlo (MC) simulations were applied together for the first time to reveal the porous structure transformation mechanisms of mesoporous silica MCM-41 subjected to temperatures up to 2885 K. Silica was experimentally characterized to inform the models and enable prediction of changes in gas adsorption/separation properties. MD simulations suggest that the pore closure process is activated by a collective diffusion of matrix atoms into the porous region, accompanied by bond reformation at the surface. Degradation is kinetically limited, such that complete pore closure is postponed at high heating rates. We experimentally observe decreased gas adsorption with increasing temperature in mesoporous silica heated at fixed rates, due to pore closure and structural degradation consistent with simulation predictions. Applying the Kissinger equation, we find a strong correlation between the simulated pore collapse temperatures and the experimental values, which implies an activation energy of 416 ± 17 kJ/mol for pore closure. MC simulations give the adsorption and selectivity for thermally treated MCM-41, for N2, Ar, Kr, and Xe at room temperature within the 1–10,000 kPa pressure range. Relative to pristine MCM-41, we observe that increased surface roughness due to decreasing pore size changes the absolute adsorption amount differently for different adsorbate molecules. In particular, we find that adsorption of strongly interacting molecules can be enhanced in the low-pressure region while adsorption of weakly interacting molecules is inhibited. This then results in higher selectivity in binary mixture adsorption in mesoporous silica.
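
    The Kissinger analysis mentioned above extracts an activation energy from the shift of the collapse (peak) temperature with heating rate, via ln(beta/Tp^2) = -Ea/(R*Tp) + const. A minimal sketch with hypothetical heating rates and peak temperatures (not the study's data; the numbers merely give a value of the same order as the reported 416 kJ/mol):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

# Hypothetical heating rates (K/min) and pore-collapse peak temperatures (K)
beta = np.array([1.0, 5.0, 10.0, 50.0])
Tp = np.array([2400.0, 2620.0, 2720.0, 2900.0])

# Kissinger plot: ln(beta / Tp^2) versus 1/Tp is linear with slope -Ea/R
y = np.log(beta / Tp**2)
x = 1.0 / Tp
slope, _ = np.polyfit(x, y, 1)
Ea = -slope * R
print(f"activation energy from the fit: ~{Ea/1e3:.0f} kJ/mol")
```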

  19. Mandelbrot's Extremism

    NARCIS (Netherlands)

    Beirlant, J.; Schoutens, W.; Segers, J.J.J.

    2004-01-01

    In the sixties Mandelbrot already showed that extreme price swings are more likely than some of us think or incorporate in our models. A modern toolbox for analyzing such rare events can be found in the field of extreme value theory. At the core of extreme value theory lies the modelling of maxima

  20. Mean versus extreme climate in the Mediterranean region and its sensitivity to future global warming conditions

    Energy Technology Data Exchange (ETDEWEB)

    Paeth, H.; Hense, A. [Meteorological Inst., Univ. Bonn (Germany)

    2005-06-01

    The Mediterranean region (MTR) has been suggested to be very sensitive to changes in land surface and atmospheric greenhouse-gas (GHG) concentrations. In particular, an intensification of climate extremes may have severe socio-economic implications. Here, we present an analysis of mean and extreme climate conditions in this subtropical area based on regional climate model experiments simulating the present-day and possible future climate. The analysis of extreme values (EVs) is based on the assumption that the extremes of daily precipitation and near-surface temperature are well fitted by the Generalized Pareto distribution (GPD). Return values of extreme daily events are determined using the method of L-moments. Particular emphasis is placed on the evaluation of the return values with respect to the uncertainty range of the estimate, as derived from a Monte Carlo sampling approach. During the most recent 25 years the MTR has become drier in spring but more humid, especially in the western part, in autumn and winter. At the same time, the whole region has been subject to a substantial warming. The strongest rainfall extremes are simulated in autumn over the Mediterranean Sea around Italy. Temperature extremes are most pronounced over the land masses, especially over northern Africa. Given the large uncertainty of the EV estimate, only 1-year return values are further analysed. During recent decades, statistically significant changes in extremes are only found for temperature. Future climate conditions may bring a decrease in mean and extreme precipitation during the cold season, whereas an intensification of the hydrological cycle is predicted in summer and autumn. Temperature is predominantly affected over the Iberian Peninsula and the eastern part of the MTR. In many grid boxes, the signals are blurred due to the large uncertainty in the EV estimate. Thus, a careful analysis is required when making inferences about the future
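
    The GPD-based return-value estimation described above can be sketched as follows. This is a generic peaks-over-threshold example on synthetic rainfall, not the regional-model output used in the study, and it uses maximum-likelihood fitting where the study uses L-moments.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
daily_rain = rng.gamma(0.4, 8.0, size=25 * 365)   # 25 years of synthetic daily rain (mm)

u = np.quantile(daily_rain, 0.98)                 # threshold defining "extreme" days
exc = daily_rain[daily_rain > u] - u
shape, loc, scale = stats.genpareto.fit(exc, floc=0)

# Return level for a T-year event: on average one exceedance of it per T years
T = 1.0                                           # 1-year return period, as in the study
lam = len(exc) / 25.0                             # exceedances per year
p = 1.0 - 1.0 / (T * lam)                         # non-exceedance prob. among exceedances
level = u + stats.genpareto.ppf(p, shape, loc=0, scale=scale)
print(f"{T:.0f}-year daily rainfall return level ~ {level:.1f} mm")
```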

  1. Spatial extreme learning machines: An application on prediction of disease counts.

    Science.gov (United States)

    Prates, Marcos O

    2018-01-01

    Extreme learning machines have gained a lot of attention in the machine learning community because of their interesting properties and computational advantages. With the increasing collection of information nowadays, many sources of data have missing information, making statistical analysis harder or unfeasible. In this paper, we present a new model, coined the spatial extreme learning machine, that combines spatial modeling with extreme learning machines, keeping the nice properties of both methodologies and making it very flexible and robust. As explained throughout the text, spatial extreme learning machines have many advantages in comparison with traditional extreme learning machines. Through a simulation study and a real data analysis, we show how the spatial extreme learning machine can be used to improve missing-data imputation and prediction uncertainty estimation.
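
    A plain extreme learning machine is compact enough to show in full: a fixed random hidden layer followed by a least-squares solve for the output weights. Below is a minimal sketch on synthetic data; the spatial extension proposed in the paper is not included.

```python
import numpy as np

rng = np.random.default_rng(1)

def elm_fit(X, y, n_hidden=50):
    """Basic extreme learning machine: random hidden layer, least-squares output."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)
W, b, beta = elm_fit(X, y)
print("train RMSE:", np.sqrt(np.mean((elm_predict(X, W, b, beta) - y) ** 2)))
```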

  2. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in ²³²Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are PDFs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
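
    A minimal sketch of the simulation approach: sample beta-distributed line widths and propagate them through a stand-in for the resonance escape probability. The functional form and all numbers below are invented for illustration, not the ²³²Th evaluation used in the study.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 20_000

# Toy stand-in: resonance escape probability p = exp(-I_eff / D), where the
# effective resonance integral I_eff depends on uncertain neutron (Gn) and
# radiation (Gg) line widths. Widths are sampled from beta distributions
# scaled to +/-10% around nominal; all numbers are invented.
Gn = 0.9 + 0.2 * rng.beta(2.0, 2.0, n)   # relative neutron line width
Gg = 0.9 + 0.2 * rng.beta(2.0, 2.0, n)   # relative radiation line width
I_eff = 0.12 * Gn * Gg / (Gn + Gg)       # toy dependence on the widths
p = np.exp(-I_eff / 0.05)

print(f"p = {p.mean():.4f} +/- {p.std():.4f}")
print(f"spread expressed roughly as reactivity: ~{1e5 * p.std() / p.mean():.0f} pcm")
```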

  3. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, shaped mainly by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and GCM model uncertainty, which is evident at resolutions finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. There are only a few studies which found that the predictive uncertainty of hydrological models can be in the same range or even larger than the climatic uncertainty. We carried out a
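
    How much each level of such a cascade contributes can be illustrated with a toy variance decomposition over a hypothetical ensemble; the axes, spreads and numbers below are invented for illustration and are not the study's experiment.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble: 4 GCMs x 3 bias corrections x 5 hydrological
# parameter sets, each yielding one value of a flood indicator.
gcm = rng.normal(0, 1.0, (4, 1, 1))       # climate-model spread
bias = rng.normal(0, 0.3, (1, 3, 1))      # bias-correction spread
hydro = rng.normal(0, 0.8, (1, 1, 5))     # hydrological-parameter spread
Q = 100 + gcm + bias + hydro              # simulated peak-flow indicator

# Share of total variance explained by each cascade component:
# variance of the ensemble means taken along that component's axis.
total = Q.var()
for name, axes in [("GCM", (1, 2)), ("bias corr.", (0, 2)), ("hydrology", (0, 1))]:
    share = Q.mean(axis=axes).var() / total
    print(f"{name:>10s}: {share:.0%} of total variance")
```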

  4. Simulation of extreme ground water flow in the fractal crack structure of Earth's crust - impact on catastrophic floods

    Science.gov (United States)

    Bukharov, Dmitriy; Aleksey, Kucherik; Tatyana, Trifonova

    2014-05-01

    Recently, the contribution of groundwater to catastrophic floods has become a question under discussion [1,2]. The principal problem in such an approach is to analyze the transport pathways of groundwater in their dynamics, and especially the reasons for its emergence at the land surface. Cracking, a characteristic property of all rocks, should be associated with this process within a unified dynamic system such as a river basin, taking into account the fundamental phenomena of 3D crack-network development and modification (up to faults) acting as a groundwater transport system [3]. In a system of fractal cracks connected with the main groundwater channel, an extreme flow can form, i.e., a devastating event can occur through an instantaneous flash mechanism. The development of such a process is related to two factors. First, within the main channel of groundwater propagation, the motion is turbulent. In accordance with Kolmogorov's theory [4], we assume that this turbulence is isotropic, meaning that both the velocity and pressure fields in the water flow exhibit pulsations related to nonlinear energy transfer between vortices. This approach allows us to determine the maximum possible size of the vortices, defined by the characteristic dimensions of the underground channel, as well as the minimum size, set by dissipation. Energy transfer in the eddies formed near a boundary is a complex nonlinear process, which we describe using a modernized Prandtl semi-empirical model [5]. Second, the mechanism of groundwater propagation in the system of cracks extending from the main underground channel is described within the framework of fractal geometry [6]. This approach allows us to determine the degree of self-similarity in the crack system, i.e., the ratio of mean diameters and lengths of cracks/faults at each step of decomposition. This yields integrated quantitative characteristics of the 3D network as a whole, by fractal

  5. In Situ Raman Spectral Characteristics of Carbon Dioxide in a Deep-Sea Simulator of Extreme Environments Reaching 300 ℃ and 30 MPa.

    Science.gov (United States)

    Li, Lianfu; Du, Zengfeng; Zhang, Xin; Xi, Shichuan; Wang, Bing; Luan, Zhendong; Lian, Chao; Yan, Jun

    2018-01-01

    Deep-sea carbon dioxide (CO2) plays a significant role in the global carbon cycle and directly affects the living environment of marine organisms. In situ Raman detection technology is an effective approach to study the behavior of deep-sea CO2. However, the Raman spectral characteristics of CO2 can be affected by the environment, thus restricting the phase identification and quantitative analysis of CO2. In order to study the Raman spectral characteristics of CO2 in extreme environments (up to 300 °C and 30 MPa), which cover most regions of hydrothermal vents and cold seeps around the world, a deep-sea extreme environment simulator was developed. The Raman spectra of CO2 in different phases were obtained with the Raman insertion probe (RiP) system, which was also used for in situ Raman detection in the deep sea carried by the remotely operated vehicle (ROV) "Faxian". The Raman frequency shifts and bandwidths of gaseous, liquid, solid, and supercritical CO2 and the CO2-H2O system were determined with the simulator. In our experiments (0-300 °C and 0-30 MPa), the peak positions of the symmetric stretching modes of gaseous CO2, liquid CO2, and supercritical CO2 shift approximately 0.6 cm⁻¹ (1387.8-1388.4 cm⁻¹), 0.7 cm⁻¹ (1385.5-1386.2 cm⁻¹), and 2.5 cm⁻¹ (1385.7-1388.2 cm⁻¹), and those of the bending modes shift about 1.0 cm⁻¹ (1284.7-1285.7 cm⁻¹), 1.9 cm⁻¹ (1280.1-1282.0 cm⁻¹), and 4.4 cm⁻¹ (1281.0-1285.4 cm⁻¹), respectively. The Raman spectral characteristics of the CO2-H2O system were also studied under the same conditions. The peak positions of dissolved CO2 varied approximately 4.5 cm⁻¹ (1282.5-1287.0 cm⁻¹) and 2.4 cm⁻¹ (1274.4-1276.8 cm⁻¹) for each peak. In comparison with our experimental results, the phases of CO2 in extreme conditions (0-3000 m and 0-300 °C) can be identified from the Raman spectra collected in situ. This qualitative research on CO2 can also support the

  6. Simulated Extreme Precipitation Indices over Northeast Brazil in Current Climate and Future Scenarios RCP4.5 and RCP8.5

    Science.gov (United States)

    Wender Santiago Marinho, Marcos; Araújo Costa, Alexandre; Cassain Sales, Domingo; Oliveira Guimarães, Sullyandro; Mariano da Silva, Emerson; das Chagas Vasconcelos Júnior, Francisco

    2013-04-01

    In this study, we analyzed extreme precipitation indices, for present and future modeled climates over Northeast Brazil (NEB), from CORDEX simulations over the Tropical Americas domain. The period for the model validation was 1989-2007, using data from the European Centre (ECMWF) reanalysis ERA-Interim as input to drive the regional model (RAMS 6.0). Reanalysis data were assimilated via both the lateral boundaries and the entire domain (a much weaker "central nudging"). Six indices of extreme precipitation were calculated over NEB: the average number of days per year above 10, 20 and 30 mm (R10, R20, R30), the number of consecutive dry days (CDD), the number of consecutive wet days (CWD) and the maximum rainfall in five consecutive days (RX5). Those indices were compared against two independent databases: MERRA (Modern Era Retrospective analysis for Research and Applications) and TRMM (Tropical Rainfall Measuring Mission). After validation, climate simulations were performed for the present climate (1985-2005) and short-term (2015-2035), mid-term (2045-2065) and long-term (2079-2099) future climates for two scenarios, RCP 4.5 and RCP 8.5, nesting RAMS into the HadGEM2-ES global model (a participant in CMIP5). Along with the indices, we also calculated probability distribution functions (PDFs) to study the behavior of daily precipitation in the present and by the end of the 21st century (2079-2099), to assess possible changes under RCPs 4.5 and 8.5. The regional model is capable of representing the extreme precipitation indices for the current climate relatively well, but there are some difficulties in performing a proper validation since the observed databases disagree significantly. Future projections show significant changes in most extreme indices. The Rnn indices generally tend to increase, especially under RCP8.5. More significant changes are projected for the long-term period under RCP8.5, which shows a pronounced R30 enhancement over northern states. CDD tends
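
    The six indices are straightforward to compute from a daily precipitation series. A minimal sketch with synthetic data follows; the thresholds mirror the definitions quoted above, and the 1 mm wet-day cutoff for CDD/CWD is a common convention assumed here, not stated in the record.

```python
import numpy as np

def longest_run(mask):
    """Length of the longest run of consecutive True values."""
    best = cur = 0
    for m in mask:
        cur = cur + 1 if m else 0
        best = max(best, cur)
    return best

def extreme_indices(pr):
    """Annual extreme indices from one year of daily precipitation (mm/day)."""
    pr = np.asarray(pr, float)
    rx5 = max(pr[i:i + 5].sum() for i in range(len(pr) - 4))
    return {
        "R10": int((pr >= 10).sum()),
        "R20": int((pr >= 20).sum()),
        "R30": int((pr >= 30).sum()),
        "CDD": longest_run(pr < 1.0),   # longest dry spell (< 1 mm/day)
        "CWD": longest_run(pr >= 1.0),  # longest wet spell
        "RX5": round(rx5, 1),           # max 5-day accumulation (mm)
    }

rng = np.random.default_rng(5)
year = rng.gamma(0.5, 6.0, 365)          # synthetic daily rainfall
print(extreme_indices(year))
```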

  7. Sensitivity Analysis of Expected Wind Extremes over the Northwestern Sahara and High Atlas Region.

    Science.gov (United States)

    Garcia-Bustamante, E.; González-Rouco, F. J.; Navarro, J.

    2017-12-01

    A robust statistical framework in the scientific literature allows for the estimation of probabilities of occurrence of severe wind speeds and wind gusts, but it does not prevent large uncertainties in the particular numerical estimates. An analysis of such uncertainties is thus required. A large portion of this uncertainty arises from the fact that historical observations are inherently shorter than the timescales of interest for the analysis of return periods. Additional uncertainties stem from the different choices of probability distributions and other aspects related to methodological issues or the physical processes involved. The present study is focused on historical observations over the Ouarzazate Valley (Morocco) and on a high-resolution regional simulation of the wind in the area of interest. The aim is to provide extreme wind speed and wind gust return values and confidence ranges based on a systematic sampling of the uncertainty space, for return periods up to 120 years.
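
    One common way to turn a single return-value estimate into a confidence range is to bootstrap the extreme-value fit. The sketch below uses a GEV fit to synthetic annual wind-gust maxima; the study's actual sampling strategy may differ, and all numbers are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Synthetic record of 40 annual maximum wind gusts (m/s)
annual_max = stats.genextreme.rvs(-0.1, loc=25, scale=4, size=40, random_state=rng)

def return_level(sample, T=120.0):
    """T-year return level from a GEV fit to annual maxima."""
    c, loc, scale = stats.genextreme.fit(sample)
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

# Bootstrap: refit on resampled records to sample the estimation uncertainty
levels = [return_level(rng.choice(annual_max, size=annual_max.size, replace=True))
          for _ in range(500)]
lo, hi = np.percentile(levels, [5, 95])
print(f"120-yr gust: {return_level(annual_max):.1f} m/s (90% CI {lo:.1f}-{hi:.1f})")
```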

  8. Reducing the uncertainty of parameters controlling seasonal carbon and water fluxes in Chinese forests and its implication for simulated climate sensitivities

    Science.gov (United States)

    Li, Yue; Yang, Hui; Wang, Tao; MacBean, Natasha; Bacour, Cédric; Ciais, Philippe; Zhang, Yiping; Zhou, Guangsheng; Piao, Shilong

    2017-08-01

    Reducing the parameter uncertainty of process-based terrestrial ecosystem models (TEMs) is one of the primary requirements for accurately estimating carbon budgets and predicting ecosystem responses to climate change. However, parameters in TEMs are rarely constrained by observations from Chinese forest ecosystems, which are an important carbon sink over Northern Hemisphere land. In this study, eddy covariance data from six forest sites in China are used to optimize parameters of the ORganizing Carbon and Hydrology In Dynamics EcosystEms TEM. Model-data assimilation through parameter optimization largely reduces the prior model errors and improves the simulated seasonal cycle and summer diurnal cycle of net ecosystem exchange, latent heat fluxes, gross primary production and ecosystem respiration. Climate change experiments based on the optimized model indicate that forest net primary production (NPP) is suppressed in response to warming in southern China but stimulated in northeastern China. Altered precipitation has an asymmetric impact on forest NPP at sites in water-limited regions, with the optimization-induced reduction in the response of NPP to precipitation decline being as large as 61% at a deciduous broadleaf forest site. We find that seasonal optimization alters forest carbon cycle responses to environmental change, with the parameter optimization consistently reducing the simulated positive response of heterotrophic respiration to warming. Evaluations against independent observations suggest that improving model structure still matters most for long-term carbon stocks and their changes, in particular nutrient- and age-related changes of photosynthetic rates, carbon allocation, and tree mortality.
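
    Parameter optimization against flux observations, as described above, amounts to minimizing model-data residuals. A minimal sketch with a toy Q10 respiration model follows; this is not the ORCHIDEE assimilation system, and the parameter names and data are invented.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(9)

# Toy ecosystem-respiration model R = R10 * Q10**((T - 10)/10), calibrated
# against synthetic flux "observations" (stand-in for the eddy-covariance
# constraint; true values R10 = 2.0, Q10 = 2.1 are recovered by the fit).
T = rng.uniform(0, 30, 200)                      # air temperature (deg C)
obs = 2.0 * 2.1 ** ((T - 10) / 10) + rng.normal(0, 0.2, 200)

def residuals(p):
    R10, Q10 = p
    return R10 * Q10 ** ((T - 10) / 10) - obs

fit = least_squares(residuals, x0=[1.0, 1.5], bounds=([0.1, 1.0], [10.0, 4.0]))
print("optimized R10, Q10:", np.round(fit.x, 2))
```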

  9. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  10. POLCA-T simulation of OECD/NRC BWR turbine trip benchmark exercise 3 best estimate scenario TT2 test and four extreme scenarios

    International Nuclear Information System (INIS)

    Panayotov, D.

    2004-01-01

    The Westinghouse transient code POLCA-T brings together system thermal-hydraulics plant models and a 3D neutron kinetics core model. The code validation plan includes calculations of the Peach Bottom end-of-cycle 2 turbine trip transients and low-flow stability tests. The paper describes the objectives, method, and results of analyses performed in the final phase of the OECD/NRC Peach Bottom 2 Boiling Water Reactor Turbine Trip Benchmark. A brief overview is given of the code features, the method of simulation, and the developed 3D core model and system input deck for Peach Bottom 2. The paper presents the results of benchmark exercise 3, the best estimate scenario: coupled 3D core neutron kinetics with system thermal-hydraulics analyses. Sensitivity studies cover the SCRAM initiation, carry-under, and decay power. The obtained results, including total power and steam dome, core exit, lower and upper plenum, main steam line and turbine inlet pressures, showed good agreement with measured plant data. Thus the POLCA-T code capabilities for correct simulation of turbine trip transients were proved. The performed calculations and obtained results for the extreme cases demonstrate the POLCA-T code's wide-ranging capability to simulate transients in which scram, steam bypass, and safety and relief valves are not activated. The code is able to handle such transients even when the reactor power and pressure reach values higher than 600% of rated power and 10.8 MPa. (authors)

  11. Streamline three-dimensional thermal model of a lithium titanate pouch cell battery in extreme temperature conditions with module simulation

    Science.gov (United States)

    Jaguemont, Joris; Omar, Noshin; Martel, François; Van den Bossche, Peter; Van Mierlo, Joeri

    2017-11-01

    In this paper, the development of a three-dimensional (3D) lithium